| Column | Type |
|---|---|
| sha | null |
| last_modified | null |
| library_name | string (154 classes) |
| text | string (1 to 900k chars) |
| metadata | string (2 to 348k chars) |
| pipeline_tag | string (45 classes) |
| id | string (5 to 122 chars) |
| tags | sequence (1 to 1.84k items) |
| created_at | string (25 chars) |
| arxiv | sequence (0 to 201 items) |
| languages | sequence (0 to 1.83k items) |
| tags_str | string (17 to 9.34k chars) |
| text_str | string (0 to 389k chars) |
| text_lists | sequence (0 to 722 items) |
| processed_texts | sequence (1 to 723 items) |
| tokens_length | sequence (1 to 723 items) |
| input_texts | sequence (1 to 61 items) |
| embeddings | sequence (768 items) |
null | null | transformers | Title: Introducing Omningotex-7b: The World's Most Accurate 7B LLM
Today, I'm excited to share the creation of a groundbreaking language model, "liminerity/Omningotex-7b-slerp." This model has achieved an impressive accuracy rate of 76.33%, making it the most accurate 7B LLM in the world.
The journey to create Omningotex-7b-slerp began with an experimental process called "merging." I started with a model named "ingot-7b-slerp," which was created by merging two other LLMs, "blurred-beagle-7b-slerp" (by myself, liminerity) and "Macaroni-7b-Tied" (by andrijdavid), a total of eight times over.
After the successful creation of ingot-7b-slerp, I proceeded to merge it with another model, "dpo-binarized-NeuralTrix-7B" by eren23, using gradient slerp. The resulting model, "binarized-ingotrix-slerp-7b," achieved an accuracy rate of 76.04%.
To further enhance the model's performance, I decided to merge "binarized-ingotrix-slerp-7b" with "dpo-binarized-NeutrixOmnibe-7B" by eren23 once again. The resulting model, "Omningotex-7b," is now the most accurate 7B LLM available.
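For readers new to the technique: SLERP (spherical linear interpolation) blends two models' weights along the arc between them rather than along a straight line, which tends to preserve each parent's weight geometry better than plain averaging. Below is a minimal, illustrative sketch of a single SLERP step on one pair of weight tensors; it is not the exact mergekit implementation, and the tensor shapes are placeholders.
```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a, b = w_a.flatten().float(), w_b.flatten().float()
    a_unit, b_unit = a / (a.norm() + eps), b / (b.norm() + eps)
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))  # angle between the two weight directions
    if omega.abs() < eps:
        # Nearly parallel weights: fall back to ordinary linear interpolation.
        return (1 - t) * w_a + t * w_b
    so = torch.sin(omega)
    out = (torch.sin((1 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return out.reshape(w_a.shape).to(w_a.dtype)

# t = 0 keeps the first parent's weights, t = 1 keeps the second's; values in between blend along the arc.
merged = slerp(torch.randn(256, 256), torch.randn(256, 256), t=0.5)
```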
This breakthrough in LLM accuracy was achieved through a combination of careful experimentation and a deep understanding of the underlying algorithms and techniques. I believe that Omningotex-7b-slerp's success demonstrates the potential for further advancements in the field of natural language processing and artificial intelligence.
I look forward to sharing more updates and insights as I continue to explore the possibilities of LLMs and push the boundaries of what is possible in the world of AI. Stay tuned for more exciting developments in the future!
A huge thank you to Maxime Labonne for his LazyMergekit Colab project. Using it helped me gain a firmer grasp of the concepts at play and led to the creation of this model. I'm sure it won't be number one for long, which excites me even more!
Next, I set out to learn how to fine-tune with the resources I have available.
My next overall goal is to find a way to produce a smaller model with high accuracy, either by merging down with fewer layers after each merge (possibly with fine-tuning between merges) or by merging larger, more accurate models into a smaller base while maintaining accuracy and performance. Every version of "TinyMistral" I come across seems to be bricked, in the sense that it spits out nonsense. Thank you for your time if you read this all the way.
# Omningotex-7B-slerp
Omningotex-7B-slerp is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [liminerity/binarized-ingotrix-slerp-7b](https://huggingface.co/liminerity/binarized-ingotrix-slerp-7b)
* [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: liminerity/binarized-ingotrix-slerp-7b
        layer_range: [0, 32]
      - model: eren23/dpo-binarized-NeutrixOmnibe-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: liminerity/binarized-ingotrix-slerp-7b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
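In this configuration, `merge_method: slerp` interpolates every tensor between the two parents, and the `t` values control where each tensor lands between them: 0 and 1 are the two endpoints, the `self_attn` and `mlp` filters get layer-dependent schedules, and everything else uses the flat `t = 0.5`. As a rough illustration of how a five-point schedule could be spread across the 32 layers (an assumption for intuition only, not mergekit's actual code):
```python
import numpy as np

def expand_schedule(anchors, num_layers=32):
    # Place the anchor values evenly over the layer index range and interpolate between them.
    xs = np.linspace(0, num_layers - 1, num=len(anchors))
    return np.interp(np.arange(num_layers), xs, anchors)

self_attn_t = expand_schedule([0, 0.5, 0.3, 0.7, 1])  # per-layer t for attention tensors
mlp_t = expand_schedule([1, 0.5, 0.7, 0.3, 0])        # per-layer t for MLP tensors
print(self_attn_t.round(2), mlp_t.round(2), sep="\n")
```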
## 💻 Usage
```python
# In a notebook, install dependencies first: !pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch

model = "liminerity/Omningotex-7b-slerp"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "apache-2.0", "tags": ["merge", "mergekit", "lazymergekit", "liminerity/binarized-ingotrix-slerp-7b", "eren23/dpo-binarized-NeutrixOmnibe-7B"], "base_model": ["liminerity/binarized-ingotrix-slerp-7b", "eren23/dpo-binarized-NeutrixOmnibe-7B"]} | text-generation | liminerity/Omningotex-7b-slerp | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"liminerity/binarized-ingotrix-slerp-7b",
"eren23/dpo-binarized-NeutrixOmnibe-7B",
"base_model:liminerity/binarized-ingotrix-slerp-7b",
"base_model:eren23/dpo-binarized-NeutrixOmnibe-7B",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T10:34:11+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #liminerity/binarized-ingotrix-slerp-7b #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/binarized-ingotrix-slerp-7b #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Title: Introducing Omningotex-7b: The World's Most Accurate 7B LLM
Today, I'm excited to share the creation of a groundbreaking language model, "liminerity/Omningotex-7b-slerp." This model has achieved an impressive accuracy rate of 76.33%, making it the most accurate 7B LLM in the world.
The journey to create Omningotex-7b-slerp began with an experimental process called "merging." I started with a model named "ingot-7b-slerp," which was created by merging two other LLMs, "blurred-beagle-7b-slerp" (by myself, liminerity) and "Macaroni-7b-Tied" (by andrijdavid), a total of eight times over.
After the successful creation of ingot-7b-slerp, I proceeded to merge it with another model, "dpo-binarized-NeuralTrix-7B" by eren23, using gradient slerp. The resulting model, "binarized-ingotrix-slerp-7b," achieved an accuracy rate of 76.04%.
To further enhance the model's performance, I decided to merge "binarized-ingotrix-slerp-7b" with "dpo-binarized-NeutrixOmnibe-7B" by eren23 once again. The resulting model, "Omningotex-7b," is now the most accurate 7B LLM available.
This breakthrough in LLM accuracy was achieved through a combination of careful experimentation and a deep understanding of the underlying algorithms and techniques. I believe that Omningotex-7b-slerp's success demonstrates the potential for further advancements in the field of natural language processing and artificial intelligence.
I look forward to sharing more updates and insights as I continue to explore the possibilities of LLMs and push the boundaries of what is possible in the world of AI. Stay tuned for more exciting developments in the future!
A huge thank you to Maxime Labonne for his LazyMergekit Colab project. Using it helped me gain a firmer grasp of the concepts at play and led to the creation of this model. I'm sure it won't be number one for long, which excites me even more!
Next, I set out to learn how to fine-tune with the resources I have available.
My next overall goal is to find a way to produce a smaller model with high accuracy, either by merging down with fewer layers after each merge (possibly with fine-tuning between merges) or by merging larger, more accurate models into a smaller base while maintaining accuracy and performance. Every version of "TinyMistral" I come across seems to be bricked, in the sense that it spits out nonsense. Thank you for your time if you read this all the way.
# Omningotex-7B-slerp
Omningotex-7B-slerp is a merge of the following models using LazyMergekit:
* liminerity/binarized-ingotrix-slerp-7b
* eren23/dpo-binarized-NeutrixOmnibe-7B
## Configuration
## Usage
| [
"# Omningotex-7B-slerp\n\nNeuralPipe-7B-slerp is a merge of the following models using LazyMergekit:\n* liminerity/binarized-ingotrix-slerp-7b\n* eren23/dpo-binarized-NeutrixOmnibe-7B",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #liminerity/binarized-ingotrix-slerp-7b #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/binarized-ingotrix-slerp-7b #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Omningotex-7B-slerp\n\nNeuralPipe-7B-slerp is a merge of the following models using LazyMergekit:\n* liminerity/binarized-ingotrix-slerp-7b\n* eren23/dpo-binarized-NeutrixOmnibe-7B",
"## Configuration",
"## Usage"
] | [
146,
68,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #liminerity/binarized-ingotrix-slerp-7b #eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-liminerity/binarized-ingotrix-slerp-7b #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Omningotex-7B-slerp\n\nNeuralPipe-7B-slerp is a merge of the following models using LazyMergekit:\n* liminerity/binarized-ingotrix-slerp-7b\n* eren23/dpo-binarized-NeutrixOmnibe-7B## Configuration## Usage"
] | [
... (768-dimensional embedding vector; individual values omitted for readability) ...
] |
null | null | transformers | ```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig
import torch

MODEL_NAME = "Vikhrmodels/Vikhr-7B-instruct_0.2"
DEFAULT_MESSAGE_TEMPLATE = "<s>{role}\n{content}</s>\n"
# "You are Vikhr, a Russian-language automatic assistant. You talk to people and help them."
DEFAULT_SYSTEM_PROMPT = "Ты — Вихрь, русскоязычный автоматический ассистент. Ты разговариваешь с людьми и помогаешь им."


class Conversation:
    def __init__(
        self,
        message_template=DEFAULT_MESSAGE_TEMPLATE,
        system_prompt=DEFAULT_SYSTEM_PROMPT,
    ):
        self.message_template = message_template
        self.messages = [{
            "role": "system",
            "content": system_prompt
        }]

    def add_user_message(self, message):
        self.messages.append({
            "role": "user",
            "content": message
        })

    def get_prompt(self, tokenizer):
        final_text = ""
        for message in self.messages:
            message_text = self.message_template.format(**message)
            final_text += message_text
        final_text += 'bot'  # append the bot role marker so the model continues with its reply
        return final_text.strip()


def generate(model, tokenizer, prompt, generation_config):
    data = tokenizer(prompt, return_tensors="pt")
    data = {k: v.to(model.device) for k, v in data.items()}
    output_ids = model.generate(
        **data,
        generation_config=generation_config
    )[0]
    output_ids = output_ids[len(data["input_ids"][0]):]  # drop the prompt tokens from the output
    output = tokenizer.decode(output_ids, skip_special_tokens=True)
    return output.strip()


#config = PeftConfig.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,  # load the model directly; the PEFT path below (and its `config`) is commented out
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto"
)
#model = PeftModel.from_pretrained(model, MODEL_NAME, torch_dtype=torch.float16)
model.eval()

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, use_fast=False)
generation_config = GenerationConfig.from_pretrained(MODEL_NAME)
print(generation_config)

inputs = ["Как тебя зовут?", "Кто такой Колмогоров?"]  # "What is your name?", "Who is Kolmogorov?"
for inp in inputs:
    conversation = Conversation()
    conversation.add_user_message(inp)
    prompt = conversation.get_prompt(tokenizer)

    output = generate(model, tokenizer, prompt, generation_config)
    print(inp)
    print(output)
    print('\n')
```
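For reference, with the default template above a single user turn renders to a prompt of the form `<s>system\n{system prompt}</s>\n<s>user\n{user message}</s>\nbot` (this follows directly from `DEFAULT_MESSAGE_TEMPLATE` and `get_prompt`), and the model is expected to continue the text as the bot's reply.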
[wandb](https://wandb.ai/karina_romanova/vikhr/runs/up2hw5eh?workspace=user-karina_romanova) | {"language": ["ru", "en"], "datasets": ["zjkarina/Vikhr_instruct"]} | text-generation | Vikhrmodels/Vikhr-7B-instruct_0.2 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"ru",
"en",
"dataset:zjkarina/Vikhr_instruct",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T10:35:20+00:00 | [] | [
"ru",
"en"
] | TAGS
#transformers #safetensors #llama #text-generation #ru #en #dataset-zjkarina/Vikhr_instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
wandb | [] | [
"TAGS\n#transformers #safetensors #llama #text-generation #ru #en #dataset-zjkarina/Vikhr_instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
65
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #ru #en #dataset-zjkarina/Vikhr_instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
... (768-dimensional embedding vector; individual values omitted for readability) ...
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# codeparrot-ds
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 96
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 768
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 1
- mixed_precision_training: Native AMP
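For reference, the hyperparameters above map roughly onto the following `transformers.TrainingArguments`. This is an illustrative reconstruction, not the original training script; the `output_dir` and the single-device assumption (per-device batch size 96 with 8 accumulation steps giving the total of 768) are assumptions rather than values stated in the card.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the reported settings.
args = TrainingArguments(
    output_dir="codeparrot-ds",        # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=96,    # assumes one device: 96 * 8 accumulation steps = 768 total
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=8,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=1,
    fp16=True,                         # "Native AMP" mixed precision
)
```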
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "gpt2", "model-index": [{"name": "codeparrot-ds", "results": []}]} | text-generation | y-oguchi/codeparrot-ds | [
"transformers",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T10:39:30+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# codeparrot-ds
This model is a fine-tuned version of gpt2 on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 96
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 768
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 96\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 768\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 96\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 768\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
72,
27,
6,
12,
8,
3,
142,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 96\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 768\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
... (embedding vector values omitted for readability) ...
-0.04008248448371887,
0.021903306245803833,
0.06310584396123886,
-0.08636363595724106,
-0.19465014338493347,
-0.12816929817199707,
0.022941648960113525,
-0.0001666522875893861,
0.006085431668907404,
-0.09281008690595627,
-0.09580177813768387,
-0.06362884491682053,
0.2103891372680664,
-0.013062513433396816,
-0.04046681523323059,
-0.13465739786624908,
0.09382837265729904,
0.14001832902431488,
-0.027145348489284515,
0.045254334807395935,
0.032620869576931,
0.14210298657417297,
0.04131137579679489,
-0.06222378835082054,
0.08086495101451874,
-0.06019212678074837,
-0.20232132077217102,
-0.06021881103515625,
0.1203274354338646,
0.07820183783769608,
0.04021238908171654,
0.011382123455405235,
0.05086252838373184,
0.014848155900835991,
-0.10120350122451782,
0.03395960479974747,
0.06372157484292984,
0.04857946187257767,
0.046874888241291046,
-0.04432191699743271,
0.03358588367700577,
-0.02542237751185894,
-0.041262347251176834,
0.093082956969738,
0.21025505661964417,
-0.08213019371032715,
0.06843198835849762,
0.015222587622702122,
-0.08756799250841141,
-0.14403600990772247,
0.1038573607802391,
0.13257290422916412,
0.019884606823325157,
0.03677447512745857,
-0.20328015089035034,
0.12118711322546005,
0.13998302817344666,
-0.03648754954338074,
0.06707929819822311,
-0.30670979619026184,
-0.15435637533664703,
0.041626498103141785,
0.09827142208814621,
-0.012718343175947666,
-0.12168096750974655,
-0.03514673188328743,
-0.03528747335076332,
-0.14828833937644958,
0.1548016369342804,
-0.11464779078960419,
0.09652729332447052,
0.006205768324434757,
0.09348928928375244,
0.019683590158820152,
-0.03833823651075363,
0.14927716553211212,
0.01251615583896637,
0.0806405171751976,
-0.048294249922037125,
0.07184522598981857,
0.08644895255565643,
-0.0508471317589283,
-0.010054128244519234,
-0.022730477154254913,
0.05689515918493271,
-0.10746566951274872,
-0.025912003591656685,
-0.06766242533922195,
0.04430396109819412,
-0.05402058735489845,
-0.06805985420942307,
-0.05096853896975517,
0.05901544541120529,
0.049464624375104904,
-0.035593148320913315,
0.05319513380527496,
-0.014961698092520237,
0.14869438111782074,
0.04015666991472244,
0.09732633829116821,
-0.015673747286200523,
-0.0556471049785614,
0.02367398329079151,
-0.001952456310391426,
0.0536070242524147,
-0.13877137005329132,
0.04847530275583267,
0.1284458041191101,
0.035407233983278275,
0.1295659840106964,
0.05754746496677399,
-0.06240672245621681,
0.010930576361715794,
0.05018524453043938,
-0.08580667525529861,
-0.11013396829366684,
0.023639369755983353,
-0.007549038156867027,
-0.09544537216424942,
0.012104573659598827,
0.11668650060892105,
-0.04230194166302681,
-0.012451961636543274,
0.0004627013113349676,
0.014678265899419785,
-0.04024215415120125,
0.18043485283851624,
0.00839587114751339,
0.0643441304564476,
-0.0811978355050087,
0.10156085342168808,
0.07198067009449005,
-0.09516461193561554,
0.026144828647375107,
0.06907495111227036,
-0.08228645473718643,
0.00037194998003542423,
0.05597033351659775,
0.1283847838640213,
-0.03766491636633873,
-0.038760483264923096,
-0.0777522400021553,
-0.10903528332710266,
0.05249432474374771,
0.09687598794698715,
0.032000791281461716,
0.013467478565871716,
-0.02263924665749073,
0.04758189246058464,
-0.13298827409744263,
0.05519456788897514,
0.04191353917121887,
0.08121581375598907,
-0.10515991598367691,
0.1369224488735199,
0.025132957845926285,
-0.015508575364947319,
-0.01203930377960205,
0.030550142750144005,
-0.08754749596118927,
-0.012817836366593838,
-0.12232711911201477,
-0.009974919259548187,
-0.01299076247960329,
-0.0034420157317072153,
-0.0012904616305604577,
-0.04881886765360832,
-0.048803310841321945,
0.03675610572099686,
-0.07890845835208893,
-0.05447705462574959,
-0.011814698576927185,
0.03996671736240387,
-0.1327139139175415,
0.006359272636473179,
0.03864911571145058,
-0.09690675884485245,
0.08272979408502579,
0.0380762480199337,
0.04450725018978119,
0.04495713487267494,
-0.19116248190402985,
0.008571571670472622,
0.017103973776102066,
0.01857231743633747,
0.03888262063264847,
-0.09853293746709824,
0.0025690351612865925,
-0.025822047144174576,
0.05360408499836922,
0.019937630742788315,
0.022287817671895027,
-0.11881153285503387,
-0.008464128710329533,
-0.028446558862924576,
-0.05590646713972092,
-0.03951243683695793,
0.04052593931555748,
0.0849200040102005,
0.04113535210490227,
0.145379900932312,
-0.08607983589172363,
0.03533652797341347,
-0.2255905568599701,
-0.029765434563159943,
0.006540770176798105,
0.003130329539999366,
-0.05371355265378952,
-0.03015615977346897,
0.08314571529626846,
-0.060189731419086456,
0.15670683979988098,
0.007151120807975531,
0.07089363038539886,
0.05015765130519867,
-0.06106958165764809,
-0.006880515720695257,
0.031692903488874435,
0.1996646523475647,
0.0846441239118576,
-0.010733104310929775,
0.08594432473182678,
-0.022033747285604477,
0.036370132118463516,
0.03205963969230652,
0.2102293074131012,
0.14406399428844452,
-0.029972366988658905,
0.044315155595541,
0.05497223883867264,
-0.1300974190235138,
-0.11679107695817947,
0.14374595880508423,
-0.06621889024972916,
0.09931324422359467,
-0.062197715044021606,
0.17782099545001984,
0.08583535254001617,
-0.17543266713619232,
0.04747530817985535,
-0.06074640527367592,
-0.10417013615369797,
-0.13037355244159698,
-0.042883243411779404,
-0.0854625478386879,
-0.1398707628250122,
0.03711473196744919,
-0.11605647951364517,
0.07348514348268509,
0.11370490491390228,
0.025841964408755302,
0.026713062077760696,
0.13494037091732025,
-0.0012628573458641768,
0.013523390516638756,
0.04667900130152702,
0.020533060654997826,
0.023149525746703148,
-0.05814246088266373,
-0.08544342964887619,
0.037083666771650314,
0.02148672193288803,
0.07833531498908997,
-0.05292660742998123,
-0.015485539101064205,
0.022160138934850693,
0.014778525568544865,
-0.05475853756070137,
0.030212527140975,
0.012779545970261097,
0.03728248551487923,
0.012875828891992569,
0.03540028631687164,
0.01979583315551281,
-0.04267670214176178,
0.2900780439376831,
-0.06889443844556808,
-0.10376233607530594,
-0.1344304382801056,
0.21548107266426086,
0.00823119468986988,
-0.00143804878462106,
0.05445088446140289,
-0.09364389628171921,
-0.0009008905617520213,
0.12705422937870026,
0.13960492610931396,
-0.06875903904438019,
-0.019475547596812248,
-0.01316798385232687,
-0.023788195103406906,
-0.04267480969429016,
0.13571365177631378,
0.0860610157251358,
0.02632223069667816,
-0.06004348024725914,
-0.018213143572211266,
-0.002802615286782384,
-0.026493383571505547,
-0.051669877022504807,
0.0718117356300354,
-0.00008642348984722048,
0.014853009954094887,
-0.031160354614257812,
0.09067011624574661,
0.01705106534063816,
-0.1562112271785736,
0.05574052035808563,
-0.1594012826681137,
-0.18931862711906433,
-0.003027055412530899,
0.06633125245571136,
-0.05040517449378967,
0.049276020377874374,
-0.0016562718665227294,
-0.025293737649917603,
0.08836032450199127,
-0.014869577251374722,
-0.02227330580353737,
-0.09968427568674088,
0.07076312601566315,
-0.08211992681026459,
0.2496209293603897,
-0.01397528126835823,
0.07858804613351822,
0.10555511713027954,
0.026957808062434196,
-0.10961620509624481,
0.03334899991750717,
0.05504781752824783,
-0.09994805604219437,
0.018850503489375114,
0.14925354719161987,
-0.050567109137773514,
0.05810559168457985,
0.058861780911684036,
-0.15322710573673248,
0.018026823177933693,
-0.039832036942243576,
-0.04840525612235069,
-0.08904372900724411,
-0.03412022814154625,
-0.06672333180904388,
0.15267543494701385,
0.21258334815502167,
-0.030086517333984375,
0.033122021704912186,
-0.07220978289842606,
0.025583365932106972,
0.026755237951874733,
0.14739802479743958,
-0.04401170089840889,
-0.24012191593647003,
0.03475925698876381,
0.08124028146266937,
0.005891033913940191,
-0.21122883260250092,
-0.08043955266475677,
0.044302113354206085,
-0.06469938158988953,
-0.05411914736032486,
0.12303253263235092,
0.07269278913736343,
0.0465419739484787,
-0.031256962567567825,
-0.16472865641117096,
-0.029457036405801773,
0.15488837659358978,
-0.14371410012245178,
-0.04400596395134926
] |
null | null | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# t2iadapter-cosmo3769/t2i-adapter-sdxl
These are t2iadapter weights trained on runwayml/stable-diffusion-v1-5 with a new type of conditioning.
You can find some example images below.
prompt: validation_prompt

prompt: validation_prompt

prompt: validation_prompt

## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
```
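The TODO above has not been filled in yet; as a stopgap, here is a minimal usage sketch assuming the adapter follows the standard `diffusers` `T2IAdapter` / `StableDiffusionAdapterPipeline` API. Note that the repository tags mention SDXL while the stated base model is `runwayml/stable-diffusion-v1-5`; the sketch follows the stated base model, and the conditioning image path and generation settings are placeholders rather than values from this repository.
```python
# Hypothetical usage sketch -- adjust the conditioning image and generation
# settings to your setup; they are not taken from this repository.
import torch
from diffusers import StableDiffusionAdapterPipeline, T2IAdapter
from diffusers.utils import load_image

# Load the adapter weights from this repository.
adapter = T2IAdapter.from_pretrained(
    "cosmo3769/test-model-card-template-t2i-adapter-sdxl", torch_dtype=torch.float16
)

# Attach the adapter to the base model the card reports training on.
pipe = StableDiffusionAdapterPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", adapter=adapter, torch_dtype=torch.float16
).to("cuda")

# The conditioning image must match the type of conditioning the adapter was trained on.
control_image = load_image("conditioning.png")

image = pipe(
    "validation_prompt",
    image=control_image,
    num_inference_steps=30,
    adapter_conditioning_scale=1.0,
).images[0]
image.save("output.png")
```
Pass a conditioning image of the same type the adapter was trained on, and tune `adapter_conditioning_scale` to control how strongly the condition steers generation.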
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "creativeml-openrail-m", "library_name": "diffusers", "tags": ["stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "diffusers", "t2iadapter"], "inference": true, "base_model": "runwayml/stable-diffusion-v1-5"} | text-to-image | cosmo3769/test-model-card-template-t2i-adapter-sdxl | [
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"t2iadapter",
"base_model:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] | 2024-02-12T10:40:53+00:00 | [] | [] | TAGS
#diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #t2iadapter #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #region-us
|
# t2iadapter-cosmo3769/t2i-adapter-sdxl
These are t2iadapter weights trained on runwayml/stable-diffusion-v1-5 with new type of conditioning.
You can find some example images below.
prompt: validation_prompt
!images_0)
prompt: validation_prompt
!images_1)
prompt: validation_prompt
!images_2)
## Intended uses & limitations
#### How to use
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | [
"# t2iadapter-cosmo3769/t2i-adapter-sdxl\n\nThese are t2iadapter weights trained on runwayml/stable-diffusion-v1-5 with new type of conditioning.\nYou can find some example images below.\nprompt: validation_prompt\n!images_0)\nprompt: validation_prompt\n!images_1)\nprompt: validation_prompt\n!images_2)",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]",
"## Training details\n\n[TODO: describe the data used to train the model]"
] | [
"TAGS\n#diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #t2iadapter #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #region-us \n",
"# t2iadapter-cosmo3769/t2i-adapter-sdxl\n\nThese are t2iadapter weights trained on runwayml/stable-diffusion-v1-5 with new type of conditioning.\nYou can find some example images below.\nprompt: validation_prompt\n!images_0)\nprompt: validation_prompt\n!images_1)\nprompt: validation_prompt\n!images_2)",
"## Intended uses & limitations",
"#### How to use",
"#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]",
"## Training details\n\n[TODO: describe the data used to train the model]"
] | [
74,
99,
9,
5,
24,
16
] | [
"passage: TAGS\n#diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #t2iadapter #base_model-runwayml/stable-diffusion-v1-5 #license-creativeml-openrail-m #region-us \n# t2iadapter-cosmo3769/t2i-adapter-sdxl\n\nThese are t2iadapter weights trained on runwayml/stable-diffusion-v1-5 with new type of conditioning.\nYou can find some example images below.\nprompt: validation_prompt\n!images_0)\nprompt: validation_prompt\n!images_1)\nprompt: validation_prompt\n!images_2)## Intended uses & limitations#### How to use#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]## Training details\n\n[TODO: describe the data used to train the model]"
] | [
-0.1132761687040329,
0.09009352326393127,
-0.001494686701335013,
0.059243910014629364,
0.12037340551614761,
-0.01258923765271902,
0.1695919930934906,
0.14295175671577454,
-0.06686313450336456,
0.02241760864853859,
0.0034413980320096016,
0.02277878113090992,
0.07759591192007065,
0.18756768107414246,
-0.026762649416923523,
-0.2500561475753784,
-0.007803674321621656,
-0.014988943003118038,
-0.10207802802324295,
0.11173446476459503,
0.1246323511004448,
-0.07270799577236176,
0.08406668901443481,
0.03916254639625549,
-0.17608605325222015,
0.058721382170915604,
-0.00013139587827026844,
-0.10942837595939636,
0.06877925246953964,
0.022888129577040672,
0.035302210599184036,
0.11590567231178284,
0.13533885776996613,
-0.16168741881847382,
0.016123255714774132,
0.08438361436128616,
-0.0349276103079319,
0.07022993266582489,
0.04637696221470833,
-0.011102200485765934,
0.1837204545736313,
-0.034451425075531006,
0.08056239783763885,
0.05848994106054306,
-0.11487103998661041,
-0.06496257334947586,
-0.024800624698400497,
0.09823673218488693,
0.1460064798593521,
0.10314993560314178,
0.021948859095573425,
0.12468220293521881,
-0.09490606933832169,
0.06983218342065811,
0.2694770097732544,
-0.18098676204681396,
-0.05680679529905319,
0.1181521862745285,
0.0768091008067131,
0.051420580595731735,
-0.11408481746912003,
-0.021991578862071037,
0.06925084441900253,
-0.027378937229514122,
0.05099724233150482,
-0.06904502213001251,
0.05063270777463913,
-0.003043954959139228,
-0.13962621986865997,
-0.02981138974428177,
0.24473436176776886,
0.015736401081085205,
-0.08797142654657364,
-0.07789865881204605,
-0.035692427307367325,
-0.026058638468384743,
-0.00938170775771141,
-0.08227495104074478,
0.005734293255954981,
-0.010228272527456284,
-0.045951876789331436,
-0.06676279008388519,
-0.11113551259040833,
-0.07574352622032166,
0.033515237271785736,
0.1033860370516777,
0.005944211967289448,
0.014396381564438343,
-0.0539296455681324,
0.1778085082769394,
-0.04701168090105057,
-0.11959919333457947,
-0.019095079973340034,
-0.06368723511695862,
-0.07521273195743561,
-0.04538944736123085,
0.01385337021201849,
-0.09516043215990067,
0.004929675720632076,
0.1019187942147255,
0.11111894994974136,
-0.004193955101072788,
-0.06349789351224899,
0.0658581480383873,
0.03005201742053032,
0.05072944983839989,
-0.06609085947275162,
-0.04822399094700813,
0.03788191080093384,
0.052086587995290756,
-0.08216456323862076,
-0.02560601569712162,
-0.08423052728176117,
-0.0415649376809597,
0.05060356855392456,
0.05430857092142105,
0.01210583932697773,
0.03206893056631088,
-0.09642909467220306,
-0.05055331811308861,
0.026007631793618202,
-0.10798907279968262,
-0.007716102059930563,
-0.04942409694194794,
-0.10351066291332245,
-0.05950791761279106,
0.12385228276252747,
0.03595796227455139,
-0.036428362131118774,
0.03174988552927971,
-0.09635575115680695,
0.011457921005785465,
-0.07422660291194916,
-0.09077323973178864,
-0.022341502830386162,
-0.15038728713989258,
-0.01861785352230072,
-0.08806256949901581,
-0.18868368864059448,
-0.02272258885204792,
0.04555884748697281,
-0.04038044810295105,
-0.018816553056240082,
-0.016430091112852097,
-0.05121941864490509,
-0.04141225665807724,
0.03722953423857689,
0.04729966074228287,
-0.02211935445666313,
0.04704193398356438,
-0.014264615252614021,
0.07996541261672974,
0.01522788405418396,
0.016798291355371475,
-0.0998925045132637,
0.025880789384245872,
-0.16578936576843262,
0.06484893709421158,
-0.07217016071081161,
0.04257986694574356,
-0.11947814375162125,
-0.09346320480108261,
-0.02190636657178402,
-0.018870297819375992,
0.03006580099463463,
0.1689937561750412,
-0.2538878917694092,
-0.0019022979540750384,
0.17216041684150696,
-0.1350797712802887,
-0.06738556176424026,
0.039304349571466446,
-0.05918733775615692,
0.24519944190979004,
0.040462784469127655,
0.11431992053985596,
0.0685262680053711,
-0.21324710547924042,
0.15913641452789307,
-0.01697881892323494,
-0.01764741539955139,
0.033143993467092514,
0.05385255068540573,
-0.0055336784571409225,
-0.0019272038480266929,
-0.0072120362892746925,
-0.0730598196387291,
-0.0036203425843268633,
-0.08513205498456955,
-0.060177672654390335,
-0.0018189928960055113,
-0.04245384782552719,
0.011985846795141697,
0.011310766451060772,
-0.007068197708576918,
-0.030121363699436188,
-0.08163948357105255,
0.04163554310798645,
0.07948963344097137,
-0.03692420572042465,
0.003180615371093154,
-0.10069144517183304,
0.039420295506715775,
-0.06712841242551804,
-0.055657483637332916,
-0.15743403136730194,
0.02885258011519909,
-0.006611305754631758,
0.17469127476215363,
0.057361189275979996,
0.18688976764678955,
0.06432154029607773,
0.02516973949968815,
-0.0072434451431035995,
-0.013932463712990284,
-0.07544849067926407,
0.03763934224843979,
-0.053692664951086044,
-0.18496091663837433,
-0.006171592976897955,
-0.06502637267112732,
0.10871732980012894,
-0.21559573709964752,
0.0536947175860405,
0.13217473030090332,
0.11324722319841385,
0.13247209787368774,
-0.037172384560108185,
0.026425868272781372,
0.012035018764436245,
-0.04773774743080139,
-0.09456009417772293,
-0.005216734483838081,
0.04417895898222923,
-0.07793362438678741,
0.05466344207525253,
-0.14049845933914185,
0.11789298802614212,
0.0911245048046112,
0.010107789188623428,
-0.13654397428035736,
-0.08129706978797913,
-0.07402803003787994,
-0.018565313890576363,
-0.03249726817011833,
-0.001925098244100809,
0.06824705004692078,
-0.030646735802292824,
0.09814594686031342,
-0.07479454576969147,
-0.013128634542226791,
0.01804886758327484,
-0.028978494927287102,
0.017192646861076355,
0.07722766697406769,
0.08056475222110748,
-0.11901671439409256,
0.04775810241699219,
0.053405918180942535,
0.01194800715893507,
0.10092443972826004,
-0.004630638752132654,
-0.1321951299905777,
-0.042871106415987015,
0.042664553970098495,
0.0428336076438427,
0.15817026793956757,
-0.0925493836402893,
0.00918851513415575,
0.030823757871985435,
-0.008317303843796253,
0.03666558116674423,
-0.1712930053472519,
-0.03761616721749306,
0.09922312945127487,
-0.03859473019838333,
0.030384210869669914,
0.007752280216664076,
-0.04303278401494026,
0.10719314217567444,
-0.031311649829149246,
-0.05281111225485802,
0.028380876407027245,
-0.02304542437195778,
-0.11922413855791092,
0.14507415890693665,
-0.059060268104076385,
-0.16343314945697784,
-0.08652624487876892,
-0.018526092171669006,
0.026458866894245148,
0.028012637048959732,
0.03968653082847595,
-0.01792694814503193,
-0.056985560804605484,
-0.09849496185779572,
0.010808105580508709,
-0.06577194482088089,
0.002103208564221859,
0.01585390418767929,
0.06689931452274323,
0.02243129350244999,
-0.11084714531898499,
-0.0008529297192580998,
0.00814459566026926,
-0.0956781730055809,
0.04575030133128166,
-0.013209173455834389,
0.12128923088312149,
0.1094522625207901,
-0.057683318853378296,
0.032676275819540024,
-0.04998321831226349,
0.20474672317504883,
-0.07535514235496521,
0.07554338127374649,
0.18654446303844452,
-0.0282667838037014,
0.04005703330039978,
0.1244971975684166,
0.0007697757100686431,
-0.06910412758588791,
0.05889115482568741,
0.00003093621853622608,
-0.08453753590583801,
-0.19843938946723938,
-0.012830118648707867,
-0.08514780551195145,
-0.035703469067811966,
0.07781640440225601,
0.05009588971734047,
0.17727568745613098,
0.09399403631687164,
0.04985390603542328,
0.026291970163583755,
0.0376291386783123,
0.11627916991710663,
-0.026443827897310257,
0.007142353802919388,
0.05212271213531494,
-0.0462869331240654,
-0.01900974102318287,
0.10030876100063324,
0.005713749211281538,
0.26864784955978394,
-0.03731341287493706,
0.05392952263355255,
0.08477900177240372,
0.08014809340238571,
0.025976309552788734,
0.12131579965353012,
-0.010074103251099586,
-0.01821773312985897,
-0.032735660672187805,
-0.09669788181781769,
-0.021862205117940903,
0.09722588956356049,
-0.002526385011151433,
-0.012444892898201942,
-0.058058395981788635,
0.04062448814511299,
-0.016123788431286812,
0.11062651872634888,
0.02532484196126461,
-0.26224830746650696,
0.02560194581747055,
-0.02838057465851307,
0.022996533662080765,
-0.012872998602688313,
0.019849738106131554,
0.14652793109416962,
-0.13489656150341034,
0.0006283599650487304,
-0.039740581065416336,
0.10027918964624405,
-0.019855160266160965,
-0.03712743893265724,
0.046619150787591934,
0.05670471861958504,
-0.05728882923722267,
0.07616426795721054,
-0.2429238110780716,
0.1846722811460495,
-0.014093996025621891,
0.048365019261837006,
-0.09095664322376251,
0.028599336743354797,
-0.005289641674607992,
0.05476051941514015,
0.1754215508699417,
-0.031747251749038696,
0.010957960970699787,
-0.08392266929149628,
-0.08823288977146149,
-0.027112968266010284,
0.07213178277015686,
-0.0981358140707016,
0.0626526027917862,
0.0038636524695903063,
0.01953260600566864,
0.040038883686065674,
0.006365174427628517,
-0.2537372410297394,
-0.13703185319900513,
0.05843285471200943,
-0.0889069214463234,
-0.012612484395503998,
-0.11672918498516083,
-0.06672295928001404,
0.05978831276297569,
0.2155139446258545,
0.022288452833890915,
-0.044253598898649216,
-0.18635773658752441,
0.0028416430577635765,
0.08301598578691483,
-0.014680527150630951,
0.03383944556117058,
0.0694245919585228,
0.11543659120798111,
-0.013498221524059772,
-0.07247377932071686,
0.13241563737392426,
-0.0797978937625885,
-0.13425999879837036,
-0.08059458434581757,
0.11069551110267639,
0.06983203440904617,
0.026745179668068886,
0.038010235875844955,
-0.03591392934322357,
0.03109975904226303,
-0.08356446027755737,
0.01213838066905737,
0.04945147782564163,
0.04274992644786835,
0.017132416367530823,
-0.05281014367938042,
-0.049047719687223434,
-0.055791839957237244,
0.005729274358600378,
0.11435125023126602,
0.20098397135734558,
-0.06866822391748428,
0.10769250243902206,
-0.01371026411652565,
-0.10117100924253464,
-0.1526474803686142,
0.15019232034683228,
0.016786744818091393,
0.03968026861548424,
0.05898769944906235,
-0.12900590896606445,
0.02131730131804943,
0.04783038794994354,
-0.000985919265076518,
0.10933303833007812,
-0.3709816634654999,
-0.11238902062177658,
0.04729193449020386,
0.14629198610782623,
-0.015903018414974213,
-0.12330795079469681,
0.0009014029055833817,
-0.007651594001799822,
-0.19923196732997894,
0.09233593940734863,
-0.06851951032876968,
0.03741570562124252,
-0.016650579869747162,
0.03130647912621498,
0.038255102932453156,
-0.035883065313100815,
0.08655191212892532,
-0.004501230549067259,
0.06764169037342072,
-0.12923160195350647,
-0.002878577448427677,
0.19739800691604614,
-0.05638587847352028,
0.014938787557184696,
-0.0025154207833111286,
0.061721399426460266,
-0.18665795028209686,
-0.051787011325359344,
-0.002545973053202033,
0.057253219187259674,
-0.0635751411318779,
-0.13466091454029083,
-0.034691646695137024,
0.07619328051805496,
0.05071500316262245,
-0.03157224878668785,
-0.0987081229686737,
-0.05988794565200806,
0.04967718571424484,
0.1909538358449936,
0.12271974980831146,
0.07520194351673126,
-0.09489840269088745,
-0.011016596108675003,
0.03520193323493004,
0.06417568027973175,
-0.1656462848186493,
0.017365682870149612,
0.10262127220630646,
0.06985504925251007,
0.15703412890434265,
0.054731231182813644,
-0.06302662193775177,
0.02724885568022728,
0.0848899632692337,
-0.10603398829698563,
-0.0415179543197155,
-0.05181140825152397,
0.0646507740020752,
-0.05236760899424553,
-0.002008738461881876,
0.07833413779735565,
-0.10239054262638092,
0.017463721334934235,
-0.0057749259285628796,
0.04376303777098656,
-0.07614734768867493,
0.12441038340330124,
0.06590662151575089,
0.04060734063386917,
-0.04399015009403229,
0.07029049098491669,
0.05268067866563797,
-0.08513932675123215,
-0.005586947314441204,
0.03205395117402077,
-0.07866283506155014,
-0.020686043426394463,
0.06561879068613052,
0.16172853112220764,
-0.0330742783844471,
-0.014114972203969955,
-0.04960549622774124,
-0.07974269986152649,
0.014647937379777431,
0.10549290478229523,
0.018081223592162132,
-0.012164953164756298,
-0.018580017611384392,
0.002932341769337654,
-0.10770745575428009,
0.04733245074748993,
0.09771664440631866,
0.06247369572520256,
-0.142933189868927,
0.09696805477142334,
-0.038574520498514175,
0.01853935793042183,
-0.05800177529454231,
-0.0165304783731699,
-0.09475043416023254,
0.01956498622894287,
-0.05323325842618942,
0.046977803111076355,
-0.056062132120132446,
-0.0103750079870224,
-0.006412377581000328,
-0.05798887088894844,
-0.023401157930493355,
0.038407307118177414,
-0.10448788106441498,
-0.007782579865306616,
0.013554137200117111,
0.029501106590032578,
-0.058646611869335175,
-0.025373516604304314,
0.037338413298130035,
-0.11023017764091492,
0.08196575194597244,
-0.011003538966178894,
-0.06615211069583893,
-0.05974226072430611,
-0.12255360186100006,
0.04452501982450485,
0.06185249239206314,
-0.02025764063000679,
0.06348738074302673,
-0.03333055227994919,
0.033323146402835846,
-0.015559492632746696,
-0.038676220923662186,
-0.0478924885392189,
0.014951236546039581,
-0.13962161540985107,
0.010267711244523525,
-0.040355291217565536,
-0.0667719617486,
-0.03975412994623184,
0.07291804254055023,
0.15960271656513214,
0.04227149114012718,
0.11441584676504135,
-0.04155222699046135,
0.09867987036705017,
-0.1724044531583786,
-0.047281648963689804,
0.052373431622982025,
0.060072991997003555,
0.04430929198861122,
-0.05950084701180458,
0.043535441160202026,
-0.06904689967632294,
0.12655402719974518,
0.0676359012722969,
-0.01597697101533413,
0.016193948686122894,
-0.10263647139072418,
0.10591446608304977,
0.008771544322371483,
0.23266172409057617,
0.02378775365650654,
0.026424823328852654,
-0.01189191173762083,
0.0100006815046072,
0.04406210035085678,
-0.09255485236644745,
0.1578540951013565,
0.12206931412220001,
-0.02223312109708786,
0.047713108360767365,
0.04277995228767395,
-0.06414586305618286,
-0.10046666860580444,
0.08641457557678223,
0.048962220549583435,
0.08949053287506104,
-0.06300679594278336,
0.06610625982284546,
0.20729291439056396,
-0.11791101098060608,
0.027712512761354446,
0.04513758420944214,
-0.07765132933855057,
-0.06409985572099686,
-0.14332331717014313,
-0.049377381801605225,
-0.15247632563114166,
0.026639791205525398,
-0.10463996231555939,
0.026834139600396156,
0.060902081429958344,
0.05224470794200897,
0.018032405525445938,
0.12265755236148834,
-0.04717940464615822,
-0.07334399968385696,
0.04860958084464073,
-0.0027585451025515795,
-0.015766216441988945,
-0.07890798151493073,
-0.012659554369747639,
0.02988029643893242,
0.04344405233860016,
0.025958513841032982,
0.004482666030526161,
0.03378559648990631,
0.02552022971212864,
-0.028059322386980057,
-0.009006952866911888,
0.019563032314181328,
-0.01608947478234768,
0.022897042334079742,
0.14120496809482574,
0.0402563102543354,
-0.03218425065279007,
-0.009675108827650547,
0.22387664020061493,
-0.07181386649608612,
-0.1282431036233902,
-0.13350608944892883,
0.12634530663490295,
-0.012990965507924557,
0.020013844594359398,
-0.0037533934228122234,
-0.12953977286815643,
0.05407962575554848,
0.1483391672372818,
0.22594286501407623,
-0.04248432070016861,
0.0040476578287780285,
-0.018868451938033104,
-0.012248840183019638,
-0.04091062769293785,
0.05775294452905655,
0.04229394719004631,
0.11773346364498138,
-0.050851382315158844,
0.007983251474797726,
-0.04373757541179657,
-0.03693622723221779,
-0.05211998522281647,
0.041742414236068726,
0.05105980858206749,
-0.01875489577651024,
-0.04973211884498596,
0.12982510030269623,
-0.04545982927083969,
-0.22111915051937103,
0.0723661333322525,
-0.08318818360567093,
-0.11845694482326508,
-0.016761980950832367,
0.05163668468594551,
-0.0025518122129142284,
0.03716588392853737,
-0.039155080914497375,
0.0036578455474227667,
0.03912482038140297,
0.03808820992708206,
-0.07160212844610214,
-0.03249563276767731,
0.08841954916715622,
-0.05633446201682091,
0.15429306030273438,
-0.0493590272963047,
0.06335853040218353,
0.05727966129779816,
0.041365571320056915,
-0.1104651466012001,
0.016503168269991875,
0.04439297318458557,
-0.03825684264302254,
-0.0370347760617733,
0.10720326751470566,
-0.031255222856998444,
0.049175992608070374,
0.046765055507421494,
-0.15657128393650055,
0.004320133943110704,
-0.07951897382736206,
0.0358375608921051,
-0.13987433910369873,
0.017867477610707283,
-0.08544495701789856,
0.13045856356620789,
0.100873664021492,
-0.05907672643661499,
0.04659581556916237,
-0.08626095205545425,
0.05365411937236786,
0.055949460715055466,
-0.016297684982419014,
-0.009673954918980598,
-0.07379996031522751,
-0.026417767629027367,
0.0639943853020668,
0.027821365743875504,
-0.1898900419473648,
-0.07484101504087448,
-0.0324409157037735,
-0.06391341239213943,
-0.037133555859327316,
0.10995370149612427,
0.1558045595884323,
0.060808125883340836,
-0.02974729798734188,
-0.1757763773202896,
0.029559699818491936,
0.12521161139011383,
-0.1092069149017334,
-0.0093911811709404
] |
null | null | setfit |
# SetFit with intfloat/multilingual-e5-large
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 12 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 6 | <ul><li>'Are there any major whitespace opportunity in terms of Categories x Pack Segments in Cuernavaca?'</li><li>'In Colas MS which packsegment is not dominated by KOF in TT HM Orizaba 2022? At what price point we can launch an offering'</li><li>'I want to launch a new pack type in csd for kof. Tell me what'</li></ul> |
| 2 | <ul><li>'Since which quarter, the share decline started happening for Colas MS in Cuernavaca?'</li><li>'What is the Market share for Resto in colas MS at each size groups in TT HM Orizaba in 2022'</li><li>'How is the Jumex market share have changed over last three year at quaterly level in TT HM from 2019-2022'</li></ul> |
| 0 | <ul><li>'what are the top brands contributing to share loss for KOF in TT OP Cuernavaca in 2021'</li><li>'Which are the top contributing categoriesXconsumo to the share loss for KOF in Cuernavaca in 2022?'</li><li>'Which packs have driven the shares for the competition in Colas in FY 21-22?'</li></ul> |
| 10 | <ul><li>'Which pack segment shows opportunities to drive my market share in NCBS Colas SS?'</li><li>'How can I strategically expand my presence in specific packaging segments to enhance market penetration in CSD Sabores?'</li><li>'What are my priority pack segments to gain share in AGUA Colas SS?'</li></ul> |
| 5 | <ul><li>'Where should I play in terms of flavor in Sabores SS?'</li><li>'What are the untapped opportunities in Graffon?'</li><li>'What areas should I focus on to grow my market presence?'</li></ul> |
| 7 | <ul><li>'What are the upsizing benefits being offered in Coca-Cola NR Packs? Is there any recommendation to improve it?'</li><li>'Is Fanta a premium brand? How premium are its offerings as compared to other brands in Sabores?'</li><li>'Is there an opportunity to premiumize any offerings for coca-cola?'</li></ul> |
| 9 | <ul><li>'Which industries to prioritize to gain share in AGUA in Cuernavaca?'</li><li>'What measures can be taken to maximize headroom in the AGUA market?'</li><li>'List headroom opportunities for AGUA'</li></ul> |
| 11 | <ul><li>'How to win in the prioritized pack segments in Colas MS ?'</li><li>'How should KOF gain share in Colas MS in Cuernavaca? '</li><li>'How can I gain share in CSD Colas MS in Cuernavaca'</li></ul> |
| 8 | <ul><li>'What is the price range for CSD in TT HM?'</li><li>'what is PCO share for different price bracket in TT OP 2021'</li><li>'distribution wise, which non csd skus are doing the best?'</li></ul> |
| 3 | <ul><li>'What is the difference in offerings for KOF vs the key competitors in xx price bracket within CSD Colas in TT HM?'</li><li>'How should KOF gain share in <10 price bracket for NCB in TT HM'</li><li>'What are my pricing opportuities?'</li></ul> |
| 1 | <ul><li>"What are the primary factors behind Danone's market share decline in Orizaba for FY21-22?"</li><li>'Why is Resto losing share in Cuernavaca Colas SS RET Original?'</li><li>'What are the main factors contributing to the share gain of Jumex in Still Drinks MS in Orizaba for FY 2022?'</li></ul> |
| 4 | <ul><li>'How has the csd industry evolved in the last two years?'</li><li>'What is the industry mix for all KOF brands in TT HM Cuernavaca in 2022'</li><li>'What is the change in industry mix for coca-cola in TT HM Orizaba in 2021 to 2022'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.25 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("vgarg/fw_identification_model_e5_large_v5_12_02_24")
# Run inference
preds = model("Why is KOF losing share in Cuernavaca Colas MS RET Original?")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 4 | 13.7632 | 32 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 10 |
| 1 | 10 |
| 2 | 10 |
| 3 | 8 |
| 4 | 10 |
| 5 | 10 |
| 6 | 10 |
| 7 | 10 |
| 8 | 10 |
| 9 | 10 |
| 10 | 10 |
| 11 | 6 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (3, 3)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
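For reference, the list above maps directly onto setfit's `TrainingArguments`. The sketch below is a hedged reconstruction of such a run, not the exact training script: the few-shot `train_dataset` is a placeholder, since the real training data is not published.
```python
# Hypothetical reproduction sketch -- the placeholder dataset below stands in for
# the unpublished few-shot training data.
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

train_dataset = Dataset.from_dict({
    "text": ["placeholder query for class 0", "placeholder query for class 1"],
    "label": [0, 1],
})

# The default head is a scikit-learn LogisticRegression, matching the card.
model = SetFitModel.from_pretrained("intfloat/multilingual-e5-large")

args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(3, 3),
    num_iterations=20,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    sampling_strategy="oversampling",
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```
Because the head is a non-differentiable `LogisticRegression`, only the body hyperparameters drive the contrastive phase; the head is fit on the resulting embeddings afterwards, so `head_learning_rate` should have no effect in this configuration.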
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0035 | 1 | 0.3488 | - |
| 0.1754 | 50 | 0.1594 | - |
| 0.3509 | 100 | 0.0872 | - |
| 0.5263 | 150 | 0.0065 | - |
| 0.7018 | 200 | 0.0033 | - |
| 0.8772 | 250 | 0.0018 | - |
| 1.0526 | 300 | 0.001 | - |
| 1.2281 | 350 | 0.0008 | - |
| 1.4035 | 400 | 0.0008 | - |
| 1.5789 | 450 | 0.0006 | - |
| 1.7544 | 500 | 0.0005 | - |
| 1.9298 | 550 | 0.0005 | - |
| 2.1053 | 600 | 0.0005 | - |
| 2.2807 | 650 | 0.0004 | - |
| 2.4561 | 700 | 0.0003 | - |
| 2.6316 | 750 | 0.0004 | - |
| 2.8070 | 800 | 0.0004 | - |
| 2.9825 | 850 | 0.0004 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 2.3.1
- Transformers: 4.35.2
- PyTorch: 2.1.0+cu121
- Datasets: 2.17.0
- Tokenizers: 0.15.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | {"library_name": "setfit", "tags": ["setfit", "sentence-transformers", "text-classification", "generated_from_setfit_trainer"], "metrics": ["accuracy"], "widget": [{"text": "Why is KOF losing share in Cuernavaca Colas MS RET Original?"}, {"text": "Are there any whitespaces in terms of flavor for KOF within CSD Sabores?"}, {"text": "What is the trend of KOF\"s market share in Colas SS in Cuernavaca from 2019 to YTD 2023?"}, {"text": "Which categories have seen the some of the highest Share losses for KOF in Cuernavaca in 2022?"}, {"text": "Which Category X Pack can we see the major share gain and which parameters are driving the share gain in Cuernavaca?"}], "pipeline_tag": "text-classification", "inference": true, "base_model": "intfloat/multilingual-e5-large", "model-index": [{"name": "SetFit with intfloat/multilingual-e5-large", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "Unknown", "type": "unknown", "split": "test"}, "metrics": [{"type": "accuracy", "value": 0.25, "name": "Accuracy"}]}]}]} | text-classification | vgarg/fw_identification_model_e5_large_v5_12_02_24 | [
"setfit",
"safetensors",
"xlm-roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:intfloat/multilingual-e5-large",
"model-index",
"region:us"
] | 2024-02-12T10:43:13+00:00 | [
"2209.11055"
] | [] | TAGS
#setfit #safetensors #xlm-roberta #sentence-transformers #text-classification #generated_from_setfit_trainer #arxiv-2209.11055 #base_model-intfloat/multilingual-e5-large #model-index #region-us
| SetFit with intfloat/multilingual-e5-large
==========================================
This is a SetFit model that can be used for Text Classification. This SetFit model uses intfloat/multilingual-e5-large as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a Sentence Transformer with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
Model Details
-------------
### Model Description
* Model Type: SetFit
* Sentence Transformer body: intfloat/multilingual-e5-large
* Classification head: a LogisticRegression instance
* Maximum Sequence Length: 512 tokens
* Number of Classes: 12 classes
### Model Sources
* Repository: SetFit on GitHub
* Paper: Efficient Few-Shot Learning Without Prompts
* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts
### Model Labels
Evaluation
----------
### Metrics
Uses
----
### Direct Use for Inference
First install the SetFit library:
Then you can load this model and run inference.
Training Details
----------------
### Training Set Metrics
### Training Hyperparameters
* batch\_size: (16, 16)
* num\_epochs: (3, 3)
* max\_steps: -1
* sampling\_strategy: oversampling
* num\_iterations: 20
* body\_learning\_rate: (2e-05, 2e-05)
* head\_learning\_rate: 2e-05
* loss: CosineSimilarityLoss
* distance\_metric: cosine\_distance
* margin: 0.25
* end\_to\_end: False
* use\_amp: False
* warmup\_proportion: 0.1
* seed: 42
* eval\_max\_steps: -1
* load\_best\_model\_at\_end: False
### Training Results
### Framework Versions
* Python: 3.10.12
* SetFit: 1.0.3
* Sentence Transformers: 2.3.1
* Transformers: 4.35.2
* PyTorch: 2.1.0+cu121
* Datasets: 2.17.0
* Tokenizers: 0.15.1
### BibTeX
| [
"### Model Description\n\n\n* Model Type: SetFit\n* Sentence Transformer body: intfloat/multilingual-e5-large\n* Classification head: a LogisticRegression instance\n* Maximum Sequence Length: 512 tokens\n* Number of Classes: 12 classes",
"### Model Sources\n\n\n* Repository: SetFit on GitHub\n* Paper: Efficient Few-Shot Learning Without Prompts\n* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts",
"### Model Labels\n\n\n\nEvaluation\n----------",
"### Metrics\n\n\n\nUses\n----",
"### Direct Use for Inference\n\n\nFirst install the SetFit library:\n\n\nThen you can load this model and run inference.\n\n\nTraining Details\n----------------",
"### Training Set Metrics",
"### Training Hyperparameters\n\n\n* batch\\_size: (16, 16)\n* num\\_epochs: (3, 3)\n* max\\_steps: -1\n* sampling\\_strategy: oversampling\n* num\\_iterations: 20\n* body\\_learning\\_rate: (2e-05, 2e-05)\n* head\\_learning\\_rate: 2e-05\n* loss: CosineSimilarityLoss\n* distance\\_metric: cosine\\_distance\n* margin: 0.25\n* end\\_to\\_end: False\n* use\\_amp: False\n* warmup\\_proportion: 0.1\n* seed: 42\n* eval\\_max\\_steps: -1\n* load\\_best\\_model\\_at\\_end: False",
"### Training Results",
"### Framework Versions\n\n\n* Python: 3.10.12\n* SetFit: 1.0.3\n* Sentence Transformers: 2.3.1\n* Transformers: 4.35.2\n* PyTorch: 2.1.0+cu121\n* Datasets: 2.17.0\n* Tokenizers: 0.15.1",
"### BibTeX"
] | [
"TAGS\n#setfit #safetensors #xlm-roberta #sentence-transformers #text-classification #generated_from_setfit_trainer #arxiv-2209.11055 #base_model-intfloat/multilingual-e5-large #model-index #region-us \n",
"### Model Description\n\n\n* Model Type: SetFit\n* Sentence Transformer body: intfloat/multilingual-e5-large\n* Classification head: a LogisticRegression instance\n* Maximum Sequence Length: 512 tokens\n* Number of Classes: 12 classes",
"### Model Sources\n\n\n* Repository: SetFit on GitHub\n* Paper: Efficient Few-Shot Learning Without Prompts\n* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts",
"### Model Labels\n\n\n\nEvaluation\n----------",
"### Metrics\n\n\n\nUses\n----",
"### Direct Use for Inference\n\n\nFirst install the SetFit library:\n\n\nThen you can load this model and run inference.\n\n\nTraining Details\n----------------",
"### Training Set Metrics",
"### Training Hyperparameters\n\n\n* batch\\_size: (16, 16)\n* num\\_epochs: (3, 3)\n* max\\_steps: -1\n* sampling\\_strategy: oversampling\n* num\\_iterations: 20\n* body\\_learning\\_rate: (2e-05, 2e-05)\n* head\\_learning\\_rate: 2e-05\n* loss: CosineSimilarityLoss\n* distance\\_metric: cosine\\_distance\n* margin: 0.25\n* end\\_to\\_end: False\n* use\\_amp: False\n* warmup\\_proportion: 0.1\n* seed: 42\n* eval\\_max\\_steps: -1\n* load\\_best\\_model\\_at\\_end: False",
"### Training Results",
"### Framework Versions\n\n\n* Python: 3.10.12\n* SetFit: 1.0.3\n* Sentence Transformers: 2.3.1\n* Transformers: 4.35.2\n* PyTorch: 2.1.0+cu121\n* Datasets: 2.17.0\n* Tokenizers: 0.15.1",
"### BibTeX"
] | [
70,
61,
52,
8,
8,
31,
7,
177,
4,
58,
6
] | [
"passage: TAGS\n#setfit #safetensors #xlm-roberta #sentence-transformers #text-classification #generated_from_setfit_trainer #arxiv-2209.11055 #base_model-intfloat/multilingual-e5-large #model-index #region-us \n### Model Description\n\n\n* Model Type: SetFit\n* Sentence Transformer body: intfloat/multilingual-e5-large\n* Classification head: a LogisticRegression instance\n* Maximum Sequence Length: 512 tokens\n* Number of Classes: 12 classes### Model Sources\n\n\n* Repository: SetFit on GitHub\n* Paper: Efficient Few-Shot Learning Without Prompts\n* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts### Model Labels\n\n\n\nEvaluation\n----------### Metrics\n\n\n\nUses\n----### Direct Use for Inference\n\n\nFirst install the SetFit library:\n\n\nThen you can load this model and run inference.\n\n\nTraining Details\n----------------### Training Set Metrics### Training Hyperparameters\n\n\n* batch\\_size: (16, 16)\n* num\\_epochs: (3, 3)\n* max\\_steps: -1\n* sampling\\_strategy: oversampling\n* num\\_iterations: 20\n* body\\_learning\\_rate: (2e-05, 2e-05)\n* head\\_learning\\_rate: 2e-05\n* loss: CosineSimilarityLoss\n* distance\\_metric: cosine\\_distance\n* margin: 0.25\n* end\\_to\\_end: False\n* use\\_amp: False\n* warmup\\_proportion: 0.1\n* seed: 42\n* eval\\_max\\_steps: -1\n* load\\_best\\_model\\_at\\_end: False### Training Results### Framework Versions\n\n\n* Python: 3.10.12\n* SetFit: 1.0.3\n* Sentence Transformers: 2.3.1\n* Transformers: 4.35.2\n* PyTorch: 2.1.0+cu121\n* Datasets: 2.17.0\n* Tokenizers: 0.15.1### BibTeX"
] | [
-0.08092276751995087,
0.13488221168518066,
-0.005385030061006546,
0.06998131424188614,
0.12228512763977051,
0.059432502835989,
0.06500046700239182,
0.1417890191078186,
-0.035211581736803055,
0.1473623514175415,
0.05818699672818184,
0.11504901200532913,
0.11941958218812943,
0.1835976541042328,
-0.00804551225155592,
-0.2919101417064667,
0.016871895641088486,
-0.10975111275911331,
-0.03875916451215744,
0.08913757652044296,
0.10450556129217148,
-0.0756378024816513,
0.05550091341137886,
-0.04995589330792427,
-0.03591758385300636,
-0.01770750805735588,
-0.03904445841908455,
-0.02312321960926056,
0.01913958601653576,
0.04332682117819786,
0.026859339326620102,
-0.013949692249298096,
0.07001688331365585,
-0.311448335647583,
0.005691529251635075,
0.07190779596567154,
-0.007003374397754669,
0.07321098446846008,
0.09890063852071762,
-0.08204378932714462,
0.09229067713022232,
-0.10035387426614761,
0.09340537339448929,
0.04413455352187157,
-0.14286218583583832,
-0.16624972224235535,
-0.08034388720989227,
0.10080838948488235,
0.1526905745267868,
0.0752885639667511,
-0.06253224611282349,
0.03911828249692917,
-0.06207331269979477,
0.07142602652311325,
0.15972913801670074,
-0.2331656515598297,
-0.0752456784248352,
0.04189200699329376,
0.04494214057922363,
0.030211446806788445,
-0.1044231727719307,
-0.04752983897924423,
0.00157797965221107,
0.0510559044778347,
0.04967590793967247,
0.019267404451966286,
0.0722680389881134,
-0.009143900126218796,
-0.11323469877243042,
-0.051064081490039825,
0.07742269337177277,
0.04796077311038971,
-0.026108456775546074,
-0.17251503467559814,
0.00122917874250561,
-0.13242262601852417,
-0.049270521849393845,
0.025864167138934135,
0.008442973718047142,
-0.005982912145555019,
-0.012757213786244392,
0.025041768327355385,
-0.0256577730178833,
-0.04060233756899834,
0.05477815866470337,
0.017148710787296295,
0.05629381537437439,
-0.04183191433548927,
0.04921233654022217,
0.09428157657384872,
-0.0004961153026670218,
-0.17268531024456024,
-0.02526293322443962,
-0.02046908251941204,
-0.09417569637298584,
-0.033578939735889435,
0.010586450807750225,
0.046677008271217346,
0.05109609290957451,
0.21977220475673676,
-0.08033087104558945,
0.11124169081449509,
-0.01330001000314951,
0.02771729975938797,
-0.010320891626179218,
0.060810621827840805,
-0.09160356968641281,
-0.11269248276948929,
-0.07316194474697113,
0.09268121421337128,
-0.012412789277732372,
-0.01460015494376421,
-0.015235974453389645,
0.05074302479624748,
0.06177156791090965,
0.06837917119264603,
0.032033320516347885,
0.04209484905004501,
-0.10301608592271805,
-0.03659438714385033,
0.019461670890450478,
-0.14785772562026978,
0.051198069006204605,
0.04952959716320038,
-0.0903032124042511,
-0.06608200818300247,
0.04950731247663498,
-0.0005975457024760544,
-0.0599931925535202,
0.0833793506026268,
-0.059426017105579376,
0.014495281502604485,
-0.08859039098024368,
-0.10415897518396378,
0.04710961878299713,
-0.025705158710479736,
-0.03289908170700073,
-0.029687989503145218,
-0.12177350372076035,
-0.10556157678365707,
0.06765960156917572,
-0.12292511016130447,
-0.05467487871646881,
-0.11261735111474991,
-0.11738526076078415,
0.046847883611917496,
0.012350977398455143,
0.10484963655471802,
-0.04919048771262169,
0.05710233747959137,
-0.022965066134929657,
0.07166251540184021,
0.15121270716190338,
0.045000456273555756,
-0.03377820923924446,
0.07863738387823105,
-0.1820461004972458,
0.1421816200017929,
-0.1055135726928711,
0.05978095903992653,
-0.15948393940925598,
-0.06319601088762283,
-0.021720806136727333,
0.02089960314333439,
0.08774222433567047,
0.13055644929409027,
-0.20720238983631134,
-0.045198455452919006,
0.23743242025375366,
-0.0665992945432663,
-0.10545992851257324,
0.07343382388353348,
-0.04388332739472389,
0.08088138699531555,
0.032396409660577774,
0.10862415283918381,
0.09336582571268082,
-0.08063193410634995,
-0.007006441708654165,
-0.08844073861837387,
0.046324148774147034,
0.167985737323761,
0.04661683365702629,
-0.0430859737098217,
0.0352134071290493,
0.015304775908589363,
-0.0020701424218714237,
0.011343876831233501,
-0.06657659262418747,
-0.08403139561414719,
0.014091605320572853,
-0.0682935044169426,
0.01658892072737217,
0.035057537257671356,
-0.019941875711083412,
-0.06323375552892685,
-0.14769430458545685,
0.026395834982395172,
0.06669815629720688,
-0.04529030993580818,
-0.0010300701251253486,
-0.08203056454658508,
-0.00020519792451523244,
0.07030568271875381,
0.009926228784024715,
-0.1761941909790039,
-0.03623885661363602,
0.021294014528393745,
-0.02020077593624592,
0.05296595022082329,
-0.07069866359233856,
0.06723622977733612,
0.04694117233157158,
-0.05614297837018967,
-0.049069780856370926,
0.036410413682460785,
0.014146304689347744,
-0.06377965211868286,
-0.24485161900520325,
-0.02766340970993042,
-0.02038394659757614,
0.21884529292583466,
-0.2487073540687561,
0.04593924060463905,
-0.029134904965758324,
0.14338073134422302,
0.010419197380542755,
-0.05136125534772873,
0.020262330770492554,
0.009815465658903122,
-0.007616747170686722,
-0.08651763200759888,
0.020400773733854294,
-0.010513721965253353,
-0.060698386281728745,
-0.0433504544198513,
-0.1950751096010208,
-0.04393380507826805,
0.09617987275123596,
0.018223680555820465,
-0.1897355616092682,
-0.08929649740457535,
-0.028519775718450546,
-0.056376244872808456,
-0.06420813500881195,
-0.04309437796473503,
0.14647196233272552,
0.026475030928850174,
0.07932653278112411,
-0.04242602735757828,
-0.061136554926633835,
-0.0030848123133182526,
-0.016068249940872192,
-0.010856928303837776,
0.18623381853103638,
-0.006631449330598116,
-0.11661519855260849,
0.11105571687221527,
0.02698388695716858,
-0.01946292072534561,
0.09782387316226959,
-0.03171022608876228,
-0.06626470386981964,
-0.08335810154676437,
0.07510685920715332,
0.06572999805212021,
0.06168382614850998,
-0.07577510178089142,
0.03031962923705578,
0.024241428822278976,
0.002470247447490692,
0.0047524468973279,
-0.10198976844549179,
0.0032557102385908365,
0.018242957070469856,
-0.041184552013874054,
0.02349308878183365,
-0.04507829621434212,
0.021496238186955452,
0.08833734691143036,
0.02425401471555233,
0.04056350886821747,
-0.01237969659268856,
-0.056117262691259384,
-0.11998695880174637,
0.19772472977638245,
-0.1342257708311081,
-0.1762700378894806,
-0.08768322318792343,
-0.006370836403220892,
0.013395563699305058,
-0.03325359895825386,
0.009592043235898018,
-0.06791447103023529,
-0.05797818303108215,
-0.1122596263885498,
0.03598153591156006,
0.042831603437662125,
-0.0482848659157753,
-0.05071798339486122,
0.04569632187485695,
0.09410413354635239,
-0.08455409854650497,
0.011466403491795063,
0.022164154797792435,
-0.05876259505748749,
0.01923208124935627,
0.005092830862849951,
0.0320848785340786,
0.15084612369537354,
0.07317740470170975,
0.031898848712444305,
-0.002084338106215,
0.23163335025310516,
-0.08987400680780411,
0.04737410321831703,
0.07187274843454361,
-0.010763738304376602,
0.0725317895412445,
0.21770130097866058,
0.035484347492456436,
-0.08072742074728012,
0.05761304497718811,
0.06124222278594971,
-0.02007865533232689,
-0.21609458327293396,
-0.022838348522782326,
-0.04376959800720215,
0.02066069468855858,
0.14351895451545715,
0.03116338886320591,
0.06197357177734375,
0.0423470176756382,
-0.05409906804561615,
-0.025700414553284645,
0.11150307208299637,
0.08986476808786392,
-0.00536708626896143,
0.03146231546998024,
0.0920461043715477,
-0.015267382375895977,
0.01779111847281456,
0.023332735523581505,
-0.010264805518090725,
0.17300069332122803,
-0.01705513522028923,
0.10994749516248703,
0.08250361680984497,
0.13415023684501648,
-0.042125388979911804,
0.03840266540646553,
-0.02203073352575302,
0.029342181980609894,
0.024944884702563286,
-0.06487034261226654,
-0.004113042261451483,
0.07807736843824387,
0.026951825246214867,
0.03292376175522804,
-0.07227044552564621,
0.002227234421297908,
0.09571779519319534,
0.15770944952964783,
0.09653040021657944,
-0.255490243434906,
-0.052774928510189056,
0.06013791635632515,
-0.06735947728157043,
-0.0664786845445633,
-0.017418893054127693,
0.06971607357263565,
-0.12574341893196106,
0.066621333360672,
-0.06002378091216087,
0.0907701700925827,
-0.04374983161687851,
-0.003983006346970797,
0.06671798229217529,
0.12225471436977386,
-0.008693531155586243,
0.05099133402109146,
-0.21452754735946655,
0.1563791036605835,
0.004220508970320225,
0.07999277859926224,
-0.06826543807983398,
0.05116431042551994,
0.04739363119006157,
-0.10239240527153015,
0.1322854906320572,
-0.01421529334038496,
-0.14246609807014465,
-0.16510450839996338,
-0.07003527134656906,
-0.03567773476243019,
0.11808966100215912,
-0.12694846093654633,
0.11873629689216614,
-0.0025389704387634993,
-0.03171220421791077,
0.00585919851437211,
-0.05098713934421539,
-0.14025717973709106,
-0.1338537633419037,
0.017294591292738914,
-0.08237139135599136,
0.059334393590688705,
-0.07476219534873962,
-0.03287490829825401,
-0.05392675846815109,
0.16558662056922913,
-0.2261994183063507,
-0.06864471733570099,
-0.14183439314365387,
0.11583973467350006,
0.16334421932697296,
-0.08316841721534729,
0.05018756538629532,
0.024760419502854347,
0.11889082193374634,
0.01817430928349495,
-0.019926339387893677,
0.11468949168920517,
-0.06472615152597427,
-0.2134348303079605,
-0.04535245522856712,
0.16730739176273346,
0.10394515842199326,
0.0786639004945755,
-0.009074523113667965,
0.041763272136449814,
0.01019215863198042,
-0.09400174021720886,
0.03234950080513954,
0.08247634768486023,
0.08105187118053436,
0.07362081110477448,
-0.07861772179603577,
-0.05495233088731766,
-0.10860369354486465,
0.0006077579455450177,
0.0846838504076004,
0.21318134665489197,
-0.07879654318094254,
0.08388561010360718,
0.008817803114652634,
-0.0825689435005188,
-0.19172453880310059,
0.006076136138290167,
0.11118107289075851,
0.017695138230919838,
0.05046144872903824,
-0.2019374966621399,
0.08720250427722931,
0.08229272812604904,
-0.003650939790531993,
0.0875452384352684,
-0.3124682307243347,
-0.1420510858297348,
0.06141624227166176,
0.05761121213436127,
-0.17528454959392548,
-0.15811754763126373,
-0.08421362936496735,
-0.016335925087332726,
-0.09308431297540665,
0.11158362776041031,
-0.0767836645245552,
0.0714276134967804,
0.0448751300573349,
0.009204004891216755,
0.03718903660774231,
-0.03228521719574928,
0.15775741636753082,
0.026422584429383278,
0.05925065279006958,
-0.07956225425004959,
-0.043719179928302765,
-0.03300520032644272,
-0.08294069766998291,
0.0753820613026619,
-0.06911318749189377,
0.014547256752848625,
-0.11975327134132385,
-0.02492586337029934,
-0.08259782195091248,
-0.009010312147438526,
-0.10564083606004715,
-0.020514097064733505,
-0.021641798317432404,
0.11907588690519333,
0.1016886904835701,
0.004970171023160219,
0.05541969835758209,
-0.06214357912540436,
0.14528043568134308,
0.17603185772895813,
0.12180382758378983,
0.1042126938700676,
-0.06809836626052856,
0.028278958052396774,
0.004964984487742186,
-0.008004005067050457,
-0.19558654725551605,
0.052354246377944946,
0.11307482421398163,
0.011485820636153221,
0.18636982142925262,
0.031776439398527145,
-0.12453074753284454,
-0.032330192625522614,
0.07948558032512665,
-0.05451539158821106,
-0.09763940423727036,
0.017510414123535156,
0.0736333578824997,
-0.17668461799621582,
-0.08329792320728302,
0.07272536307573318,
-0.028365153819322586,
-0.013005911372601986,
0.030312906950712204,
0.1152569055557251,
-0.02526799403131008,
0.1809113770723343,
0.03893354535102844,
0.0794084370136261,
-0.09615717828273773,
0.10628923773765564,
0.07853839546442032,
-0.04160071164369583,
0.048547182232141495,
0.18776342272758484,
-0.05607816204428673,
-0.04727817326784134,
0.023143701255321503,
0.10313065350055695,
0.055343739688396454,
-0.01474197581410408,
-0.01863861083984375,
-0.1404062956571579,
0.06998158246278763,
0.0957639217376709,
0.006925755180418491,
0.015497330576181412,
0.028367385268211365,
0.01821761019527912,
-0.07307238131761551,
0.13789089024066925,
0.1378311961889267,
0.03652428835630417,
-0.04090476036071777,
0.1344892829656601,
0.009199120104312897,
-0.03798374533653259,
0.013185504823923111,
0.004055480007082224,
-0.16782712936401367,
0.006985720247030258,
-0.07186871767044067,
0.026524558663368225,
-0.1137685552239418,
-0.015918197110295296,
0.015172770246863365,
-0.01075712963938713,
-0.009533664211630821,
-0.005422491114586592,
-0.07814639061689377,
-0.10630304366350174,
-0.03975465148687363,
0.09852100908756256,
-0.12189651280641556,
-0.03954415023326874,
0.048435404896736145,
-0.12508738040924072,
0.07591639459133148,
0.034244105219841,
0.022164322435855865,
0.012016849592328072,
-0.11496692150831223,
0.025098273530602455,
-0.016018275171518326,
-0.024013571441173553,
0.03318418562412262,
-0.20784585177898407,
0.004640313796699047,
-0.10313756763935089,
-0.03625442832708359,
0.023050468415021896,
-0.010260631330311298,
-0.11884146928787231,
0.0640348494052887,
-0.04873181879520416,
-0.05809559300541878,
-0.07496680319309235,
0.056229814887046814,
0.09313339740037918,
-0.03333976864814758,
0.12053661048412323,
-0.07398790121078491,
0.09335068613290787,
-0.22298145294189453,
-0.011198174208402634,
0.011130742728710175,
-0.03092033974826336,
0.014584987424314022,
-0.022453537210822105,
0.11112987250089645,
-0.051020897924900055,
0.05177120864391327,
-0.022253716364502907,
-0.01497021596878767,
0.03948788344860077,
-0.05008332058787346,
-0.004476124886423349,
0.09747041016817093,
0.05888005718588829,
0.0289668720215559,
-0.03736695274710655,
-0.015813425183296204,
0.0010433816350996494,
0.01830522157251835,
-0.031046906486153603,
0.13884897530078888,
0.1531239002943039,
0.06871549040079117,
0.03592093288898468,
0.059259191155433655,
-0.14170309901237488,
-0.038660407066345215,
0.21427008509635925,
-0.05315826088190079,
0.03922532498836517,
-0.053905367851257324,
0.08153276145458221,
0.07571796327829361,
-0.24679867923259735,
0.07363561540842056,
-0.06772506237030029,
-0.11039863526821136,
-0.062337230890989304,
-0.16212663054466248,
-0.06996610760688782,
-0.06573216617107391,
-0.015565870329737663,
-0.12220412492752075,
0.03462263569235802,
0.11261545866727829,
0.01540329772979021,
0.03300146013498306,
0.0982871875166893,
-0.02288171648979187,
-0.0027233115397393703,
0.08342883735895157,
0.04593581706285477,
0.018156496807932854,
-0.027985025197267532,
-0.04098406061530113,
-0.005701328162103891,
0.031146874651312828,
0.06659477949142456,
0.01676391251385212,
-0.009257291443645954,
0.03931054845452309,
-0.013292709365487099,
-0.11706576496362686,
0.03685722500085831,
-0.03103204257786274,
-0.018220197409391403,
0.14368559420108795,
0.0676676332950592,
-0.015586244873702526,
-0.022450389340519905,
0.20073308050632477,
-0.07804395258426666,
-0.06735652685165405,
-0.19235862791538239,
0.17471151053905487,
0.013480564579367638,
0.006990770809352398,
0.001506633241660893,
-0.1191277727484703,
0.00978526659309864,
0.1489921659231186,
0.17361010611057281,
-0.0493568480014801,
0.000678383803460747,
0.04445524886250496,
-0.003534521907567978,
0.005450690630823374,
0.03253939747810364,
0.09936033189296722,
0.1019415408372879,
-0.06691966205835342,
0.06732664257287979,
0.01306432206183672,
-0.1081828698515892,
-0.08404019474983215,
0.07414094358682632,
0.02757592312991619,
0.03868752345442772,
-0.02364921197295189,
0.1499776691198349,
-0.13424132764339447,
-0.158379927277565,
0.07668890058994293,
-0.1467815786600113,
-0.1688423752784729,
-0.059172727167606354,
0.008086081594228745,
0.04397066310048103,
0.06727290898561478,
0.03567226603627205,
-0.02513142116367817,
0.10017186403274536,
0.02025206759572029,
-0.007527110632508993,
-0.0766119435429573,
0.03421785682439804,
-0.05171409994363785,
0.21565838158130646,
-0.021343635395169258,
-0.013136211782693863,
0.1393928825855255,
-0.01900208182632923,
-0.0930822491645813,
0.016668271273374557,
0.08629578351974487,
-0.06898003071546555,
0.05784273147583008,
0.17574423551559448,
-0.03628040477633476,
0.10112476348876953,
0.09784111380577087,
-0.10622665286064148,
0.01974027417600155,
-0.0623435415327549,
-0.04646330326795578,
-0.07842937111854553,
0.05207139253616333,
-0.03267127275466919,
0.13810233771800995,
0.23566710948944092,
-0.07436338812112808,
-0.0041680471040308475,
-0.029925955459475517,
-0.0033334530889987946,
-0.018736863508820534,
0.11392539739608765,
-0.033276185393333435,
-0.21807309985160828,
0.051091473549604416,
0.0067223114892840385,
0.08290626108646393,
-0.23306392133235931,
-0.09272104501724243,
0.05931103229522705,
-0.03179682791233063,
-0.09277311712503433,
0.1431775689125061,
0.09139861166477203,
0.02658795937895775,
-0.04785595461726189,
-0.12143567949533463,
-0.004857594612985849,
0.20575974881649017,
-0.08067529648542404,
-0.053708720952272415
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
Fine-tuned on a cryptic crossword dataset. Intended to be closed-book, but you do have to provide the clue's context in the format:
```question: Queer fish in dam do (6) context: Queer```
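As a minimal, hedged sketch (not the author's documented usage), a clue in that format can be passed to this checkpoint through the `transformers` API; the repository id is taken from this record, while the generation settings (`num_beams`, `max_new_tokens`) are illustrative assumptions.
```python
# Minimal sketch: querying the fine-tuned T5 checkpoint with the documented
# "question: ... context: ..." prompt format. Generation settings are
# illustrative assumptions, not values specified by the author.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "misterwavey/t5-small-ssm-cc1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "question: Queer fish in dam do (6) context: Queer"
inputs = tokenizer(prompt, return_tensors="pt")

# Beam search keeps the short answer deterministic; adjust as needed.
outputs = model.generate(**inputs, max_new_tokens=8, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```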
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text2text-generation | misterwavey/t5-small-ssm-cc1 | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T10:45:41+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
Fine-tuned on a cryptic crossword dataset. Intended to be closed-book, but you do have to provide the clue's context in the format:
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nFinetrained on a cryptic crosswords dataset. Intended to be closed book but you do have to provide the clue's context in the format: \n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nFinetrained on a cryptic crosswords dataset. Intended to be closed book but you do have to provide the clue's context in the format: \n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
58,
6,
3,
91,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nFinetrained on a cryptic crosswords dataset. Intended to be closed book but you do have to provide the clue's context in the format: \n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05795160308480263,
0.19928812980651855,
-0.005282493773847818,
0.011779927648603916,
0.09585753083229065,
0.012127676047384739,
0.06381694972515106,
0.1143803820014,
-0.04212808981537819,
0.11798348277807236,
0.034200750291347504,
0.11195646971464157,
0.11900221556425095,
0.14388343691825867,
-0.015865866094827652,
-0.22618460655212402,
0.04366437718272209,
-0.09807690978050232,
-0.011355316266417503,
0.119535431265831,
0.1549789309501648,
-0.0892418846487999,
0.0627814456820488,
-0.03121148608624935,
-0.01896514557301998,
-0.03379681333899498,
-0.056910596787929535,
-0.04233882576227188,
0.04913285747170448,
0.04756223410367966,
0.07289055734872818,
0.003295816481113434,
0.07697925716638565,
-0.2932658791542053,
0.016198694705963135,
0.06205251067876816,
-0.01326048094779253,
0.06759355217218399,
0.08795247972011566,
-0.05064431205391884,
0.11477263271808624,
-0.0733281597495079,
0.13607607781887054,
0.08139767497777939,
-0.09347101300954819,
-0.16062770783901215,
-0.07638637721538544,
0.10378508269786835,
0.17542670667171478,
0.053925637155771255,
-0.03455322980880737,
0.11378983408212662,
-0.08104129880666733,
0.0275272186845541,
0.053257767111063004,
-0.10561119765043259,
-0.0688277930021286,
0.09855321049690247,
0.10611722618341446,
0.05434725433588028,
-0.12650412321090698,
-0.03229287266731262,
0.01657767966389656,
0.021237382665276527,
0.06558256596326828,
0.013935631141066551,
0.12499536573886871,
0.026100536808371544,
-0.1300472915172577,
-0.06436484307050705,
0.13445048034191132,
0.033460583537817,
-0.0396650992333889,
-0.23146803677082062,
-0.027826573699712753,
-0.02590417116880417,
-0.03223048895597458,
-0.04891382157802582,
0.03963825851678848,
0.008352020755410194,
0.09704215079545975,
-0.020337771624326706,
-0.08236631006002426,
-0.03902548924088478,
0.07771909981966019,
0.05197184160351753,
0.028003064915537834,
-0.02801736816763878,
0.026457134634256363,
0.11627206206321716,
0.07647696137428284,
-0.11468470096588135,
-0.04880411922931671,
-0.05679432302713394,
-0.08036632835865021,
-0.047924596816301346,
0.031459275633096695,
0.028630023822188377,
0.07459396868944168,
0.2675952911376953,
0.028326531872153282,
0.053771909326314926,
0.027168946340680122,
0.012124881148338318,
0.03656868636608124,
0.0880676880478859,
-0.055225029587745667,
-0.12846247851848602,
-0.016939470544457436,
0.08832297474145889,
0.004634546581655741,
-0.027668096125125885,
-0.028629466891288757,
0.036999352276325226,
0.05355685576796532,
0.11857832968235016,
0.09604497253894806,
0.022547326982021332,
-0.07973543554544449,
-0.05046425387263298,
0.18638622760772705,
-0.1587795466184616,
0.027115734294056892,
0.028795475140213966,
-0.027118733152747154,
-0.058683499693870544,
0.007963087409734726,
0.042081721127033234,
-0.042649053037166595,
0.0935748741030693,
-0.05242137983441353,
-0.052488651126623154,
-0.1078203096985817,
-0.026487288996577263,
0.04707929864525795,
0.0076561663299798965,
-0.031509239226579666,
-0.022894280031323433,
-0.09082208573818207,
-0.08827085793018341,
0.09106556326150894,
-0.06721650809049606,
-0.08153355121612549,
-0.03026718646287918,
-0.08497220277786255,
0.01985703594982624,
0.022762978449463844,
0.09220997989177704,
-0.02552352473139763,
0.04439634084701538,
-0.05001934990286827,
0.04478561505675316,
0.09805110841989517,
0.030992163345217705,
-0.0675569400191307,
0.0648561343550682,
-0.2016003280878067,
0.097183458507061,
-0.07632796466350555,
0.045225586742162704,
-0.16303123533725739,
-0.02067229524254799,
0.024897083640098572,
0.010920028202235699,
-0.001240321435034275,
0.13657459616661072,
-0.21307174861431122,
-0.007554927840828896,
0.18640662729740143,
-0.1018555536866188,
-0.08156745135784149,
0.0711369514465332,
-0.05663756653666496,
0.12842364609241486,
0.03229309245944023,
0.015290449373424053,
0.058414969593286514,
-0.11679182201623917,
-0.004605989903211594,
-0.048203401267528534,
-0.011115318164229393,
0.13578374683856964,
0.0850592777132988,
-0.09176941961050034,
0.04203294962644577,
0.01592203788459301,
-0.03777758777141571,
-0.07245749235153198,
-0.02937469072639942,
-0.10458052903413773,
0.009833727963268757,
-0.07033774256706238,
0.011726465076208115,
-0.0061031230725348,
-0.09113635122776031,
-0.03410288691520691,
-0.1518668383359909,
-0.03655749559402466,
0.08261468261480331,
-0.004180578514933586,
-0.01778312399983406,
-0.10320312529802322,
0.019900351762771606,
0.02131357602775097,
-0.011556016281247139,
-0.13039611279964447,
-0.043067097663879395,
0.0355253666639328,
-0.12396629899740219,
0.031681615859270096,
-0.055840298533439636,
0.04461173713207245,
0.010964438319206238,
-0.04322732985019684,
-0.02171998843550682,
0.012844979763031006,
0.007469862699508667,
-0.03137645125389099,
-0.22360363602638245,
-0.016734866425395012,
-0.036308418959379196,
0.15633614361286163,
-0.220536008477211,
0.04101194813847542,
0.0759764313697815,
0.14383980631828308,
-0.0003150202101096511,
-0.05587134510278702,
0.025149626657366753,
-0.06066000834107399,
-0.021923569962382317,
-0.05865180492401123,
0.004340619780123234,
-0.009998329915106297,
-0.043926600366830826,
0.012434114702045918,
-0.16360697150230408,
-0.034785397350788116,
0.08990723639726639,
0.04304637759923935,
-0.1272251009941101,
-0.04028982296586037,
-0.028645556420087814,
-0.061934586614370346,
-0.049705106765031815,
-0.06125706434249878,
0.12133188545703888,
0.0525679774582386,
0.034177277237176895,
-0.07460953295230865,
-0.07115445286035538,
-0.01300098281353712,
-0.021287826821208,
-0.01929445192217827,
0.09777981787919998,
0.06994086503982544,
-0.13878798484802246,
0.09542020410299301,
0.06093577668070793,
0.04308347776532173,
0.08603634685277939,
-0.021571530029177666,
-0.07132729887962341,
-0.027564235031604767,
0.058270324021577835,
0.02366175688803196,
0.12014596164226532,
-0.08650979399681091,
0.04523579776287079,
0.04181785136461258,
-0.03990592435002327,
0.028057841584086418,
-0.08533547818660736,
0.017210450023412704,
0.017390072345733643,
-0.022363927215337753,
0.037427257746458054,
-0.028774989768862724,
0.021601108834147453,
0.08900325745344162,
0.04463161528110504,
0.024237852543592453,
0.024720940738916397,
-0.06257670372724533,
-0.11967668682336807,
0.16253763437271118,
-0.10800153762102127,
-0.2046995609998703,
-0.13488876819610596,
0.04718196392059326,
0.03715172037482262,
-0.014709338545799255,
0.0031467678491026163,
-0.04954829812049866,
-0.09940692782402039,
-0.08687350153923035,
0.0037684321869164705,
0.0430956669151783,
-0.09613434970378876,
-0.04770524427294731,
0.046350110322237015,
0.04077271372079849,
-0.1383245885372162,
0.019544225186109543,
0.04731137678027153,
-0.0940970703959465,
-0.003619559109210968,
0.05533437058329582,
0.08403343707323074,
0.18291230499744415,
0.011224490590393543,
-0.017188243567943573,
0.02569073997437954,
0.23243097960948944,
-0.13612040877342224,
0.10402184724807739,
0.11735101044178009,
-0.07386759668588638,
0.0863211527466774,
0.22538664937019348,
0.04018131643533707,
-0.10120487213134766,
0.029707422479987144,
0.041869450360536575,
-0.020154371857643127,
-0.2606181800365448,
-0.07170096784830093,
-0.019937392324209213,
-0.06523686647415161,
0.08199236541986465,
0.08913584053516388,
0.10703079402446747,
0.03735748305916786,
-0.09252200275659561,
-0.07800900191068649,
0.07079450786113739,
0.1131778359413147,
-0.002372660441324115,
0.0065698884427547455,
0.08409932255744934,
-0.03384093567728996,
0.013834643177688122,
0.08228921890258789,
0.022075865417718887,
0.14130429923534393,
0.043649375438690186,
0.18412859737873077,
0.08064569532871246,
0.09323640167713165,
0.0006038497667759657,
0.030590685084462166,
0.0047219293192029,
0.04675109311938286,
0.0004398166493047029,
-0.07677894085645676,
-0.027310503646731377,
0.12260334938764572,
0.0550728365778923,
0.00788582768291235,
0.016234660521149635,
-0.03998379409313202,
0.07039566338062286,
0.21415667235851288,
-0.012260649353265762,
-0.18641029298305511,
-0.05575864389538765,
0.07611414045095444,
-0.09922727942466736,
-0.10746539384126663,
-0.0026429579593241215,
0.03187163174152374,
-0.17560753226280212,
0.03787757083773613,
-0.037664420902729034,
0.10924846678972244,
-0.10785871744155884,
-0.024591870605945587,
0.07292332500219345,
0.06095395237207413,
-0.019361278042197227,
0.07171071320772171,
-0.20543229579925537,
0.11552049219608307,
0.005553957540541887,
0.06840323656797409,
-0.09807268530130386,
0.09052707999944687,
0.00007252176874317229,
-0.026382924988865852,
0.16031920909881592,
-0.005704080685973167,
-0.07013165950775146,
-0.05432261899113655,
-0.08527009189128876,
-0.005233077798038721,
0.10775313526391983,
-0.13659150898456573,
0.0788506492972374,
-0.0336792878806591,
-0.024970626458525658,
0.0022195838391780853,
-0.0837036594748497,
-0.11944414675235748,
-0.17490437626838684,
0.055567458271980286,
-0.12025492638349533,
0.044729847460985184,
-0.10656820982694626,
-0.02983083948493004,
-0.026434585452079773,
0.190355122089386,
-0.22490353882312775,
-0.08551383763551712,
-0.13841530680656433,
-0.10719981044530869,
0.14474716782569885,
-0.04598908871412277,
0.09606622159481049,
-0.011186913587152958,
0.1593361645936966,
0.005191328935325146,
-0.004475496709346771,
0.08595962077379227,
-0.09142456948757172,
-0.20457406342029572,
-0.06648152321577072,
0.1717994660139084,
0.10942290723323822,
0.03397198021411896,
0.003196760779246688,
0.032307837158441544,
-0.021464258432388306,
-0.10795275121927261,
0.012823376804590225,
0.12819167971611023,
0.08238451182842255,
0.012452580034732819,
-0.03350626677274704,
-0.13364125788211823,
-0.08545850217342377,
-0.0428166501224041,
0.038153402507305145,
0.17154206335544586,
-0.06819427758455276,
0.15416856110095978,
0.13133347034454346,
-0.06840679049491882,
-0.19579678773880005,
0.008739097975194454,
0.028694141656160355,
-0.013368007726967335,
0.005607674829661846,
-0.18617960810661316,
0.07799846678972244,
0.006758463568985462,
-0.051762308925390244,
0.0836871862411499,
-0.17234016954898834,
-0.13974589109420776,
0.08075147122144699,
0.046839140355587006,
-0.20236442983150482,
-0.14007282257080078,
-0.09751323610544205,
-0.036570463329553604,
-0.15828485786914825,
0.08339148759841919,
0.0167519673705101,
0.011182278394699097,
0.033627841621637344,
0.011390561237931252,
0.03215564414858818,
-0.05439374968409538,
0.18213696777820587,
0.003546611638739705,
0.0302570853382349,
-0.09146486967802048,
-0.09988626837730408,
0.02586611919105053,
-0.041982974857091904,
0.0710178017616272,
-0.020275522023439407,
0.017004474997520447,
-0.11500648409128189,
-0.04493312910199165,
-0.05539782717823982,
0.02140277624130249,
-0.10533033311367035,
-0.09383488446474075,
-0.04714478924870491,
0.08706165105104446,
0.1077154278755188,
-0.02142292819917202,
-0.031049923971295357,
-0.07929683476686478,
0.0724935531616211,
0.21497008204460144,
0.20876100659370422,
0.07551934570074081,
-0.059966299682855606,
0.008073145523667336,
-0.019268544390797615,
0.04482307657599449,
-0.17402943968772888,
0.05873405188322067,
0.06525226682424545,
0.018149642273783684,
0.11258252710103989,
-0.024597855284810066,
-0.14293932914733887,
-0.0724528506398201,
0.059707161039114,
-0.06549928337335587,
-0.1967051774263382,
0.006225781049579382,
0.0507291778922081,
-0.17043359577655792,
-0.04625912383198738,
0.03279775008559227,
-0.010768436826765537,
-0.033957671374082565,
0.010066896677017212,
0.09410840272903442,
-0.004519202280789614,
0.08832067251205444,
0.06499090045690536,
0.09558041393756866,
-0.09539978206157684,
0.08729603886604309,
0.09682498127222061,
-0.06371648609638214,
0.026767004281282425,
0.10643479228019714,
-0.056847888976335526,
-0.0370270274579525,
0.057687193155288696,
0.07935084402561188,
0.022050559520721436,
-0.04946451261639595,
0.00880793109536171,
-0.08682231605052948,
0.056592877954244614,
0.11494595557451248,
0.028536973521113396,
0.009585963562130928,
0.055210765451192856,
0.03184351325035095,
-0.08420921117067337,
0.11751819401979446,
0.05978672206401825,
0.01104363426566124,
-0.03635348379611969,
-0.011328858323395252,
0.004311414901167154,
-0.039646927267313004,
-0.0016321903094649315,
-0.006705539301037788,
-0.07647864520549774,
-0.008729519322514534,
-0.12259546667337418,
0.025002311915159225,
-0.07873077690601349,
0.01773500256240368,
0.024449611082673073,
-0.028926536440849304,
0.007376181427389383,
0.0007159081869758666,
-0.07348133623600006,
-0.057756923139095306,
-0.014503502286970615,
0.10616780817508698,
-0.15961389243602753,
0.022934339940547943,
0.09428167343139648,
-0.10586848109960556,
0.09681758284568787,
-0.0072182160802185535,
-0.013768773525953293,
0.005244025494903326,
-0.171084463596344,
0.05405346304178238,
-0.024456309154629707,
0.004692418966442347,
-0.0019330531358718872,
-0.18884916603565216,
0.0023140208795666695,
-0.03278167173266411,
-0.05621291324496269,
-0.009569632820785046,
-0.006619232706725597,
-0.11686535179615021,
0.09992709010839462,
0.013655356131494045,
-0.09111274033784866,
-0.029773462563753128,
0.030183233320713043,
0.08661947399377823,
-0.04038918763399124,
0.13602280616760254,
-0.022048408165574074,
0.06713585555553436,
-0.16538763046264648,
-0.00795265194028616,
-0.0021400097757577896,
0.02490454912185669,
-0.0496375672519207,
-0.01526102889329195,
0.05195009335875511,
-0.021379167214035988,
0.1838264912366867,
-0.03611990809440613,
0.008777685463428497,
0.06176931411027908,
0.02373037301003933,
0.009185687638819218,
0.09612800180912018,
0.06690347194671631,
0.010662365704774857,
0.005120737012475729,
0.010057808831334114,
-0.0513584204018116,
-0.04224489629268646,
-0.1734611839056015,
0.07209991663694382,
0.2038576304912567,
0.09532591700553894,
-0.020290616899728775,
0.05802363157272339,
-0.116078682243824,
-0.08878971636295319,
0.1248512715101242,
-0.02374345436692238,
-0.01322826836258173,
-0.07626376301050186,
0.13629041612148285,
0.14569322764873505,
-0.2002793252468109,
0.07671287655830383,
-0.07067473232746124,
-0.05044691264629364,
-0.10433215647935867,
-0.2021869719028473,
-0.06332078576087952,
-0.025908762589097023,
-0.011585387401282787,
-0.06721321493387222,
0.06637271493673325,
0.08459249883890152,
-0.0034814730752259493,
-0.013989550061523914,
0.07975263148546219,
-0.04514596611261368,
-0.012602005153894424,
0.037767909467220306,
0.060257330536842346,
0.0265960656106472,
-0.06352473050355911,
0.0037857082206755877,
-0.0161980539560318,
0.05052025988698006,
0.07154157757759094,
0.03970012813806534,
-0.024193525314331055,
0.007133064325898886,
-0.028321964666247368,
-0.10049892216920853,
0.05007503926753998,
-0.021316543221473694,
-0.054890405386686325,
0.15792210400104523,
0.025498466566205025,
0.0036312860902398825,
-0.004057582002133131,
0.22541531920433044,
-0.06593148410320282,
-0.11534882336854935,
-0.15084826946258545,
0.06762722134590149,
-0.05195125192403793,
0.04059460759162903,
0.05531981587409973,
-0.10995841026306152,
0.040739014744758606,
0.14713819324970245,
0.1669650375843048,
-0.026372702792286873,
0.017015540972352028,
0.03309843689203262,
0.006621187087148428,
-0.01862250454723835,
0.03595472127199173,
0.04973356053233147,
0.14407676458358765,
-0.06881654262542725,
0.08028544485569,
0.012573251500725746,
-0.07790334522724152,
-0.025424117222428322,
0.11906008422374725,
-0.019904734566807747,
0.001412217621691525,
-0.05872270092368126,
0.1302832067012787,
-0.07167374342679977,
-0.21689969301223755,
0.04775147885084152,
-0.07304432988166809,
-0.1357172727584839,
-0.026369044557213783,
0.03308054804801941,
-0.00429290859028697,
0.017552554607391357,
0.07892777770757675,
-0.07011722773313522,
0.1696552336215973,
0.03946493938565254,
-0.06814303994178772,
-0.04598831385374069,
0.07775052636861801,
-0.08096214383840561,
0.29606664180755615,
0.01717071607708931,
0.05390579253435135,
0.11201871186494827,
-0.01776694692671299,
-0.11734174191951752,
0.029342716559767723,
0.10875631868839264,
-0.07185864448547363,
0.05867310240864754,
0.1638663411140442,
-0.002883994486182928,
0.13492751121520996,
0.07868283987045288,
-0.07584410905838013,
0.05102599412202835,
-0.06599061191082001,
-0.06260241568088531,
-0.11444854736328125,
0.10380300134420395,
-0.09854346513748169,
0.14523425698280334,
0.1315329670906067,
-0.05788888782262802,
0.010853147134184837,
-0.030719053000211716,
0.0496695376932621,
0.0015346633736044168,
0.12824365496635437,
0.007991141639649868,
-0.18989641964435577,
0.03672372177243233,
-0.02254098653793335,
0.10594062507152557,
-0.18076087534427643,
-0.08421733975410461,
0.035050418227910995,
-0.0003886220511049032,
-0.0847720354795456,
0.12202786654233932,
0.06375595182180405,
0.025433989241719246,
-0.037768058478832245,
-0.049146246165037155,
-0.014416635036468506,
0.14787836372852325,
-0.10264741629362106,
-0.005174257792532444
] |
null | null | timm | # Model card for poolformerv2_s24.st_safebooru_1k
## Model Details
- **metrics:**
|Precision|Recall|F1-score|
|-|-|-|
|0.7852106772815528|0.4576937472756056|0.554791361096754|
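As a hedged sketch, the checkpoint can presumably be loaded straight from the Hub with `timm`; the repository id comes from this record, the image path is a placeholder, and the sigmoid at the end assumes a multi-label tagging head (swap it for a softmax if the head is single-label).
```python
# Sketch: loading this checkpoint from the Hugging Face Hub with timm.
import timm
import torch
from PIL import Image

model = timm.create_model(
    "hf_hub:STomoya/poolformerv2_s24.st_safebooru_1k", pretrained=True
)
model.eval()

# Build the evaluation transform from the model's own data config.
config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**config, is_training=False)

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
with torch.no_grad():
    logits = model(transform(image).unsqueeze(0))
scores = logits.sigmoid()  # assumed multi-label tagging; use softmax if single-label
```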
| {"license": "apache-2.0", "library_name": "timm", "tags": ["image-classification", "timm"]} | image-classification | STomoya/poolformerv2_s24.st_safebooru_1k | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"license:apache-2.0",
"region:us"
] | 2024-02-12T10:46:54+00:00 | [] | [] | TAGS
#timm #pytorch #safetensors #image-classification #license-apache-2.0 #region-us
| # Model card for poolformerv2_s24.st_safebooru_1k
## Model Details
- metrics:
|Precision|Recall|F1-score|
|-|-|-|
|0.7852106772815528|0.4576937472756056|0.554791361096754|
| [
"# Model card for poolformerv2_s24.st_safebooru_1k",
"## Model Details\n- metrics: \n|Precision|Recall|F1-score|\n|-|-|-|\n|0.7852106772815528|0.4576937472756056|0.554791361096754|"
] | [
"TAGS\n#timm #pytorch #safetensors #image-classification #license-apache-2.0 #region-us \n",
"# Model card for poolformerv2_s24.st_safebooru_1k",
"## Model Details\n- metrics: \n|Precision|Recall|F1-score|\n|-|-|-|\n|0.7852106772815528|0.4576937472756056|0.554791361096754|"
] | [
31,
18,
56
] | [
"passage: TAGS\n#timm #pytorch #safetensors #image-classification #license-apache-2.0 #region-us \n# Model card for poolformerv2_s24.st_safebooru_1k## Model Details\n- metrics: \n|Precision|Recall|F1-score|\n|-|-|-|\n|0.7852106772815528|0.4576937472756056|0.554791361096754|"
] | [
-0.19162461161613464,
0.05401986837387085,
-0.0051389774307608604,
0.028216367587447166,
0.06668481975793839,
0.022597335278987885,
0.10859569162130356,
0.0467495433986187,
-0.0685463398694992,
0.05242907255887985,
0.13700830936431885,
0.10074834525585175,
0.10359327495098114,
0.07632642239332199,
-0.00018766142602544278,
-0.11636523902416229,
0.10438385605812073,
0.03042171522974968,
0.046899933367967606,
0.1518605351448059,
0.06181309372186661,
-0.1292247623205185,
0.11717643588781357,
0.0013515407918021083,
-0.025603342801332474,
0.01442110724747181,
0.05534762516617775,
-0.10914283245801926,
0.08057276159524918,
-0.04438520222902298,
0.04565344750881195,
0.11924763023853302,
0.09078644216060638,
-0.07177888602018356,
0.03476766496896744,
0.04540528357028961,
-0.06002775952219963,
0.0824047103524208,
0.0770210474729538,
-0.08245816826820374,
0.058819763362407684,
-0.02799837663769722,
-0.031046677380800247,
0.023563440889120102,
-0.07139871269464493,
-0.20218853652477264,
-0.08387169241905212,
0.13090914487838745,
0.0948837399482727,
0.012845310382544994,
0.02109874226152897,
0.23480373620986938,
-0.0897066742181778,
0.04355659335851669,
0.12592239677906036,
-0.28936055302619934,
-0.024725185707211494,
0.13789381086826324,
-0.058721549808979034,
-0.01922169141471386,
0.0010976644698530436,
0.0627298504114151,
0.055957578122615814,
0.04220004379749298,
0.054353829473257065,
-0.0008648861548863351,
0.033326711505651474,
0.012147048488259315,
-0.06325411051511765,
-0.14431717991828918,
0.15122012794017792,
0.08102228492498398,
-0.08380477875471115,
-0.030418824404478073,
-0.08245296776294708,
0.0018315439810976386,
-0.0927169919013977,
0.02537413500249386,
0.0469813272356987,
-0.024329375475645065,
-0.07889953255653381,
0.0628848597407341,
-0.154127836227417,
-0.11469423025846481,
-0.0907546728849411,
0.008151871152222157,
0.03753562271595001,
0.15669305622577667,
-0.13143660128116608,
0.09846706688404083,
0.02190869115293026,
-0.06816346943378448,
0.033900000154972076,
-0.0710732638835907,
0.11434994637966156,
0.0850534662604332,
0.08171899616718292,
-0.057160187512636185,
0.07466761767864227,
0.048244111239910126,
-0.0000071468680289399344,
0.053886186331510544,
0.08486349880695343,
0.08820496499538422,
0.006433082278817892,
-0.07614821195602417,
-0.09891609102487564,
0.10514333099126816,
0.01586890034377575,
0.049794066697359085,
0.057010259479284286,
-0.03420301154255867,
0.002849154407158494,
-0.021934721618890762,
0.16082394123077393,
0.021567026153206825,
0.013495308347046375,
-0.006552251521497965,
0.018762215971946716,
0.0013093199813738465,
0.22849740087985992,
-0.02853616140782833,
-0.023353461176156998,
0.054899927228689194,
-0.03465114161372185,
0.026271117851138115,
0.06940703094005585,
-0.06565143167972565,
-0.08403646945953369,
0.13227635622024536,
-0.07046759128570557,
0.011779122985899448,
-0.016580933704972267,
-0.014151105657219887,
0.0186370350420475,
-0.03886199742555618,
0.014703446999192238,
-0.1310088187456131,
-0.10931549221277237,
0.04596502706408501,
0.020052313804626465,
-0.00784415565431118,
0.016963228583335876,
0.08667165040969849,
0.005884015467017889,
-0.03347114101052284,
-0.01926524192094803,
0.001700033899396658,
-0.06770721822977066,
0.07375075668096542,
-0.003616650588810444,
0.04385603591799736,
-0.12677188217639923,
-0.013907527551054955,
-0.13687464594841003,
0.07157187163829803,
-0.1172683984041214,
-0.0987841859459877,
-0.07682064175605774,
0.014791545458137989,
-0.08316685259342194,
-0.10248317569494247,
-0.024172548204660416,
0.019338300451636314,
0.017748810350894928,
0.14002572000026703,
-0.14046010375022888,
-0.019363922998309135,
0.2387801557779312,
-0.21925072371959686,
-0.12078064680099487,
-0.0051073734648525715,
0.014777230098843575,
-0.14910216629505157,
0.04978698492050171,
0.18328072130680084,
0.0358322374522686,
-0.11302588880062103,
-0.0174565427005291,
0.027038535103201866,
-0.11373085528612137,
-0.17661337554454803,
0.08664320409297943,
0.012675839476287365,
-0.11987241357564926,
0.0626278966665268,
-0.04735710844397545,
-0.03340136259794235,
-0.07851008325815201,
-0.06792861968278885,
-0.05393072962760925,
-0.05685589462518692,
0.03252201899886131,
0.030272657051682472,
0.03358880802989006,
-0.0890326276421547,
0.01973847672343254,
0.03314337134361267,
0.08410904556512833,
0.0037269224412739277,
-0.055153414607048035,
-0.12003335356712341,
0.22875258326530457,
-0.05126701667904854,
-0.03605262190103531,
-0.08832162618637085,
0.029791833832859993,
-0.03506658598780632,
-0.06290385872125626,
0.002971273148432374,
-0.052915800362825394,
0.1313403844833374,
0.09850262105464935,
-0.019099876284599304,
-0.024423224851489067,
0.067012719810009,
0.01797359809279442,
0.06130196526646614,
-0.102010078728199,
-0.05006716400384903,
0.03694942221045494,
0.122688889503479,
-0.13723872601985931,
0.007031470071524382,
0.1066678836941719,
0.10615409165620804,
-0.0021246285177767277,
-0.03451748192310333,
0.042611751705408096,
-0.07575833797454834,
0.05984184518456459,
-0.0022719160187989473,
0.07562170177698135,
0.03140414133667946,
0.024363400414586067,
0.09018946439027786,
-0.11811628937721252,
0.2435525506734848,
0.22469286620616913,
-0.051835350692272186,
0.0018756130011752248,
-0.011976572684943676,
-0.01451928075402975,
-0.003199512604624033,
0.012388276867568493,
0.12185394763946533,
-0.09808153659105301,
-0.03739497438073158,
0.11508718132972717,
-0.12035709619522095,
0.03127843514084816,
0.13305321335792542,
-0.046216707676649094,
-0.0015514056431129575,
0.10920403152704239,
0.14935804903507233,
-0.08184254169464111,
0.08167994767427444,
0.1632615029811859,
0.03775830194354057,
-0.0017110054614022374,
-0.08644882589578629,
-0.09011968970298767,
-0.001162434695288539,
0.021927619352936745,
-0.055264730006456375,
0.2912483811378479,
0.02333223819732666,
0.06225544214248657,
0.07290313392877579,
-0.008541527204215527,
0.03807704150676727,
-0.19111581146717072,
-0.036223798990249634,
0.039759378880262375,
-0.029115799814462662,
-0.17543230950832367,
-0.019779350608587265,
0.037061117589473724,
0.12533731758594513,
-0.06428397446870804,
-0.025964777916669846,
0.020525895059108734,
0.030931204557418823,
-0.1098894327878952,
0.20950670540332794,
-0.05020380765199661,
-0.1918526440858841,
-0.10550840198993683,
0.18446052074432373,
0.005588619504123926,
-0.05653280019760132,
0.00005365379911381751,
-0.0609176941215992,
0.011029510758817196,
-0.0850982666015625,
-0.16680465638637543,
-0.011567821726202965,
0.008994652889668941,
-0.040447626262903214,
0.03750232607126236,
0.1068466454744339,
-0.07437318563461304,
-0.028077101334929466,
-0.07054248452186584,
-0.0006504836492240429,
0.16604308784008026,
0.043114904314279556,
0.06681304425001144,
0.07845095545053482,
-0.09633588045835495,
0.04185473546385765,
0.006746948696672916,
0.189542755484581,
0.008122491650283337,
0.017487535253167152,
0.21421337127685547,
0.04289612919092178,
0.08223816007375717,
0.039889488369226456,
0.05265340209007263,
-0.01915079914033413,
-0.1048535704612732,
-0.0019326993497088552,
-0.09879429638385773,
-0.16483265161514282,
-0.08657105267047882,
0.05614485591650009,
0.08082792907953262,
0.05035710707306862,
0.08586879074573517,
0.030443720519542694,
0.16917537152767181,
-0.020365824922919273,
-0.0924893468618393,
-0.052547041326761246,
0.005214286502450705,
-0.03584849461913109,
0.014578986912965775,
0.06801057606935501,
-0.12797224521636963,
-0.030517341569066048,
0.12243150919675827,
-0.03736152499914169,
0.16587013006210327,
0.0381605438888073,
-0.12620534002780914,
0.06954962760210037,
0.23104144632816315,
0.06917168200016022,
0.12679122388362885,
-0.0017463340191170573,
-0.07094788551330566,
-0.01274280808866024,
-0.08704402297735214,
-0.04915158450603485,
-0.01621498353779316,
-0.06431485712528229,
0.016406161710619926,
-0.06531519442796707,
0.11732937395572662,
0.05338235944509506,
0.11543390154838562,
0.09801049530506134,
-0.3047509491443634,
-0.020397648215293884,
0.006330815143883228,
0.09520021080970764,
-0.09348028153181076,
0.06129937246441841,
0.0632811039686203,
-0.05344061553478241,
0.06632795184850693,
-0.028028085827827454,
0.05560234934091568,
-0.054075635969638824,
0.0025733192451298237,
-0.06586913764476776,
-0.059407755732536316,
0.017259646207094193,
0.06981579959392548,
-0.05883049592375755,
0.2612907588481903,
0.0019357516430318356,
-0.01429539080709219,
-0.0788300633430481,
-0.05305657908320427,
-0.002986334031447768,
0.0763387456536293,
0.19987840950489044,
0.07892607152462006,
-0.1999407261610031,
-0.0854984000325203,
-0.15990617871284485,
0.052832990884780884,
-0.0032242813613265753,
0.06998895108699799,
-0.02539546974003315,
0.03587285429239273,
-0.005088818725198507,
-0.03943543881177902,
-0.09638466686010361,
-0.1431894153356552,
0.04273064434528351,
0.005050354637205601,
-0.04443451017141342,
-0.07664692401885986,
-0.09707899391651154,
-0.04087844863533974,
-0.019033418968319893,
0.04467722028493881,
-0.06828010082244873,
-0.07959228754043579,
-0.05053109675645828,
-0.08053164929151535,
0.025275230407714844,
-0.058660827577114105,
0.06168178468942642,
-0.038680996745824814,
0.02730540931224823,
-0.018727049231529236,
-0.1774275004863739,
0.10502863675355911,
-0.07415661960840225,
-0.06870885193347931,
-0.009527958929538727,
0.15846164524555206,
-0.07363098114728928,
0.013196254149079323,
0.041775938123464584,
0.01698663830757141,
0.010332540608942509,
-0.0931302085518837,
0.16266389191150665,
-0.12515494227409363,
0.09974321722984314,
-0.02446829341351986,
0.1501370668411255,
-0.23281171917915344,
0.016470400616526604,
0.030681032687425613,
0.11492259800434113,
0.29329878091812134,
-0.07607947289943695,
0.052271995693445206,
0.16514895856380463,
0.03567786514759064,
-0.22178815305233002,
-0.06469511240720749,
-0.17685963213443756,
-0.0464850552380085,
0.06577210128307343,
-0.08624298870563507,
0.10859949141740799,
0.13034461438655853,
-0.09291663020849228,
0.130126953125,
-0.25663653016090393,
-0.09182809293270111,
0.18692342936992645,
0.1817491501569748,
0.17369623482227325,
-0.17901985347270966,
-0.08868290483951569,
-0.02554197795689106,
-0.25089848041534424,
0.10421627759933472,
-0.21506808698177338,
-0.05100418999791145,
-0.055385321378707886,
0.04392395168542862,
0.001847827690653503,
-0.09750484675168991,
0.1597655862569809,
-0.09533227235078812,
0.1300286054611206,
-0.10448838025331497,
-0.07961612939834595,
0.12093084305524826,
-0.0019903634674847126,
0.10618820786476135,
-0.09838922321796417,
0.12104028463363647,
-0.1496385782957077,
0.05481640249490738,
-0.08034578710794449,
-0.016844695433974266,
-0.04299848899245262,
-0.027260292321443558,
-0.043931376188993454,
-0.023959826678037643,
-0.09572026878595352,
-0.03235182166099548,
-0.014685218222439289,
0.01566089130938053,
0.006877104751765728,
0.14238020777702332,
-0.024107111617922783,
-0.07925353199243546,
-0.15795834362506866,
-0.10381795465946198,
-0.06207316741347313,
0.07924696058034897,
-0.19951753318309784,
0.027696965262293816,
0.07320985943078995,
0.05042756721377373,
0.023106999695301056,
0.0038655290845781565,
-0.052095405757427216,
-0.0004176814400125295,
0.13494126498699188,
-0.2649711072444916,
-0.10347726196050644,
0.017990022897720337,
0.17880623042583466,
0.08539136499166489,
0.09331360459327698,
0.00899522565305233,
-0.07835616916418076,
-0.01578838750720024,
-0.04453396797180176,
0.03414163365960121,
-0.029476827010512352,
0.06918986141681671,
0.11082740128040314,
0.0879490002989769,
-0.14922954142093658,
0.09181958436965942,
-0.03245687857270241,
-0.046901602298021317,
-0.028997527435421944,
0.012961586005985737,
-0.10883080959320068,
-0.06857746094465256,
0.047095879912376404,
0.051466286182403564,
0.030437486246228218,
-0.15615664422512054,
-0.10944069176912308,
-0.1808810979127884,
0.029335511848330498,
0.011759469285607338,
0.0723857581615448,
0.04231929033994675,
0.11529434472322464,
-0.08632231503725052,
-0.014496218413114548,
-0.03769569844007492,
0.027786415070295334,
0.0005689023528248072,
-0.15479563176631927,
0.04437897354364395,
0.04055936634540558,
0.06830831617116928,
-0.04553309828042984,
-0.004686982836574316,
-0.051682330667972565,
0.06438219547271729,
-0.22608649730682373,
0.11872237175703049,
-0.11757856607437134,
-0.007116916589438915,
0.001994169782847166,
0.006954042706638575,
-0.056042589247226715,
0.024762311950325966,
-0.08415375649929047,
0.04787361994385719,
0.037048131227493286,
0.01823439635336399,
-0.14164094626903534,
0.03229376673698425,
0.03681230545043945,
-0.0045806607231497765,
0.011234038509428501,
0.11701700091362,
0.020123543217778206,
0.04263764247298241,
-0.02305475063621998,
-0.11882118880748749,
0.09909138083457947,
-0.0003849710919894278,
-0.04400942847132683,
0.06397292017936707,
0.025227278470993042,
0.06267452985048294,
-0.07223059982061386,
0.018058788031339645,
0.049036990851163864,
-0.13261251151561737,
-0.030976293608546257,
-0.034391406923532486,
-0.044569652527570724,
-0.027605745941400528,
-0.01739603839814663,
0.15151962637901306,
-0.04492831230163574,
0.09335550665855408,
-0.051525216549634933,
-0.017413245514035225,
-0.2466525286436081,
-0.010343029163777828,
-0.029093807563185692,
-0.11829819530248642,
-0.19865961372852325,
0.03727028891444206,
0.01736409030854702,
-0.05554218590259552,
0.09228284657001495,
0.0077382586896419525,
-0.1274677813053131,
0.011305815540254116,
0.12408555299043655,
0.014706399291753769,
0.0362694151699543,
0.1994854211807251,
0.024455472826957703,
-0.009135995991528034,
-0.035667624324560165,
0.028101017698645592,
-0.023294979706406593,
-0.0654197484254837,
0.051355618983507156,
0.15885493159294128,
-0.04180506616830826,
0.0805189236998558,
0.10336911678314209,
0.016740893945097923,
-0.09704095870256424,
0.035276204347610474,
-0.10749773681163788,
0.09863459318876266,
0.025836508721113205,
0.0653817430138588,
0.22852586209774017,
-0.037829965353012085,
-0.022130461409687996,
-0.053694434463977814,
-0.03666786849498749,
-0.0928676575422287,
-0.16143798828125,
-0.06946875900030136,
-0.10250293463468552,
0.06384208798408508,
-0.009016227908432484,
-0.09192581474781036,
0.22976991534233093,
0.04232329502701759,
-0.03606525808572769,
0.107514388859272,
0.018566910177469254,
0.033582501113414764,
-0.01207150612026453,
0.006966689135879278,
-0.06402821838855743,
0.06834494322538376,
-0.03691757842898369,
0.10475268959999084,
-0.0257993433624506,
0.016108669340610504,
-0.031078020110726357,
0.021343136206269264,
0.0814577266573906,
0.015545010566711426,
-0.08978728950023651,
-0.0514710396528244,
-0.022237388417124748,
0.028288941830396652,
0.09324771165847778,
-0.013758230023086071,
0.06521403789520264,
-0.01053242664784193,
0.1087961420416832,
0.013083850033581257,
-0.11196930706501007,
-0.03211633861064911,
0.1493082195520401,
-0.08730842918157578,
0.055682338774204254,
0.07105442136526108,
-0.05888902023434639,
0.0363197885453701,
0.17756204307079315,
0.2805653512477875,
0.006991439498960972,
0.019507333636283875,
-0.03232155740261078,
0.007543186191469431,
-0.021610818803310394,
0.07624596357345581,
-0.02559349313378334,
0.06719908118247986,
-0.12383943796157837,
0.0013984572142362595,
-0.12503762543201447,
-0.04319217428565025,
-0.029933331534266472,
0.08896242827177048,
0.030885614454746246,
-0.0061273775063455105,
-0.07757273316383362,
0.10771488398313522,
-0.035753920674324036,
0.11519373953342438,
0.11624393612146378,
-0.06673310697078705,
-0.08238302916288376,
0.001115369494073093,
0.07478664070367813,
0.10363073647022247,
-0.0085496436804533,
-0.12658365070819855,
0.03796783462166786,
-0.0704776868224144,
0.04638715088367462,
-0.25795090198516846,
-0.081030935049057,
0.024716651067137718,
0.010080545209348202,
0.30212152004241943,
0.028276557102799416,
0.12362852692604065,
0.04046988859772682,
0.013188162818551064,
-0.10469666123390198,
0.07235914468765259,
0.010770771652460098,
-0.04511586204171181,
-0.047412365674972534,
-0.0017347023822367191,
0.020890740677714348,
-0.09425175935029984,
0.00854512583464384,
-0.05087527632713318,
-0.07556696981191635,
-0.03779007866978645,
-0.04191488027572632,
-0.020645746961236,
0.07654829323291779,
-0.01995043084025383,
0.03363480418920517,
0.027633437886834145,
-0.04083088040351868,
-0.027343032881617546,
-0.07419680804014206,
0.04837646335363388,
0.03012542426586151,
-0.2573736608028412,
0.04486091062426567,
0.006813512649387121,
0.0267526526004076,
-0.04015251621603966,
0.02687256596982479,
-0.2117319405078888,
-0.011218980886042118,
-0.06342492252588272,
-0.05443968251347542,
-0.053118858486413956,
0.04509853944182396,
0.17066511511802673,
0.08152160793542862,
-0.04107584431767464,
0.12738476693630219,
-0.012165366671979427,
0.08641704171895981,
0.00597490556538105,
-0.11024333536624908
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model_category_dataset_hate_speech_epoch_lr_1e5
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0173
- Accuracy: 0.9964
- Macro F1: 0.9964
- Micro F1: 0.9964
- Weighted F1: 0.9964
- Precision: 0.9962
- Recall: 0.9966
## Model description
More information needed
## Intended uses & limitations
More information needed
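Although no official usage guidance is published, the checkpoint can be loaded with the standard Transformers `text-classification` pipeline. The snippet below is a minimal sketch based only on the model id in this repository; the label names it returns depend on the undocumented training data.
```python
from transformers import pipeline

# Hypothetical quick-start; the repository does not document official usage.
classifier = pipeline(
    "text-classification",
    model="lazyghost/roberta_base-hate_speech_classifier-v2",
)
print(classifier("I really enjoyed this article."))
```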
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
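For reference, these settings correspond roughly to the following Hugging Face `TrainingArguments`; this is a sketch only, since the actual training script is not published, and the Adam betas/epsilon listed above are the optimizer defaults.
```python
from transformers import TrainingArguments

# Approximate reconstruction of the reported hyperparameters; treat as illustrative.
training_args = TrainingArguments(
    output_dir="model_category_dataset_hate_speech_epoch_lr_1e5",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    seed=42,
)
```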
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1 | Micro F1 | Weighted F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|:-----------:|:---------:|:------:|
| No log | 1.0 | 280 | 0.0696 | 0.9857 | 0.9856 | 0.9857 | 0.9857 | 0.9847 | 0.9869 |
| 0.0948 | 2.0 | 560 | 0.0500 | 0.9893 | 0.9892 | 0.9893 | 0.9893 | 0.9885 | 0.9902 |
| 0.0948 | 3.0 | 840 | 0.0525 | 0.9893 | 0.9892 | 0.9893 | 0.9893 | 0.9885 | 0.9902 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "metrics": ["accuracy", "precision", "recall"], "base_model": "roberta-base", "model-index": [{"name": "model_category_dataset_hate_speech_epoch_lr_1e5", "results": []}]} | text-classification | lazyghost/roberta_base-hate_speech_classifier-v2 | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T10:49:43+00:00 | [] | [] | TAGS
#transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us
| model\_category\_dataset\_hate\_speech\_epoch\_lr\_1e5
======================================================
This model is a fine-tuned version of roberta-base on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0173
* Accuracy: 0.9964
* Macro F1: 0.9964
* Micro F1: 0.9964
* Weighted F1: 0.9964
* Precision: 0.9962
* Recall: 0.9966
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
59,
98,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-roberta-base #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.092381551861763,
0.06047872453927994,
-0.0010544151300564408,
0.10592140257358551,
0.18873703479766846,
0.012978114187717438,
0.1483236849308014,
0.0937480553984642,
-0.11339004337787628,
0.0313447043299675,
0.13257382810115814,
0.15593665838241577,
-0.005467204377055168,
0.1657353788614273,
-0.08557246625423431,
-0.2335340529680252,
0.020338937640190125,
0.014239036478102207,
-0.06427526473999023,
0.12089532613754272,
0.10727120190858841,
-0.13638262450695038,
0.09291521459817886,
-0.023716220632195473,
-0.2168537974357605,
0.015895532444119453,
0.05065561458468437,
-0.06921587139368057,
0.14828196167945862,
0.03071230836212635,
0.13987155258655548,
0.027647966518998146,
0.10054916888475418,
-0.18825538456439972,
0.013754685409367085,
0.049528077244758606,
0.0028111564461141825,
0.0765194371342659,
0.031279318034648895,
-0.03131335973739624,
0.07296251505613327,
-0.08376210927963257,
0.06841788440942764,
0.030373215675354004,
-0.15089617669582367,
-0.20759177207946777,
-0.07553703337907791,
0.013267839327454567,
0.09016851335763931,
0.081574946641922,
-0.012500735931098461,
0.15025505423545837,
-0.08764190971851349,
0.0908028855919838,
0.19625499844551086,
-0.2989821135997772,
-0.06664401292800903,
0.06210016831755638,
0.011673493310809135,
0.0848226547241211,
-0.1182076707482338,
-0.0069083948619663715,
0.08084401488304138,
0.02440888062119484,
0.1282949298620224,
-0.035279531031847,
-0.08030194789171219,
0.006729864981025457,
-0.1367698609828949,
-0.01200588047504425,
0.1422627717256546,
0.04712622985243797,
-0.059312403202056885,
-0.03270222619175911,
-0.04564055800437927,
-0.12586337327957153,
-0.0476810559630394,
-0.02215772308409214,
0.04199445620179176,
-0.030711648985743523,
-0.10418875515460968,
0.016531463712453842,
-0.10267066955566406,
-0.08783673495054245,
-0.058315541595220566,
0.18650639057159424,
0.03058754652738571,
-0.0008988038171082735,
-0.013511309400200844,
0.10159966349601746,
-0.038036368787288666,
-0.12872204184532166,
0.012578468769788742,
0.012569787912070751,
-0.0029223733581602573,
-0.08370090276002884,
-0.06742912530899048,
-0.044711410999298096,
0.03130094334483147,
0.1582278162240982,
-0.07176109403371811,
0.035627465695142746,
0.013017273508012295,
0.028492195531725883,
-0.1019584909081459,
0.1659662425518036,
-0.035572659224271774,
-0.032684363424777985,
0.0327644981443882,
0.06151280924677849,
0.018729574978351593,
0.008579743094742298,
-0.11420799791812897,
0.004153730347752571,
0.12329330295324326,
0.027214596047997475,
-0.084039106965065,
0.09026874601840973,
-0.04339620843529701,
0.014271375723183155,
0.01919121854007244,
-0.08808189630508423,
0.020925316959619522,
-0.007085546385496855,
-0.06432844698429108,
-0.0801149234175682,
0.025921223685145378,
0.022943252697587013,
0.029354864731431007,
0.09658917784690857,
-0.09228929132223129,
0.01443520002067089,
-0.08659990131855011,
-0.12088895589113235,
-0.006995183881372213,
-0.052513524889945984,
0.04000655934214592,
-0.13182279467582703,
-0.1721567064523697,
-0.02378457598388195,
0.03440599888563156,
-0.025374112650752068,
-0.017600884661078453,
-0.06155481934547424,
-0.07916594296693802,
0.004819025751203299,
-0.016181595623493195,
0.08627869933843613,
-0.06110350042581558,
0.11186845600605011,
0.07316909730434418,
0.07348894327878952,
-0.05594140291213989,
0.03178340196609497,
-0.12503625452518463,
0.014607726596295834,
-0.21658508479595184,
0.03297561779618263,
-0.052344437688589096,
0.09295260906219482,
-0.07411053776741028,
-0.08280935138463974,
0.013655479066073895,
0.016260070726275444,
0.07238484174013138,
0.11079825460910797,
-0.13157276809215546,
-0.05948614701628685,
0.18214990198612213,
-0.10173720121383667,
-0.1384374499320984,
0.1020871251821518,
-0.07092448323965073,
0.08696521073579788,
0.08696765452623367,
0.17139509320259094,
0.0735217034816742,
-0.07229156047105789,
0.026211794465780258,
-0.03126838430762291,
0.03281598165631294,
-0.04801645874977112,
0.055767722427845,
0.01054431963711977,
-0.058817531913518906,
0.030282316729426384,
-0.045311179012060165,
0.06349606812000275,
-0.10451740026473999,
-0.07583882659673691,
-0.03821386769413948,
-0.12448793649673462,
0.06770677864551544,
0.05299674719572067,
0.07821168005466461,
-0.13221615552902222,
-0.05992609262466431,
0.0878153070807457,
0.07408858090639114,
-0.05314069986343384,
0.002824426395818591,
-0.06491082161664963,
0.06146363168954849,
-0.061621956527233124,
-0.030699769034981728,
-0.16591478884220123,
-0.04799981042742729,
0.001924030832014978,
0.05133160948753357,
0.022375652566552162,
-0.012294303625822067,
0.0804278776049614,
0.0856572836637497,
-0.07620507478713989,
-0.03892482444643974,
0.011332866735756397,
0.02148566208779812,
-0.12133258581161499,
-0.1942521333694458,
0.00225229118950665,
-0.033127959817647934,
0.14311659336090088,
-0.23632802069187164,
0.04880264028906822,
-0.03874064236879349,
0.0766478031873703,
0.030785366892814636,
0.0009226268157362938,
-0.0382414348423481,
0.08269158005714417,
-0.03774942830204964,
-0.052751973271369934,
0.05283449962735176,
0.006281575188040733,
-0.074070505797863,
-0.03472051024436951,
-0.14659537374973297,
0.21197699010372162,
0.13235415518283844,
-0.09628359973430634,
-0.12634684145450592,
-0.008856969885528088,
-0.03577379137277603,
-0.013672396540641785,
-0.05776194483041763,
0.028022266924381256,
0.1096683219075203,
-0.03450678661465645,
0.15660028159618378,
-0.06872148811817169,
-0.019608158618211746,
0.023311041295528412,
-0.07532190531492233,
0.034910157322883606,
0.09518931806087494,
0.06367484480142593,
-0.1393534541130066,
0.15202701091766357,
0.1474621742963791,
-0.08561134338378906,
0.13196296989917755,
-0.0355091355741024,
-0.04309603571891785,
-0.02187211625277996,
0.0025614374317228794,
0.012344993650913239,
0.09947875887155533,
-0.06949470192193985,
-0.002177173737436533,
-0.0006825540913268924,
0.029403597116470337,
0.0009931569220498204,
-0.21359843015670776,
-0.0497257262468338,
0.039108626544475555,
-0.033237800002098083,
-0.0024728167336434126,
-0.02360386773943901,
0.00010712174844229594,
0.10993468761444092,
0.006558367051184177,
-0.08151568472385406,
0.04376085102558136,
0.007769538555294275,
-0.09227374196052551,
0.2285204380750656,
-0.08277715742588043,
-0.10405544191598892,
-0.13066619634628296,
-0.07925070822238922,
-0.03332671523094177,
0.0471966378390789,
0.06718054413795471,
-0.09365876019001007,
-0.032178834080696106,
-0.08235406875610352,
0.006297665182501078,
0.04323204234242439,
0.036614228039979935,
-0.007472877390682697,
0.016817664727568626,
0.08242488652467728,
-0.09752540290355682,
-0.014705220237374306,
-0.04529770463705063,
-0.07759600877761841,
0.06072993576526642,
0.016086166724562645,
0.12265835702419281,
0.1363334208726883,
-0.04710335657000542,
-0.00608679698780179,
-0.043065812438726425,
0.23591481149196625,
-0.05846944451332092,
-0.0305937509983778,
0.12595871090888977,
-0.007557434495538473,
0.03610578551888466,
0.15135398507118225,
0.05167035013437271,
-0.11024627089500427,
0.0454554446041584,
0.01716799847781658,
-0.033831361681222916,
-0.19501729309558868,
-0.03762674704194069,
-0.02471095882356167,
-0.018694434314966202,
0.0882413312792778,
0.01732352375984192,
0.05299997329711914,
0.08637528866529465,
0.03237275779247284,
0.06758582592010498,
-0.008276163600385189,
0.07773943990468979,
0.095127172768116,
0.04971059411764145,
0.1318815052509308,
-0.054004475474357605,
-0.07906830310821533,
0.024121267721056938,
-0.03202052786946297,
0.19283118844032288,
0.02070581167936325,
0.07515519112348557,
0.05221310257911682,
0.15731222927570343,
0.016146626323461533,
0.08213534951210022,
0.020565679296851158,
-0.06458041071891785,
-0.003515984397381544,
-0.0410965234041214,
-0.04627968370914459,
0.03371163457632065,
-0.1019086092710495,
0.06582508981227875,
-0.14060813188552856,
0.004164765123277903,
0.05940879136323929,
0.20835204422473907,
0.05292978882789612,
-0.35229796171188354,
-0.10239150375127792,
0.018741503357887268,
-0.0130431167781353,
-0.03207872435450554,
0.021783985197544098,
0.10786392539739609,
-0.07879304140806198,
0.029310185462236404,
-0.05126671865582466,
0.07338925451040268,
-0.016400840133428574,
0.04697997123003006,
0.026437941938638687,
0.09182374179363251,
-0.027950547635555267,
0.060792021453380585,
-0.30201229453086853,
0.28319647908210754,
0.009108579717576504,
0.08318869024515152,
-0.037632569670677185,
-0.017549913376569748,
0.03335690125823021,
0.10956505686044693,
0.06117071583867073,
-0.023294003680348396,
-0.10120249539613724,
-0.24754884839057922,
-0.019572870805859566,
0.03283126279711723,
0.10812721401453018,
-0.025214308872818947,
0.11565453559160233,
-0.04449596628546715,
0.0054775020107626915,
0.09078055620193481,
-0.04648511856794357,
-0.0905165895819664,
-0.067286916077137,
-0.05158661678433418,
0.02547239325940609,
0.03563244640827179,
-0.08593631535768509,
-0.08665628731250763,
-0.11776568740606308,
0.14577019214630127,
0.008093085139989853,
-0.014957273378968239,
-0.11618923395872116,
0.0795443132519722,
0.0479806587100029,
-0.08206173777580261,
0.05908302217721939,
0.007446529809385538,
0.0714525356888771,
0.030039826408028603,
-0.048975128680467606,
0.12638452649116516,
-0.06540721654891968,
-0.17045052349567413,
-0.06917358189821243,
0.08388159424066544,
0.02223024144768715,
0.042603954672813416,
0.018439147621393204,
0.02037583477795124,
-0.006186238490045071,
-0.08029036968946457,
0.023780565708875656,
-0.02686075121164322,
0.06273507326841354,
0.04139344021677971,
-0.06518486142158508,
-0.025419408455491066,
-0.0630372166633606,
-0.03356530889868736,
0.17548029124736786,
0.29555487632751465,
-0.09492085129022598,
-0.013624889776110649,
0.06084469333291054,
-0.05464564636349678,
-0.21572723984718323,
0.06672883033752441,
0.027339937165379524,
0.005323463119566441,
0.05710809677839279,
-0.1272992491722107,
0.11444072425365448,
0.09680541604757309,
-0.023468798026442528,
0.07451941817998886,
-0.25576144456863403,
-0.13204772770404816,
0.14069928228855133,
0.17430225014686584,
0.1616767942905426,
-0.1585277020931244,
-0.019993916153907776,
-0.048519305884838104,
-0.09914376586675644,
0.08140573650598526,
-0.10782653093338013,
0.10985516756772995,
-0.011118284426629543,
0.05523890256881714,
0.008581588976085186,
-0.05180210992693901,
0.12131574004888535,
-0.012117423117160797,
0.13540978729724884,
-0.07630644738674164,
-0.03080626204609871,
0.04637918621301651,
-0.05303996056318283,
0.01797310635447502,
-0.06713631749153137,
0.04702756181359291,
-0.04026706889271736,
-0.029736386612057686,
-0.06558024138212204,
0.042158812284469604,
-0.027529016137123108,
-0.07375770062208176,
-0.03809201344847679,
0.03530783951282501,
0.02991919219493866,
-0.027221933007240295,
0.11561466753482819,
-0.012879259884357452,
0.19115738570690155,
0.11627160012722015,
0.07781197130680084,
-0.04103038087487221,
0.03032839484512806,
0.01328182965517044,
-0.047154564410448074,
0.04930446669459343,
-0.15427252650260925,
0.04317034035921097,
0.1054539754986763,
0.009270195849239826,
0.1460874378681183,
0.08114267885684967,
-0.01054447889328003,
0.02472209930419922,
0.09022830426692963,
-0.15310771763324738,
-0.07120093703269958,
-0.0047790599055588245,
-0.06129230931401253,
-0.10521068423986435,
0.067497119307518,
0.10812706500291824,
-0.08095341920852661,
-0.0020670220255851746,
-0.03020746074616909,
-0.0016612430335953832,
-0.056661467999219894,
0.19046206772327423,
0.08934132009744644,
0.051866743713617325,
-0.08415161073207855,
0.06358522176742554,
0.04258469492197037,
-0.022921795025467873,
0.006852937396615744,
0.036935970187187195,
-0.09566771984100342,
-0.0445454902946949,
0.0955725759267807,
0.1989901065826416,
-0.07759338617324829,
-0.04347251355648041,
-0.14923249185085297,
-0.13172930479049683,
0.037420060485601425,
0.21380247175693512,
0.10363496840000153,
0.01584753394126892,
-0.007940059527754784,
0.012943125329911709,
-0.14181040227413177,
0.09233700484037399,
0.03311933949589729,
0.09213906526565552,
-0.15185612440109253,
0.1671813726425171,
-0.016705837100744247,
0.0072016045451164246,
-0.034607019275426865,
0.03401315584778786,
-0.13845409452915192,
0.00755518302321434,
-0.14334794878959656,
-0.02718373015522957,
-0.03215400502085686,
0.007962427102029324,
0.008018535561859608,
-0.06904735416173935,
-0.0558335967361927,
0.00687044020742178,
-0.10829007625579834,
-0.009969038888812065,
0.0396086648106575,
0.05923660844564438,
-0.12233839929103851,
-0.042823441326618195,
0.025099441409111023,
-0.05795401334762573,
0.04747616499662399,
0.03119981847703457,
0.024192534387111664,
0.06881918013095856,
-0.1963830441236496,
0.013186932541429996,
0.08052852749824524,
-0.024611782282590866,
0.05518849194049835,
-0.08967428654432297,
-0.002548546064645052,
0.003967683296650648,
0.06815552711486816,
0.03425002843141556,
0.07952282577753067,
-0.12885507941246033,
0.01910793036222458,
-0.03214747831225395,
-0.06326086074113846,
-0.05725174397230148,
0.01244312059134245,
0.0886341780424118,
-0.0227720495313406,
0.1972379833459854,
-0.11482077836990356,
0.013322368264198303,
-0.20564322173595428,
-0.006960438098758459,
-0.025984669104218483,
-0.12255331128835678,
-0.1533028930425644,
-0.06178693845868111,
0.05426926165819168,
-0.03589276969432831,
0.1362089067697525,
0.018255367875099182,
0.0514245331287384,
0.03436824679374695,
-0.04068128019571304,
0.057933077216148376,
0.04275647923350334,
0.24076275527477264,
0.042467668652534485,
-0.05136833339929581,
0.03314929082989693,
0.07142303138971329,
0.11020804196596146,
0.04034837707877159,
0.16499537229537964,
0.16376888751983643,
-0.06879037618637085,
0.0903225988149643,
0.030450917780399323,
-0.03499339148402214,
-0.09839541465044022,
0.022873077541589737,
-0.04773295670747757,
0.042675431817770004,
-0.024956032633781433,
0.16771246492862701,
0.1113038957118988,
-0.1596510410308838,
0.014791124500334263,
-0.06400762498378754,
-0.07509065419435501,
-0.11566317826509476,
-0.012912808917462826,
-0.110990971326828,
-0.16200336813926697,
0.0052018119022250175,
-0.10898660123348236,
-0.00010230737825622782,
0.11000165343284607,
-0.007534196600317955,
-0.012020758353173733,
0.18351922929286957,
0.01937796361744404,
0.04025115445256233,
0.03576052188873291,
-0.01631632074713707,
-0.02836678922176361,
-0.07964594662189484,
-0.08914143592119217,
0.004973202478140593,
-0.0299563966691494,
0.020890504121780396,
-0.05292314291000366,
-0.05877668038010597,
0.04571649059653282,
-0.026820514351129532,
-0.11036552488803864,
0.021868659183382988,
0.04840833693742752,
0.04654969647526741,
0.029218489304184914,
0.014792089350521564,
-0.0017182002775371075,
0.007170905824750662,
0.24693967401981354,
-0.07403948903083801,
-0.10640744119882584,
-0.10616568475961685,
0.29820123314857483,
0.056547973304986954,
0.033216625452041626,
0.004069481045007706,
-0.0933113619685173,
0.05185767635703087,
0.22145943343639374,
0.19306649267673492,
-0.07604090124368668,
0.014328992925584316,
-0.03616703674197197,
-0.018269764259457588,
-0.02274150960147381,
0.11484453827142715,
0.10497981309890747,
-0.008942861109972,
-0.08683338016271591,
-0.026456929743289948,
-0.03683249652385712,
-0.009448667988181114,
-0.030056579038500786,
0.04904748126864433,
0.033162616193294525,
0.019570782780647278,
-0.0548878088593483,
0.07059311121702194,
-0.04761431738734245,
-0.08925081044435501,
0.06434527784585953,
-0.17936936020851135,
-0.1336761713027954,
-0.023530075326561928,
0.07989111542701721,
-0.0005645982455462217,
0.060173023492097855,
-0.03521040827035904,
0.002638244768604636,
0.0378761887550354,
-0.025992408394813538,
-0.05508613958954811,
-0.09195919334888458,
0.058541201055049896,
-0.07114976644515991,
0.20402328670024872,
-0.04107194021344185,
0.0433766208589077,
0.13312745094299316,
0.03628500923514366,
-0.08089794218540192,
0.11609099805355072,
0.03365769982337952,
-0.07791761308908463,
0.04774216189980507,
0.0805850401520729,
-0.04512883722782135,
0.11268968135118484,
0.04416634514927864,
-0.16268542408943176,
0.04228774830698967,
-0.08079247176647186,
-0.07936161011457443,
-0.059155356138944626,
-0.04775198549032211,
-0.042833056300878525,
0.13429853320121765,
0.1902322918176651,
-0.04056421294808388,
0.037104636430740356,
-0.04999326169490814,
0.03572339564561844,
0.07696384936571121,
0.018776675686240196,
-0.05558769032359123,
-0.24966581165790558,
0.02156651020050049,
0.11184259504079819,
-0.02493482641875744,
-0.29701539874076843,
-0.07771316170692444,
-0.02015971578657627,
-0.052419934421777725,
-0.08779453486204147,
0.10121823102235794,
0.11808080971240997,
0.0632300153374672,
-0.059431176632642746,
-0.12960650026798248,
-0.07725346088409424,
0.18311329185962677,
-0.12135174125432968,
-0.11742767691612244
] |
null | null | transformers |
# NeuBeagle-7B
NeuBeagle-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B)
## 🧩 Configuration
```yaml
models:
  - model: eren23/dpo-binarized-NeutrixOmnibe-7B
    # No parameters necessary for base model
  - model: eren23/dpo-binarized-NeuralTrix-7B
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: eren23/dpo-binarized-NeutrixOmnibe-7B
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
## 💻 Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "mlabonne/NeuBeagle-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit"], "base_model": ["eren23/dpo-binarized-NeuralTrix-7B", "eren23/dpo-binarized-NeutrixOmnibe-7B"]} | text-generation | mlabonne/NeuBeagle-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"base_model:eren23/dpo-binarized-NeuralTrix-7B",
"base_model:eren23/dpo-binarized-NeutrixOmnibe-7B",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T10:53:42+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-eren23/dpo-binarized-NeuralTrix-7B #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# NeuBeagle-7B
NeuBeagle-7B is a merge of the following models using LazyMergekit:
* eren23/dpo-binarized-NeuralTrix-7B
## Configuration
## Usage
| [
"# NeuBeagle-7B\n\nNeuBeagle-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeuralTrix-7B",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-eren23/dpo-binarized-NeuralTrix-7B #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# NeuBeagle-7B\n\nNeuBeagle-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeuralTrix-7B",
"## Configuration",
"## Usage"
] | [
115,
45,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #base_model-eren23/dpo-binarized-NeuralTrix-7B #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# NeuBeagle-7B\n\nNeuBeagle-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeuralTrix-7B## Configuration## Usage"
] | [
-0.017640478909015656,
0.06242142245173454,
-0.003834483912214637,
0.00011936493683606386,
0.05675341933965683,
0.06301800161600113,
0.14784225821495056,
0.09245187044143677,
-0.06483421474695206,
0.04778410494327545,
0.07694853842258453,
0.1323629766702652,
-0.013228069059550762,
0.08883947879076004,
-0.036598533391952515,
-0.18370139598846436,
0.1027645617723465,
0.0723060742020607,
0.015181521885097027,
0.03750302642583847,
0.08065912872552872,
-0.01876486837863922,
0.05046335235238075,
-0.005841834004968405,
-0.14288507401943207,
0.031995099037885666,
-0.04299319535493851,
-0.02190665900707245,
0.0724896714091301,
0.061582885682582855,
0.11901053786277771,
0.05682297423481941,
-0.005722406320273876,
-0.13164345920085907,
0.040576282888650894,
0.006486513651907444,
-0.04217172786593437,
0.08083248883485794,
0.039957668632268906,
-0.03453081473708153,
0.10859581083059311,
-0.04211338609457016,
0.017528925091028214,
0.009321845136582851,
-0.10140088200569153,
-0.11030412465333939,
-0.014070531353354454,
0.19131456315517426,
0.013450700789690018,
0.06197705119848251,
-0.03324740752577782,
0.13193416595458984,
-0.016462890431284904,
0.04322478547692299,
0.11090466380119324,
-0.22079508006572723,
-0.02901959791779518,
0.1030498594045639,
-0.0055840108543634415,
-0.02569890394806862,
0.02978387475013733,
0.016673266887664795,
0.05903705582022667,
0.00007697957335039973,
0.051981888711452484,
-0.053523581475019455,
0.09136152267456055,
-0.060329172760248184,
-0.14776894450187683,
0.026386862620711327,
0.06554161757230759,
0.04564991593360901,
0.029688535258173943,
-0.03592715039849281,
-0.11375663429498672,
0.027014119550585747,
-0.06789933890104294,
-0.053459372371435165,
-0.00014682338223792613,
-0.007808296475559473,
0.06438368558883667,
-0.10450570285320282,
-0.01572096347808838,
-0.02713838219642639,
-0.10973028838634491,
0.20617662370204926,
0.04813985899090767,
-0.035752274096012115,
-0.0008391045266762376,
0.0633883997797966,
-0.08970202505588531,
-0.09850119799375534,
0.020800931379199028,
-0.057502273470163345,
0.002882834058254957,
-0.016832852736115456,
-0.07820217311382294,
-0.042587555944919586,
0.09852836281061172,
0.23866786062717438,
0.031999751925468445,
0.01672258786857128,
-0.005349284503608942,
0.07028643786907196,
-0.02576412819325924,
-0.05021025985479355,
-0.15219466388225555,
-0.08291716128587723,
0.12334869801998138,
0.10548076778650284,
0.1387733519077301,
0.0029289722442626953,
-0.15951800346374512,
-0.03327074646949768,
0.07452686131000519,
-0.02744600735604763,
0.03392361104488373,
0.09961967170238495,
-0.061780549585819244,
-0.08428759127855301,
0.2460561990737915,
-0.017649192363023758,
-0.02531685307621956,
0.02971002832055092,
-0.05727376416325569,
0.08131848275661469,
0.02980075590312481,
0.0489925742149353,
-0.035397715866565704,
0.06342332065105438,
-0.0813002809882164,
-0.0519205667078495,
-0.055869944393634796,
-0.09532615542411804,
0.03382999077439308,
-0.01758568547666073,
0.011698984540998936,
-0.1575303077697754,
-0.2301155924797058,
0.023174192756414413,
0.05599496141076088,
-0.05672242492437363,
-0.043168529868125916,
-0.11231105774641037,
-0.02004893310368061,
-0.03411850333213806,
-0.02129337750375271,
-0.09478364884853363,
0.0071976459585130215,
-0.02333982288837433,
0.07182250916957855,
0.03277510777115822,
-0.27994000911712646,
0.014645662158727646,
-0.084433414041996,
0.07764501869678497,
-0.16504016518592834,
0.0707855075597763,
-0.035637106746435165,
0.0320211723446846,
-0.07782497256994247,
-0.03743011876940727,
-0.13282722234725952,
-0.01878095231950283,
0.10164988785982132,
0.08482624590396881,
-0.07390296459197998,
-0.09254524856805801,
0.02258697710931301,
-0.09738525003194809,
-0.1175428107380867,
0.13147571682929993,
-0.029429497197270393,
0.06285414099693298,
0.07032215595245361,
0.25849369168281555,
0.041643742471933365,
0.00371729489415884,
-0.07040900737047195,
-0.03978714719414711,
-0.026481036096811295,
-0.02834242582321167,
0.02791799232363701,
0.07236693054437637,
-0.0703401118516922,
0.03319573774933815,
-0.007220355793833733,
0.09926021844148636,
-0.003184356726706028,
-0.04948347061872482,
-0.009546348825097084,
-0.094634510576725,
0.13986395299434662,
-0.013683684170246124,
0.0626368597149849,
-0.04936136677861214,
-0.04435183107852936,
0.17067022621631622,
0.10817943513393402,
-0.09401461482048035,
0.03508150950074196,
-0.06097197160124779,
0.10784086585044861,
-0.0834503099322319,
0.0718899741768837,
-0.0958920419216156,
-0.14156584441661835,
0.022407393902540207,
-0.07141922414302826,
0.09577734768390656,
-0.06246282160282135,
0.015674784779548645,
0.012053977698087692,
-0.09661274403333664,
-0.0447562150657177,
0.012130453251302242,
0.05645780637860298,
0.03155422583222389,
-0.16774621605873108,
0.008079493418335915,
-0.0333702489733696,
0.25267845392227173,
-0.12794111669063568,
0.060456015169620514,
-0.07883457839488983,
0.2095765322446823,
-0.024508124217391014,
0.03551454469561577,
0.015552840195596218,
0.029299896210432053,
-0.03986275941133499,
0.01683794893324375,
0.11532782763242722,
-0.0018402840942144394,
-0.1593610793352127,
0.05018237605690956,
-0.09086191654205322,
0.06626162678003311,
0.1153053343296051,
-0.024059627205133438,
-0.03217194229364395,
0.001490232185460627,
0.021519161760807037,
-0.09322275221347809,
0.1044725850224495,
-0.08401788771152496,
0.09017181396484375,
-0.01297610905021429,
0.08088318258523941,
-0.10929074883460999,
-0.029028208926320076,
0.013644948601722717,
-0.018289579078555107,
-0.017653757706284523,
0.04762309044599533,
0.016696514561772346,
-0.31079554557800293,
0.1248801201581955,
0.07616502046585083,
-0.1445714682340622,
0.08409225195646286,
0.045499902218580246,
-0.00971439853310585,
-0.044765423983335495,
0.009741622023284435,
0.03079412132501602,
0.03525053709745407,
-0.07315661758184433,
0.05855190381407738,
0.03235471993684769,
-0.010819102637469769,
0.10863975435495377,
-0.06145012006163597,
0.03755347430706024,
-0.041293613612651825,
-0.008529405109584332,
0.08096248656511307,
0.06656583398580551,
-0.021926242858171463,
0.0630815178155899,
0.027779242023825645,
-0.11237599700689316,
0.08738452941179276,
0.01621479168534279,
-0.05419451743364334,
0.1096537709236145,
-0.14430205523967743,
-0.21220913529396057,
-0.14714165031909943,
0.00913397129625082,
-0.13823434710502625,
0.011559753678739071,
0.006353320553898811,
0.06299393624067307,
0.00817433837801218,
-0.08013008534908295,
0.016158021986484528,
0.03656228259205818,
-0.03840739652514458,
0.0015065778279677033,
0.012221106328070164,
0.011747423559427261,
-0.06998821347951889,
0.0068609947338700294,
-0.03204498812556267,
0.04761768504977226,
0.04978564754128456,
-0.1396254003047943,
0.10873952507972717,
0.15695154666900635,
0.016261983662843704,
-0.053659338504076004,
-0.02677842788398266,
0.15594607591629028,
-0.05948919057846069,
0.03783663734793663,
0.09144648909568787,
-0.08605531603097916,
0.03906547278165817,
0.17265871167182922,
0.01587231084704399,
-0.0557083934545517,
0.044544246047735214,
0.012570117600262165,
0.004364245105534792,
-0.15981316566467285,
-0.12496013194322586,
-0.07425373047590256,
0.03574839234352112,
0.0393860787153244,
0.04528992623090744,
0.08133172988891602,
0.04995272308588028,
-0.05809321999549866,
0.016578607261180878,
0.019540179520845413,
0.11178292334079742,
0.3463215231895447,
0.020456591621041298,
0.09694807231426239,
-0.009530458599328995,
-0.11560778319835663,
0.03202179819345474,
0.09775814414024353,
0.038160309195518494,
0.025262827053666115,
0.2295854389667511,
-0.009471489116549492,
0.04672131687402725,
0.08643333613872528,
0.039229098707437515,
-0.03839291259646416,
-0.02332034893333912,
-0.02811598591506481,
-0.06162548065185547,
0.02677115611732006,
0.029633402824401855,
-0.042710911482572556,
0.03006860613822937,
0.039057325571775436,
-0.06419689208269119,
0.10699824243783951,
0.00790120754390955,
0.07042255997657776,
-0.29418715834617615,
-0.03321883827447891,
0.007965716533362865,
0.0446951650083065,
-0.02589964121580124,
-0.03643205016851425,
-0.01127480436116457,
0.009128719568252563,
0.11733565479516983,
-0.019459033384919167,
0.06857974827289581,
-0.06083492189645767,
0.023942388594150543,
-0.06938312202692032,
0.12921260297298431,
0.017987407743930817,
0.08330389857292175,
-0.15235082805156708,
0.03311861306428909,
0.015047059394419193,
0.026085766032338142,
0.04127062112092972,
-0.04417511075735092,
0.021390989422798157,
0.15183867514133453,
0.011312032118439674,
-0.012566572055220604,
0.06584954261779785,
0.05212504416704178,
-0.10867201536893845,
0.006779149174690247,
0.06023462116718292,
-0.002091214759275317,
0.0676833763718605,
0.015136724337935448,
-0.043350234627723694,
0.052087243646383286,
0.09199534356594086,
-0.17569869756698608,
-0.1484120935201645,
0.08098660409450531,
0.1014760211110115,
0.06868430972099304,
-0.061508797109127045,
-0.04846847802400589,
-0.08073411881923676,
0.22004324197769165,
-0.001404129434376955,
-0.03866911306977272,
-0.06384053826332092,
-0.0054442984983325005,
0.11391659826040268,
-0.0628380998969078,
0.04130999743938446,
-0.026525234803557396,
0.05934246629476547,
-0.07087568938732147,
-0.17644739151000977,
0.09332109242677689,
-0.06612907350063324,
-0.08296147733926773,
-0.01921933703124523,
0.060647666454315186,
-0.0020500312093645334,
0.023835821077227592,
0.0043331836350262165,
0.0014728232054039836,
-0.018787382170557976,
-0.04625644534826279,
-0.02701125480234623,
0.0756450667977333,
-0.02204981818795204,
0.0749182403087616,
-0.10586412250995636,
-0.08687075972557068,
-0.05121138319373131,
0.005306839942932129,
0.1590321660041809,
0.2311907261610031,
0.009070320054888725,
-0.014929134398698807,
0.2264963537454605,
-0.058794841170310974,
-0.24797651171684265,
-0.033713292330503464,
0.057610247284173965,
-0.0120519008487463,
-0.062202487140893936,
-0.15266932547092438,
0.04420978203415871,
0.17284779250621796,
-0.005549907684326172,
0.04686664417386055,
-0.2632392346858978,
-0.11257215589284897,
0.1016620323061943,
0.056469518691301346,
0.19857406616210938,
-0.10541311651468277,
-0.022321324795484543,
-0.09924039989709854,
-0.09617839753627777,
0.24261531233787537,
-0.10987696051597595,
0.11816176027059555,
-0.02407076768577099,
-0.013057650998234749,
0.002725545084103942,
-0.009565522894263268,
0.07227423787117004,
-0.032186925411224365,
0.024911491200327873,
-0.008728420361876488,
-0.025898030027747154,
0.1552286148071289,
-0.03061240166425705,
0.049793850630521774,
-0.06101354584097862,
0.030293377116322517,
0.10605283081531525,
-0.06848065555095673,
-0.060007739812135696,
0.0644160658121109,
-0.0027004454750567675,
-0.07840979844331741,
-0.06247059628367424,
0.06499460339546204,
0.013508925214409828,
0.04622005298733711,
0.20870821177959442,
-0.007672967854887247,
0.006548493634909391,
0.16178849339485168,
0.05431964248418808,
-0.054162632673978806,
0.07701349258422852,
0.01326047908514738,
-0.0863950103521347,
0.05316663905978203,
0.019455691799521446,
-0.008549515157938004,
0.1275164633989334,
-0.03710315749049187,
0.0982089713215828,
0.058011334389448166,
-0.06830858439207077,
-0.05737218260765076,
0.09017117321491241,
-0.1404741257429123,
-0.14671817421913147,
-0.034396033734083176,
-0.042351964861154556,
-0.03825327008962631,
0.09403350204229355,
0.19784390926361084,
-0.08583275228738785,
-0.018596166744828224,
0.07427021861076355,
-0.051996756345033646,
-0.08629739284515381,
0.0925593301653862,
0.05997877195477486,
0.002266577910631895,
-0.04426518455147743,
0.06912200897932053,
0.07647114247083664,
-0.08686526864767075,
-0.012120571918785572,
0.08194953203201294,
-0.12433206290006638,
-0.08556186407804489,
-0.12394077330827713,
0.2551034092903137,
-0.10825227946043015,
-0.07492458820343018,
-0.15343087911605835,
-0.0492304228246212,
0.006710386835038662,
0.10198967903852463,
0.09997781366109848,
0.021595938131213188,
-0.014765583910048008,
-0.005669010803103447,
-0.0963224321603775,
0.10682478547096252,
-0.026049092411994934,
0.16022327542304993,
-0.06247791647911072,
-0.08339289575815201,
-0.013519106432795525,
-0.016658248379826546,
-0.07148591428995132,
0.03344196826219559,
-0.18265928328037262,
-0.03824113681912422,
-0.17065709829330444,
-0.017886932939291,
-0.1054212749004364,
-0.021718954667448997,
0.005841918755322695,
0.06401776522397995,
-0.011757493950426579,
0.007132185623049736,
-0.01997140608727932,
-0.03839046508073807,
0.026381393894553185,
0.046083830296993256,
-0.04282093048095703,
-0.024404019117355347,
-0.009614519774913788,
-0.04190237447619438,
0.09556581825017929,
0.028980739414691925,
0.01509598083794117,
0.02459573931992054,
-0.07562050968408585,
-0.051830075681209564,
0.08027325570583344,
0.05488969385623932,
0.015379599295556545,
-0.09692994505167007,
-0.030220860615372658,
0.04952533170580864,
-0.021504966542124748,
-0.022613525390625,
0.08863308280706406,
-0.0804426446557045,
-0.024863261729478836,
-0.09606724232435226,
-0.03197497874498367,
-0.03393089398741722,
-0.0385996513068676,
-0.04660623520612717,
0.10796316713094711,
0.15791352093219757,
-0.05109616741538048,
-0.014227708801627159,
-0.05958569794893265,
0.006020594388246536,
0.013584272935986519,
-0.15416832268238068,
-0.0011059428798034787,
-0.03954258933663368,
-0.008592861704528332,
-0.021951211616396904,
0.14598138630390167,
-0.131796196103096,
-0.18910649418830872,
0.010372383520007133,
-0.07811091840267181,
-0.05183234065771103,
0.026409907266497612,
0.21329768002033234,
0.10180618613958359,
-0.042561642825603485,
-0.10581037402153015,
0.069135382771492,
0.05718580633401871,
0.11769161373376846,
0.06493853032588959,
0.06435928493738174,
0.004658543039113283,
0.10697339475154877,
0.0996493324637413,
0.018429001793265343,
-0.038601722568273544,
0.16463829576969147,
0.01289261132478714,
0.029022205621004105,
0.016887443140149117,
0.1616714745759964,
0.1914568543434143,
-0.0317949578166008,
-0.001795198186300695,
0.03906599432229996,
-0.06896086037158966,
-0.05444180965423584,
-0.16847582161426544,
-0.11966654658317566,
-0.18996936082839966,
-0.02975214086472988,
-0.12046276032924652,
-0.12299493700265884,
-0.023660602048039436,
0.01453592348843813,
-0.048054419457912445,
0.13862237334251404,
0.05718458816409111,
-0.05054442957043648,
0.03215945139527321,
-0.048626311123371124,
-0.05898764356970787,
0.022782035171985626,
-0.07828562706708908,
0.00038137988303788006,
0.03772500529885292,
0.018579432740807533,
-0.01580188237130642,
0.03213074430823326,
0.047060929238796234,
-0.02433875761926174,
-0.07696036249399185,
0.002602023771032691,
0.0651608407497406,
0.014267266727983952,
0.09659385681152344,
0.02259076200425625,
-0.09864030033349991,
0.01362947653979063,
0.07571516931056976,
-0.005651278421282768,
-0.14628276228904724,
-0.0703527107834816,
0.2309149205684662,
-0.0005859136581420898,
0.07022541761398315,
0.012410270981490612,
-0.061790917068719864,
-0.03235746920108795,
0.15112264454364777,
0.2701719403266907,
-0.10456181317567825,
-0.015711382031440735,
0.03545965254306793,
0.007649505976587534,
-0.004061159212142229,
0.029410379007458687,
0.04827946424484253,
0.1278679519891739,
-0.0111624700948596,
-0.027469808235764503,
-0.04938345402479172,
0.005214716773480177,
-0.08202657848596573,
-0.04489609971642494,
0.08879248052835464,
-0.025130409747362137,
-0.03659151494503021,
0.05708370357751846,
-0.20711755752563477,
-0.005105647724121809,
-0.048957113176584244,
-0.07901690900325775,
-0.11814018338918686,
-0.09108112752437592,
0.010378370061516762,
-0.073929563164711,
0.06525126844644547,
-0.06745538115501404,
-0.019919201731681824,
0.009648668579757214,
-0.04526069015264511,
-0.05720844864845276,
-0.009326105006039143,
-0.0006155352457426488,
0.01336041372269392,
0.008847449906170368,
-0.034931425005197525,
0.06863130629062653,
0.11791112273931503,
0.02192859724164009,
-0.09522639960050583,
0.11904741078615189,
-0.007994178682565689,
-0.015714669600129128,
0.06647464632987976,
0.0539059154689312,
-0.024211101233959198,
0.11227288842201233,
0.05752737820148468,
-0.14815862476825714,
0.006738624069839716,
0.04493670538067818,
-0.026129646226763725,
-0.0855085626244545,
0.06219532713294029,
-0.0787840187549591,
0.1191350668668747,
0.10633550584316254,
-0.03409045189619064,
-0.025428513064980507,
0.001459019724279642,
-0.00260436674579978,
0.11837805807590485,
0.11858521401882172,
-0.020340265706181526,
-0.19120146334171295,
-0.047440968453884125,
0.08466159552335739,
0.011304170824587345,
-0.22154513001441956,
-0.04042071849107742,
-0.13065433502197266,
0.019001543521881104,
-0.07945364713668823,
0.04179507866501808,
0.13065171241760254,
0.00375411962158978,
-0.041647862643003464,
-0.13048529624938965,
-0.08151239156723022,
0.031669020652770996,
-0.11220091581344604,
-0.1518123298883438
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
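Since no snippet is provided, here is a minimal sketch. It assumes this repository holds a PEFT adapter for `google/flan-t5-small` (as declared in the metadata); the prompt is purely illustrative.
```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the declared base model, then attach the adapter from this repository.
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")
model = PeftModel.from_pretrained(base_model, "HeydarS/flan-t5-small_peft_v17")
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")

inputs = tokenizer("Question: What is parameter-efficient fine-tuning? Answer:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```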
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.7.2.dev0 | {"library_name": "peft", "base_model": "google/flan-t5-small"} | null | HeydarS/flan-t5-small_peft_v17 | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:google/flan-t5-small",
"region:us"
] | 2024-02-12T10:55:58+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-google/flan-t5-small #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.7.2.dev0 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.7.2.dev0"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-google/flan-t5-small #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.7.2.dev0"
] | [
36,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
14
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-google/flan-t5-small #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.7.2.dev0"
] | [
-0.10678984969854355,
0.1943177431821823,
-0.003271501511335373,
0.036465056240558624,
0.09410196542739868,
0.01846383325755596,
0.054446954280138016,
0.12355354428291321,
-0.03337998315691948,
0.11120248585939407,
0.06908183544874191,
0.09975259751081467,
0.10259854793548584,
0.21278169751167297,
0.008818644098937511,
-0.20420463383197784,
0.025644095614552498,
-0.09515897929668427,
-0.005338720511645079,
0.12422468513250351,
0.14860181510448456,
-0.0988120287656784,
0.08199920505285263,
-0.014860249124467373,
-0.011655353009700775,
-0.03557303547859192,
-0.07019121944904327,
-0.03437522053718567,
0.04684819281101227,
0.05083300918340683,
0.058458320796489716,
-0.0004759027506224811,
0.08707091212272644,
-0.26740366220474243,
0.019087903201580048,
0.046006426215171814,
-0.00844853837043047,
0.0865015834569931,
0.10129158198833466,
-0.0397624596953392,
0.13128970563411713,
-0.034760791808366776,
0.13959383964538574,
0.08237355947494507,
-0.09614212810993195,
-0.22411824762821198,
-0.07014220207929611,
0.08653971552848816,
0.17552702128887177,
0.07672001421451569,
-0.04395952448248863,
0.1295768767595291,
-0.0924074649810791,
0.020061619579792023,
0.041444964706897736,
-0.09138372540473938,
-0.07157407701015472,
0.05500961095094681,
0.10402261465787888,
0.05199452489614487,
-0.1353532075881958,
-0.027786536142230034,
0.023208588361740112,
0.036523208022117615,
0.08116298913955688,
0.014606006443500519,
0.14655494689941406,
0.028648709878325462,
-0.14998315274715424,
-0.04007238894701004,
0.12892410159111023,
0.030228814110159874,
-0.038517750799655914,
-0.2261561155319214,
0.006055756472051144,
-0.08207837492227554,
-0.02847118303179741,
-0.051226627081632614,
0.03693482652306557,
0.0017866286216303706,
0.09731553494930267,
-0.03107457421720028,
-0.09191954880952835,
-0.01085467729717493,
0.09129106253385544,
0.04843943938612938,
0.023714596405625343,
-0.02320827543735504,
0.005523155443370342,
0.12197954207658768,
0.05083313211798668,
-0.12910759449005127,
-0.060821473598480225,
-0.07120278477668762,
-0.044372428208589554,
-0.04398733749985695,
0.03725236654281616,
0.034253284335136414,
0.056280918419361115,
0.24983061850070953,
-0.03523522987961769,
0.05504084378480911,
0.05744991451501846,
0.020601743832230568,
0.04300375655293465,
0.09559731185436249,
-0.05926046147942543,
-0.1513720452785492,
-0.013174066320061684,
0.0960920974612236,
-0.0036772575695067644,
-0.021742843091487885,
-0.04954633116722107,
0.038210052996873856,
0.03962017595767975,
0.10701151937246323,
0.09570838510990143,
-0.004919416271150112,
-0.07620739191770554,
-0.05127223581075668,
0.20753157138824463,
-0.147607684135437,
0.042259056121110916,
0.02207372896373272,
-0.015594509430229664,
-0.05114225670695305,
0.013563153333961964,
0.018195796757936478,
-0.02547607012093067,
0.1005428358912468,
-0.06698956340551376,
-0.03888751193881035,
-0.1142214834690094,
-0.025797180831432343,
0.03585217520594597,
0.010450964793562889,
-0.028879987075924873,
-0.03486150503158569,
-0.06265238672494888,
-0.09284557402133942,
0.1008237972855568,
-0.06457357853651047,
-0.06046295538544655,
-0.030450783669948578,
-0.09070669114589691,
0.02045322209596634,
0.02785527892410755,
0.10014168173074722,
-0.024220606312155724,
0.04299863055348396,
-0.01004557404667139,
0.06294021755456924,
0.07942742109298706,
0.035637300461530685,
-0.07000906020402908,
0.06124889850616455,
-0.20366689562797546,
0.08724482357501984,
-0.07773306220769882,
0.028578296303749084,
-0.16088712215423584,
-0.02044290490448475,
0.0036016395315527916,
0.022697916254401207,
0.03678199648857117,
0.15899917483329773,
-0.19847238063812256,
-0.03135470673441887,
0.16011151671409607,
-0.10484761744737625,
-0.12142323702573776,
0.041624486446380615,
-0.04963298887014389,
0.15966784954071045,
0.022792702540755272,
-0.005777034442871809,
0.09329923987388611,
-0.15099859237670898,
-0.02590417116880417,
-0.026783613488078117,
-0.004161354620009661,
0.10193873196840286,
0.08460481464862823,
-0.08336814492940903,
0.031122585758566856,
0.013792578130960464,
-0.041370000690221786,
-0.023667994886636734,
-0.05149602144956589,
-0.10817660391330719,
0.00310539617203176,
-0.08229676634073257,
0.026549918577075005,
-0.0079584876075387,
-0.0789434090256691,
-0.0108463354408741,
-0.16435840725898743,
-0.03416353091597557,
0.07953426241874695,
0.015295976772904396,
-0.018154967576265335,
-0.09447082132101059,
0.04128055274486542,
-0.025232087820768356,
-0.022191878408193588,
-0.1548198163509369,
-0.03200295567512512,
0.01790352165699005,
-0.13551989197731018,
0.00979374535381794,
-0.12306686490774155,
0.06709369271993637,
0.01443812157958746,
-0.06903327256441116,
-0.03456452488899231,
-0.01243849191814661,
0.007768309209495783,
-0.051930468529462814,
-0.24015015363693237,
-0.020823560655117035,
-0.05454397201538086,
0.1534179002046585,
-0.2292969524860382,
0.03954611346125603,
0.05107582360506058,
0.12773801386356354,
0.003950045444071293,
-0.06093154475092888,
0.031201137229800224,
-0.06812259554862976,
-0.02653426118195057,
-0.072655588388443,
-0.0031959384214133024,
-0.007014698814600706,
-0.04512464627623558,
0.017115700989961624,
-0.11757603287696838,
-0.03797215223312378,
0.10148635506629944,
0.06405945867300034,
-0.16711628437042236,
-0.02273024059832096,
-0.04613037779927254,
-0.06434614211320877,
-0.0845428854227066,
-0.060160405933856964,
0.1034717708826065,
0.05107305571436882,
0.039624687284231186,
-0.07340081036090851,
-0.06763333082199097,
0.010769062675535679,
-0.017530182376503944,
-0.02475883439183235,
0.11498381942510605,
0.07155054807662964,
-0.11456798762083054,
0.09703671187162399,
0.07266693562269211,
0.034246496856212616,
0.07816658914089203,
-0.027058683335781097,
-0.10618729889392853,
-0.029812106862664223,
0.04632946103811264,
0.015520608052611351,
0.15852820873260498,
-0.06928595155477524,
0.052912525832653046,
0.04526408761739731,
-0.036795031279325485,
0.0452008955180645,
-0.0973026305437088,
0.009762495756149292,
0.007700842339545488,
-0.015425536781549454,
0.016144562512636185,
-0.017006870359182358,
0.00971278641372919,
0.08625975996255875,
0.05244483798742294,
0.038263678550720215,
0.02964569628238678,
-0.02858326956629753,
-0.13132143020629883,
0.18367736041545868,
-0.097126305103302,
-0.2401144951581955,
-0.1555911749601364,
0.05932430922985077,
0.05536810681223869,
-0.018744371831417084,
0.026573235169053078,
-0.055290598422288895,
-0.10348799079656601,
-0.08127795904874802,
-0.0018320380477234721,
0.028834031894803047,
-0.055946704000234604,
-0.07552959769964218,
0.04827537387609482,
0.04250814765691757,
-0.11669281870126724,
0.03729432076215744,
0.059431154280900955,
-0.012045396491885185,
0.004491843748837709,
0.0586012527346611,
0.08725428581237793,
0.17935322225093842,
-0.008778807707130909,
-0.002924887230619788,
0.0479402057826519,
0.28161922097206116,
-0.1588558554649353,
0.1162000373005867,
0.12457982450723648,
-0.06381220370531082,
0.07975198328495026,
0.18955272436141968,
0.0323023796081543,
-0.10250310599803925,
0.03585449233651161,
0.03121950849890709,
-0.027301868423819542,
-0.2692923843860626,
-0.04797854274511337,
-0.012558883987367153,
-0.09533172845840454,
0.07855671644210815,
0.09082730114459991,
0.0858454555273056,
0.03862258791923523,
-0.06632458418607712,
-0.08970397710800171,
0.03655579686164856,
0.10149446129798889,
-0.011015145108103752,
0.0055600921623408794,
0.08375383168458939,
-0.033575840294361115,
0.008882422931492329,
0.09735552221536636,
-0.020280253142118454,
0.16386733949184418,
0.052247341722249985,
0.10975296795368195,
0.07963389158248901,
0.0894087553024292,
-0.0036525982432067394,
0.02519180439412594,
0.017281780019402504,
0.02391679398715496,
0.013328672386705875,
-0.08561091870069504,
0.032629840075969696,
0.11012618988752365,
0.03957906365394592,
0.02978535369038582,
0.00998382456600666,
-0.039224158972501755,
0.04983345419168472,
0.18270018696784973,
0.010937974788248539,
-0.2020755410194397,
-0.08197435736656189,
0.05732262507081032,
-0.07482830435037613,
-0.13613879680633545,
-0.015820156782865524,
0.030992785468697548,
-0.16632801294326782,
0.02093089185655117,
-0.04292873293161392,
0.10087257623672485,
-0.07618314027786255,
-0.037972018122673035,
0.09977278858423233,
0.06849559396505356,
-0.025875449180603027,
0.05679687485098839,
-0.19829612970352173,
0.12874209880828857,
0.028032664209604263,
0.06925363838672638,
-0.0862623080611229,
0.09810810536146164,
0.0016403973568230867,
0.00022841551981400698,
0.16997623443603516,
0.0018117899307981133,
-0.06915976852178574,
-0.06151696294546127,
-0.09548554569482803,
-0.01595810241997242,
0.10285594314336777,
-0.13037802278995514,
0.06550488620996475,
-0.018156372010707855,
-0.03292224556207657,
0.002918755169957876,
-0.07878871262073517,
-0.1296411156654358,
-0.17135953903198242,
0.056213293224573135,
-0.09777677059173584,
0.03310873731970787,
-0.09236916899681091,
-0.06500788778066635,
0.006954923737794161,
0.17732946574687958,
-0.19555382430553436,
-0.09678661078214645,
-0.15155962109565735,
-0.08430524170398712,
0.1621757447719574,
-0.042294811457395554,
0.08763790875673294,
-0.0006992825074121356,
0.1618642359972,
0.013844527304172516,
-0.004267666023224592,
0.10292565077543259,
-0.08854202181100845,
-0.19641070067882538,
-0.05858640745282173,
0.16903233528137207,
0.13367006182670593,
0.03741365671157837,
-0.0123937102034688,
0.02510468289256096,
-0.04979650303721428,
-0.11724548786878586,
0.025051934644579887,
0.13700759410858154,
0.07830961793661118,
-0.016996048390865326,
-0.03564208373427391,
-0.09467646479606628,
-0.06327393651008606,
-0.053889814764261246,
0.003958552610129118,
0.19154421985149384,
-0.0771714448928833,
0.1618814766407013,
0.11504010111093521,
-0.055141348391771317,
-0.20527765154838562,
0.05006660893559456,
0.05271307751536369,
0.01491944957524538,
0.037955041974782944,
-0.19302628934383392,
0.08449610322713852,
-0.002792536048218608,
-0.07195683568716049,
0.1686546951532364,
-0.1711827963590622,
-0.14475350081920624,
0.09616963565349579,
0.03805926814675331,
-0.22803117334842682,
-0.144056037068367,
-0.10231795907020569,
-0.018022805452346802,
-0.11065148562192917,
0.057700369507074356,
-0.0009834450902417302,
0.012204715050756931,
0.029331756755709648,
0.01700112223625183,
0.02773463912308216,
-0.04783860221505165,
0.20156502723693848,
-0.026620658114552498,
0.009645968675613403,
-0.050803907215595245,
-0.08929022401571274,
0.03113371878862381,
-0.04851984605193138,
0.10288829356431961,
0.0008060118998400867,
0.028640469536185265,
-0.15146908164024353,
-0.04350794479250908,
-0.057145193219184875,
0.03118651919066906,
-0.09822992235422134,
-0.08958588540554047,
-0.04674823209643364,
0.09503955394029617,
0.09616370499134064,
-0.029350243508815765,
0.0049018654972314835,
-0.08808459341526031,
0.07062564045190811,
0.2061457633972168,
0.19166786968708038,
0.07469252496957779,
-0.06762950867414474,
0.02133602648973465,
-0.033436186611652374,
0.0433998666703701,
-0.23225659132003784,
0.041024476289749146,
0.058239731937646866,
0.023343829438090324,
0.08593717962503433,
-0.009314903989434242,
-0.15332193672657013,
-0.07462562620639801,
0.08290237933397293,
-0.051120638847351074,
-0.1674211323261261,
-0.028783630579710007,
0.03499951213598251,
-0.20815572142601013,
-0.045045603066682816,
0.020816296339035034,
-0.022935092449188232,
-0.03888389840722084,
0.025203412398695946,
0.07721291482448578,
-0.01619059033691883,
0.10729877650737762,
0.09112436324357986,
0.09265044331550598,
-0.09836417436599731,
0.0770459994673729,
0.07560011744499207,
-0.04857173189520836,
0.02510337345302105,
0.11391250044107437,
-0.05003104731440544,
-0.037385791540145874,
0.083185575902462,
0.08730430901050568,
0.026338724419474602,
-0.050377007573843,
0.013850384391844273,
-0.05622924491763115,
0.0633346363902092,
0.12224308401346207,
0.027672111988067627,
-0.005194958299398422,
0.05762802064418793,
0.033403314650058746,
-0.09613504260778427,
0.10990279912948608,
0.05573276802897453,
0.019114062190055847,
-0.04555441066622734,
-0.029955226927995682,
-0.00770801305770874,
-0.013083512894809246,
-0.020021196454763412,
-0.004767982754856348,
-0.09446404874324799,
-0.009712337516248226,
-0.09333176910877228,
0.028948191553354263,
-0.07187854498624802,
0.008848866447806358,
0.02648242749273777,
-0.05521933361887932,
0.005334476009011269,
0.004076773766428232,
-0.07224726676940918,
-0.050886157900094986,
-0.014045567251741886,
0.08688664436340332,
-0.13213980197906494,
0.03212160989642143,
0.07266359031200409,
-0.10390472412109375,
0.07380658388137817,
-0.004253770224750042,
0.0074723889119923115,
0.009869659319519997,
-0.1613026112318039,
0.05902193859219551,
-0.02187931537628174,
-0.015551870688796043,
0.017850317060947418,
-0.20995107293128967,
-0.00848670955747366,
-0.05112685635685921,
-0.05156785994768143,
0.011753957718610764,
-0.026744084432721138,
-0.12500913441181183,
0.0965094268321991,
-0.003700636327266693,
-0.06694921106100082,
-0.018117429688572884,
0.037198904901742935,
0.09988336265087128,
-0.026223337277770042,
0.13024064898490906,
-0.028749262914061546,
0.07606945931911469,
-0.1754995584487915,
-0.0030784623231738806,
-0.01671811379492283,
0.03889745846390724,
-0.024309078231453896,
-0.024833159521222115,
0.05799832195043564,
-0.02053908444941044,
0.1737559735774994,
-0.02077670209109783,
0.07631205022335052,
0.05753037706017494,
0.008513612672686577,
0.00741326529532671,
0.08384902775287628,
0.060722313821315765,
-0.0023954210337251425,
-0.005406905896961689,
0.03896801918745041,
-0.005658421199768782,
-0.04032319411635399,
-0.15614990890026093,
0.07069823890924454,
0.1549290269613266,
0.0453864187002182,
0.024653466418385506,
0.027533741667866707,
-0.11539218574762344,
-0.07469619065523148,
0.1278480440378189,
-0.01059043500572443,
-0.03528245538473129,
-0.07531804591417313,
0.1789758801460266,
0.13622300326824188,
-0.19748330116271973,
0.07542150467634201,
-0.055151283740997314,
-0.05125879496335983,
-0.1321147382259369,
-0.15921591222286224,
-0.06342728435993195,
-0.042384300380945206,
-0.02174173668026924,
-0.06384839862585068,
0.05015945062041283,
0.04718686267733574,
0.004179064650088549,
-0.01801125705242157,
0.10866912454366684,
0.0111298318952322,
-0.021991629153490067,
0.05256103724241257,
0.06457983702421188,
0.032188184559345245,
-0.09587226063013077,
0.008660759776830673,
-0.005370547529309988,
0.014875974506139755,
0.061043936759233475,
0.01840699277818203,
-0.05432228371500969,
0.01596442610025406,
-0.018292615190148354,
-0.1153998151421547,
0.041081663221120834,
-0.01310122013092041,
-0.035758525133132935,
0.14260442554950714,
0.029665594920516014,
0.006796913221478462,
-0.021382983773946762,
0.23089437186717987,
-0.07574119418859482,
-0.07035340368747711,
-0.14695331454277039,
0.06946837902069092,
-0.06635992974042892,
0.03255327045917511,
0.02953065000474453,
-0.11735396832227707,
0.017487145960330963,
0.166885107755661,
0.1308896392583847,
-0.010424978099763393,
0.0127103915438056,
0.0481414869427681,
0.004356767050921917,
-0.028531156480312347,
0.017408154904842377,
0.05389042943716049,
0.1403871774673462,
-0.07027843594551086,
0.06541658192873001,
-0.011468660086393356,
-0.07632677257061005,
-0.018440628424286842,
0.10812637209892273,
-0.0006737037329003215,
0.003604974364861846,
-0.0696696937084198,
0.14224593341350555,
-0.08679819852113724,
-0.22620150446891785,
0.059672754257917404,
-0.07387839257717133,
-0.14711986482143402,
-0.04849543794989586,
0.013550695031881332,
-0.01215170044451952,
0.014455395750701427,
0.07522529363632202,
-0.04903700575232506,
0.17068269848823547,
0.043809082359075546,
-0.05564592033624649,
-0.0813327208161354,
0.05685770511627197,
-0.1354251652956009,
0.2828375995159149,
0.018338225781917572,
0.04615168273448944,
0.10663270950317383,
-0.017801061272621155,
-0.14192043244838715,
0.011115341447293758,
0.10684415698051453,
-0.0680256262421608,
0.059420645236968994,
0.1705193966627121,
0.0007127286517061293,
0.1261327713727951,
0.053895916789770126,
-0.05706224590539932,
0.04041490703821182,
-0.08960862457752228,
-0.05077595263719559,
-0.1094801276922226,
0.08181770890951157,
-0.08364015817642212,
0.16108252108097076,
0.12768596410751343,
-0.06758809089660645,
-0.006613167002797127,
-0.02095017395913601,
0.08278229087591171,
0.008397233672440052,
0.11297295987606049,
0.01208486221730709,
-0.18272413313388824,
0.035754308104515076,
0.011232861317694187,
0.10045328736305237,
-0.21050354838371277,
-0.06255286931991577,
0.04737021028995514,
-0.017753610387444496,
-0.08089037239551544,
0.11951220780611038,
0.04366198554635048,
0.0325021930038929,
-0.04012960195541382,
-0.055530186742544174,
0.0059353322722017765,
0.1491893082857132,
-0.11500366032123566,
-0.007098687347024679
] |
null | null | peft |
# MindWell
MindWell is a chat assistant trained by fine-tuning [lmsys/vicuna-7b-v1.5](https://huggingface.co/lmsys/vicuna-7b-v1.5) on two symptom-based depression datasets: [BDI-Sen](https://dl.acm.org/doi/abs/10.1145/3539618.3591905) and [PsySym](https://aclanthology.org/2022.emnlp-main.677/). To expedite and optimize the fine-tuning process, we have implemented Low Rank Adaptation (LoRA) techniques, ensuring enhanced efficiency and faster adaptation.
## How to Get Started with MindWell
Use the code below to get started with the model.
```python
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

# Read the adapter config to find the base model (lmsys/vicuna-7b-v1.5).
config = PeftConfig.from_pretrained("irlab-udc/MindWell")
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path, device_map="auto")
# Attach the MindWell LoRA adapter on top of the base model.
model = PeftModel.from_pretrained(model, "irlab-udc/MindWell")
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

def evaluate(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    input_ids = inputs["input_ids"].cuda()
    generation_output = model.generate(
        input_ids=input_ids,
        generation_config=GenerationConfig(do_sample=True),
        return_dict_in_generate=True,
        output_scores=True,
        max_new_tokens=512,
    )
    for s in generation_output.sequences:
        output = tokenizer.decode(s)
        print("Answer:", output)

evaluate("What can you do?")
```
```
Answer: I can analyze the user's comments to determine if they exhibit any signs of depressive symptoms based on the provided list of symptoms. I will justify my decisions by means of excerpts from the user's comments. If I don't know the answer, I will truthfully say that I don't know.
```
## Training
#### Configurations and Hyperparameters
The following `LoraConfig` config was used during training:
- r: 8
- lora_alpha: 16
- target_modules: ["q_proj", "v_proj"]
- lora_dropout: 0.05
- bias: "none"
- task_type: "CAUSAL_LM"
The following `TrainingArguments` config was used during training:
- per_device_train_batch_size: 64
- gradient_accumulation_steps: 32
- warmup_steps: 100
- num_train_epochs: 20
- learning_rate: 3e-4
- fp16=True
The following `bitsandbytes` quantization config was used during training (a combined code sketch follows these lists):
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
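
Taken together, these settings give an effective batch size of 64 × 32 = 2,048 examples per optimizer step on the single training GPU. The sketch below shows roughly how the listed values map onto `peft`, `transformers`, and `bitsandbytes` objects; it is a reconstruction for illustration, not the authors' training script, and the output directory, dataset preparation, and trainer call are assumptions left out or marked as placeholders.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_id = "lmsys/vicuna-7b-v1.5"  # base model named in this card

# 8-bit loading, matching the bitsandbytes settings above.
bnb_config = BitsAndBytesConfig(load_in_8bit=True, llm_int8_threshold=6.0)

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# LoRA adapter, matching the LoraConfig settings above.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Optimisation settings, matching the TrainingArguments above.
training_args = TrainingArguments(
    output_dir="mindwell-lora",  # placeholder path, not from the original script
    per_device_train_batch_size=64,
    gradient_accumulation_steps=32,
    warmup_steps=100,
    num_train_epochs=20,
    learning_rate=3e-4,
    fp16=True,
)
```

A `transformers.Trainer` (or an equivalent supervised fine-tuning loop) would then combine `model`, `training_args`, and the BDI-Sen/PsySym-derived instruction data; that part is omitted here.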
#### Framework versions
- PyTorch 2.1.0
- PEFT 0.5.0
- 🤗 Transformers 4.34.0
- 🤗 Datasets 2.14.5
- 🤗 Tokenizers 0.14.0
## Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** NVIDIA RTX 6000 Ada Generation.
- **Hours used:** 20.
- **Cloud Provider:** Private infrastructure.
- **Carbon Emitted:** 2.59 Kg. CO2 eq. | {"language": ["en"], "license": "llama2", "library_name": "peft", "tags": ["medical"], "base_model": "lmsys/vicuna-7b-v1.5", "inference": false} | null | irlab-udc/MindWell | [
"peft",
"medical",
"en",
"arxiv:1910.09700",
"base_model:lmsys/vicuna-7b-v1.5",
"license:llama2",
"region:us"
] | 2024-02-12T11:00:03+00:00 | [
"1910.09700"
] | [
"en"
] | TAGS
#peft #medical #en #arxiv-1910.09700 #base_model-lmsys/vicuna-7b-v1.5 #license-llama2 #region-us
|
# MindWell
MindWell is a chat assistant trained by fine-tuning lmsys/vicuna-7b-v1.5 on two symptom-based depression datasets: BDI-Sen and PsySym. To expedite and optimize the fine-tuning process, we have implemented Low Rank Adaptation (LoRA) techniques, ensuring enhanced efficiency and faster adaptation.
## How to Get Started with MindWell
Use the code below to get started with the model.
## Training
#### Configurations and Hyperparameters
The following 'LoraConfig' config was used during training:
- r: 8
- lora_alpha: 16
- target_modules: ["q_proj", "v_proj"]
- lora_dropout: 0.05
- bias: "none"
- task_type: "CAUSAL_LM"
The following 'TrainingArguments' config was used during training:
- per_device_train_batch_size: 64
- gradient_accumulation_steps: 32
- warmup_steps: 100
- num_train_epochs: 20
- learning_rate: 3e-4
- fp16=True
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: fp4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float32
#### Framework versions
- PyTorch 2.1.0
- PEFT 0.5.0
- Transformers 4.34.0
- Datasets 2.14.5
- Tokenizers 0.14.0
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: NVIDIA RTX 6000 Ada Generation.
- Hours used: 20.
- Cloud Provider: Private infrastructure.
- Carbon Emitted: 2.59 Kg. CO2 eq. | [
"# MindWell\n\nMindWell is a chat assistant trained by fine-tuning lmsys/vicuna-7b-v1.5 on two symptom-based depression datasets: BDI-Sen and PsySym. To expedite and optimize the fine-tuning process, we have implemented Low Rank Adaptation (LoRA) techniques, ensuring enhanced efficiency and faster adaptation.",
"## How to Get Started with MindWell\n\nUse the code below to get started with the model.",
"## Training",
"#### Configurations and Hyperparameters\n\nThe following 'LoraConfig' config was used during training:\n- r: 8\n- lora_alpha: 16\n- target_modules: [\"q_proj\", \"v_proj\"]\n- lora_dropout: 0.05\n- bias: \"none\"\n- task_type: \"CAUSAL_LM\"\n\nThe following 'TrainingArguments' config was used during training:\n- per_device_train_batch_size: 64\n- gradient_accumulation_steps: 32\n- warmup_steps: 100\n- num_train_epochs: 20\n- learning_rate: 3e-4\n- fp16=True\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"#### Framework versions\n\n- PyTorch 2.1.0\n- PEFT 0.5.0\n- Transformers 4.34.0\n- Datasets 2.14.5\n- Tokenizers 0.14.0",
"## Environmental Impact\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: NVIDIA RTX 6000 Ada Generation.\n- Hours used: 20.\n- Cloud Provider: Private infrastructure.\n- Carbon Emitted: 2.59 Kg. CO2 eq."
] | [
"TAGS\n#peft #medical #en #arxiv-1910.09700 #base_model-lmsys/vicuna-7b-v1.5 #license-llama2 #region-us \n",
"# MindWell\n\nMindWell is a chat assistant trained by fine-tuning lmsys/vicuna-7b-v1.5 on two symptom-based depression datasets: BDI-Sen and PsySym. To expedite and optimize the fine-tuning process, we have implemented Low Rank Adaptation (LoRA) techniques, ensuring enhanced efficiency and faster adaptation.",
"## How to Get Started with MindWell\n\nUse the code below to get started with the model.",
"## Training",
"#### Configurations and Hyperparameters\n\nThe following 'LoraConfig' config was used during training:\n- r: 8\n- lora_alpha: 16\n- target_modules: [\"q_proj\", \"v_proj\"]\n- lora_dropout: 0.05\n- bias: \"none\"\n- task_type: \"CAUSAL_LM\"\n\nThe following 'TrainingArguments' config was used during training:\n- per_device_train_batch_size: 64\n- gradient_accumulation_steps: 32\n- warmup_steps: 100\n- num_train_epochs: 20\n- learning_rate: 3e-4\n- fp16=True\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32",
"#### Framework versions\n\n- PyTorch 2.1.0\n- PEFT 0.5.0\n- Transformers 4.34.0\n- Datasets 2.14.5\n- Tokenizers 0.14.0",
"## Environmental Impact\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: NVIDIA RTX 6000 Ada Generation.\n- Hours used: 20.\n- Cloud Provider: Private infrastructure.\n- Carbon Emitted: 2.59 Kg. CO2 eq."
] | [
45,
85,
20,
2,
324,
36,
69
] | [
"passage: TAGS\n#peft #medical #en #arxiv-1910.09700 #base_model-lmsys/vicuna-7b-v1.5 #license-llama2 #region-us \n# MindWell\n\nMindWell is a chat assistant trained by fine-tuning lmsys/vicuna-7b-v1.5 on two symptom-based depression datasets: BDI-Sen and PsySym. To expedite and optimize the fine-tuning process, we have implemented Low Rank Adaptation (LoRA) techniques, ensuring enhanced efficiency and faster adaptation.## How to Get Started with MindWell\n\nUse the code below to get started with the model.## Training#### Configurations and Hyperparameters\n\nThe following 'LoraConfig' config was used during training:\n- r: 8\n- lora_alpha: 16\n- target_modules: [\"q_proj\", \"v_proj\"]\n- lora_dropout: 0.05\n- bias: \"none\"\n- task_type: \"CAUSAL_LM\"\n\nThe following 'TrainingArguments' config was used during training:\n- per_device_train_batch_size: 64\n- gradient_accumulation_steps: 32\n- warmup_steps: 100\n- num_train_epochs: 20\n- learning_rate: 3e-4\n- fp16=True\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32"
] | [
-0.07730504870414734,
0.08269818127155304,
-0.0059975371696054935,
0.05634741485118866,
0.06616093963384628,
0.052694302052259445,
0.09035887569189072,
0.15587027370929718,
0.10430323332548141,
0.17078477144241333,
0.06003142520785332,
0.13831780850887299,
0.08627493679523468,
0.14320477843284607,
0.0039131916128098965,
-0.12047071754932404,
0.032789021730422974,
-0.062264811247587204,
-0.012917649000883102,
0.05220431461930275,
0.055637914687395096,
-0.07897192239761353,
0.010974254459142685,
-0.0009159564506262541,
-0.027290593832731247,
-0.04588436707854271,
-0.007132907398045063,
0.019787341356277466,
0.03420161083340645,
-0.007088230457156897,
-0.016502458602190018,
-0.008788633160293102,
0.010356713086366653,
-0.24327178299427032,
0.0017563437577337027,
0.044549278914928436,
0.04283735901117325,
0.08999132364988327,
0.07296363264322281,
-0.01309886109083891,
0.0821836069226265,
-0.2022733837366104,
0.09164374321699142,
0.04395950213074684,
-0.09681954979896545,
-0.17706403136253357,
-0.12682004272937775,
0.09957168996334076,
0.11729886382818222,
0.08593327552080154,
-0.019795645028352737,
0.16225570440292358,
-0.05720304325222969,
0.06214863806962967,
0.12079223245382309,
-0.2619060277938843,
-0.016951043158769608,
0.009808100759983063,
0.0041979397647082806,
-0.02491644024848938,
-0.11713507026433945,
-0.0664532259106636,
-0.0424615778028965,
0.043388091027736664,
-0.0020629914943128824,
-0.013255598023533821,
0.011009815149009228,
-0.02710188925266266,
-0.10075774043798447,
-0.004579171072691679,
0.0417531281709671,
0.05677299201488495,
-0.011585102416574955,
-0.2068508416414261,
-0.03953343257308006,
-0.13889946043491364,
-0.005414698738604784,
-0.030335789546370506,
-0.008442588150501251,
0.0022708524484187365,
0.09185042232275009,
-0.022924507036805153,
-0.018048102036118507,
-0.026855189353227615,
0.023230310529470444,
-0.005278680007904768,
0.03409513458609581,
-0.005542302969843149,
0.074928417801857,
0.09096506237983704,
0.02514779195189476,
-0.10327719897031784,
-0.02954556979238987,
-0.022457344457507133,
-0.13979801535606384,
-0.02275339886546135,
-0.0028275533113628626,
0.012411842122673988,
0.06274014711380005,
0.18040934205055237,
0.012320229783654213,
0.08726712316274643,
-0.01930953375995159,
0.010278427042067051,
0.001993135316297412,
0.07480128854513168,
-0.09534517675638199,
-0.0934867411851883,
0.013665263541042805,
0.12978525459766388,
0.023267116397619247,
0.016120819374918938,
-0.03092365711927414,
-0.003032494569197297,
0.059288300573825836,
0.04101023077964783,
-0.007510720286518335,
0.03286435827612877,
-0.13921542465686798,
-0.04900050535798073,
0.09529536217451096,
-0.14177200198173523,
0.03470322862267494,
0.049798958003520966,
-0.06331165879964828,
0.07502985000610352,
0.07813001424074173,
-0.05238526314496994,
-0.09478172659873962,
0.023431405425071716,
-0.04892106354236603,
-0.024603543803095818,
-0.0725354254245758,
-0.0624392032623291,
0.0453682504594326,
-0.023231150582432747,
-0.031225750222802162,
-0.015249001793563366,
-0.13722705841064453,
-0.09127388149499893,
0.07061418890953064,
-0.10892347246408463,
0.019714048132300377,
-0.07015866041183472,
-0.08835134655237198,
0.023432621732354164,
-0.008530769497156143,
0.07306478917598724,
-0.04648132994771004,
0.042684562504291534,
-0.08080761134624481,
0.08318006992340088,
0.046517714858055115,
-0.00748957134783268,
-0.025489535182714462,
0.05429217964410782,
-0.172540083527565,
0.14000147581100464,
-0.07175489515066147,
-0.0024911535438150167,
-0.13170698285102844,
-0.03960994631052017,
0.014918250031769276,
-0.025778675451874733,
0.09365871548652649,
0.10755002498626709,
-0.14172416925430298,
-0.010773583315312862,
0.14022821187973022,
-0.025690434500575066,
-0.1134999543428421,
0.08197757601737976,
-0.047180548310279846,
0.08682157844305038,
0.0653277337551117,
0.10863425582647324,
0.11740890890359879,
-0.15840311348438263,
-0.0778798758983612,
-0.046998146921396255,
-0.02804684452712536,
0.04819274693727493,
0.022943686693906784,
-0.006295889150351286,
0.05907982960343361,
0.01817837357521057,
-0.003908011596649885,
0.05094902589917183,
-0.025421934202313423,
-0.03243188187479973,
0.005980382673442364,
-0.08901874721050262,
-0.044755157083272934,
-0.020698517560958862,
-0.035352159291505814,
-0.039439715445041656,
-0.10374926775693893,
0.046192463487386703,
0.14479853212833405,
-0.0173508208245039,
-0.011864796280860901,
-0.15276969969272614,
0.007809002883732319,
0.005187934264540672,
0.01941409520804882,
-0.15910032391548157,
-0.09036519378423691,
0.019306937232613564,
-0.04274043068289757,
0.026238352060317993,
0.012684850953519344,
0.07395555078983307,
0.03225993737578392,
-0.02213636413216591,
-0.02486577443778515,
0.0293898768723011,
-0.0005803984240628779,
-0.05915895849466324,
-0.22227776050567627,
-0.0012958121951669455,
-0.03599437326192856,
0.19799739122390747,
-0.1376592516899109,
-0.0037701341789215803,
0.039375659078359604,
0.11337964981794357,
0.028185484930872917,
-0.06929502636194229,
0.053492773324251175,
-0.017620345577597618,
0.05887630209326744,
-0.051834359765052795,
0.020292507484555244,
-0.0005162862944416702,
-0.06644750386476517,
-0.008288010954856873,
-0.13854673504829407,
-0.10783495754003525,
0.06884127110242844,
0.10676625370979309,
-0.09336819499731064,
-0.044978510588407516,
-0.039223674684762955,
-0.03966899216175079,
0.004922759253531694,
-0.04476858302950859,
0.15968282520771027,
0.0991518571972847,
0.06350881606340408,
-0.011955004185438156,
-0.06864789873361588,
-0.01947370357811451,
0.012904388830065727,
-0.0005608562496490777,
0.1377478986978531,
-0.0019300359999760985,
-0.11511023342609406,
0.025752630084753036,
0.0392209030687809,
-0.008353245444595814,
0.08989547193050385,
-0.021061427891254425,
-0.0641760602593422,
-0.11710140854120255,
0.08253537863492966,
0.05439376085996628,
0.07731346786022186,
-0.13333792984485626,
0.06254112720489502,
0.056983355432748795,
-0.020918210968375206,
0.015378091484308243,
-0.08998794853687286,
0.02762165293097496,
-0.03170733526349068,
-0.023094303905963898,
0.09082794934511185,
0.02798270806670189,
0.0712641105055809,
0.08289133012294769,
0.04752696678042412,
0.027660109102725983,
-0.007270479574799538,
-0.06827273219823837,
-0.10740949958562851,
0.10453880578279495,
-0.17011874914169312,
-0.12868325412273407,
-0.12456319481134415,
-0.0048461235128343105,
-0.04020651802420616,
-0.04384767636656761,
-0.051114022731781006,
-0.07164277881383896,
-0.08718562126159668,
-0.06956002116203308,
-0.021425964310765266,
0.09525492042303085,
-0.025375911965966225,
0.06382234394550323,
0.0415787510573864,
0.12258079648017883,
-0.07652075588703156,
-0.019385073333978653,
-0.010723001323640347,
0.010195556096732616,
0.0010994637850672007,
0.05646819621324539,
-0.03194087743759155,
0.13668814301490784,
0.08876240253448486,
0.03637327998876572,
0.02102949284017086,
0.2858497202396393,
-0.06717219948768616,
0.08773206919431686,
0.13178770244121552,
0.010391313582658768,
0.02603689767420292,
0.1250266134738922,
0.03416455537080765,
-0.109721839427948,
0.045376844704151154,
0.06232689693570137,
0.01892942748963833,
-0.22763323783874512,
-0.06406106799840927,
-0.06642071902751923,
-0.08302884548902512,
0.09321657568216324,
0.05813485383987427,
-0.020173298195004463,
0.026506878435611725,
-0.0066777002066373825,
0.02216276526451111,
0.055809225887060165,
0.0821632444858551,
0.10925308614969254,
0.005754195619374514,
0.05887481942772865,
-0.013006923720240593,
0.011122655123472214,
0.06654565036296844,
0.023570548743009567,
0.09111274033784866,
-0.07277917861938477,
0.11880877614021301,
0.08848980069160461,
0.09316076338291168,
-0.057285044342279434,
0.040724631398916245,
-0.024030286818742752,
0.00789109617471695,
0.020716417580842972,
-0.07314389199018478,
-0.06962022185325623,
0.058410532772541046,
0.03666165471076965,
0.01611551269888878,
0.0165164265781641,
-0.0036597298458218575,
0.0923251062631607,
0.132883682847023,
0.14456559717655182,
-0.19848553836345673,
-0.057021304965019226,
0.021908681839704514,
-0.035591885447502136,
-0.09920667111873627,
-0.023352637887001038,
0.06855025142431259,
-0.08866661041975021,
0.048802658915519714,
-0.046763867139816284,
0.060305021703243256,
-0.14246714115142822,
-0.0061835357919335365,
0.04361449554562569,
0.08368851244449615,
0.02522522769868374,
0.018484676256775856,
-0.12962758541107178,
0.05740286037325859,
0.03133147954940796,
0.10137432813644409,
-0.08699307590723038,
0.0517844595015049,
0.03154301643371582,
-0.10179886966943741,
0.11954654008150101,
0.0006969945388846099,
0.017922604456543922,
-0.1286463886499405,
-0.11504903435707092,
-0.043070368468761444,
0.0760953426361084,
-0.03485036641359329,
0.08037843555212021,
-0.023640232160687447,
-0.04164190590381622,
0.0017713879933580756,
-0.09033847600221634,
-0.06546350568532944,
-0.16720840334892273,
0.07877549529075623,
0.053390227258205414,
-0.010151070542633533,
-0.040957849472761154,
-0.030084898695349693,
-0.0532274954020977,
0.1632639765739441,
-0.16586335003376007,
-0.0536809042096138,
-0.11507907509803772,
0.08938294649124146,
0.17448385059833527,
-0.048540618270635605,
0.01524078194051981,
0.003898862050846219,
0.032943740487098694,
-0.0005071984487585723,
-0.061605777591466904,
0.0436236672103405,
-0.05343782156705856,
-0.19945095479488373,
-0.051668696105480194,
0.16368809342384338,
0.1002512201666832,
0.03538942709565163,
-0.01094844751060009,
0.022083519026637077,
0.013852582313120365,
-0.10062380880117416,
0.06711810827255249,
0.15650278329849243,
0.06323925405740738,
0.05019686743617058,
-0.09743550419807434,
0.050247691571712494,
-0.060351938009262085,
-0.0008823768002912402,
0.0488082654774189,
0.3292386829853058,
-0.0402492918074131,
0.10506550222635269,
0.05889805033802986,
-0.058517150580883026,
-0.1500403881072998,
0.007266052532941103,
0.11114103347063065,
0.02893781289458275,
0.008489404805004597,
-0.1666957437992096,
0.13620276749134064,
0.07875800132751465,
-0.010571441613137722,
0.05615135282278061,
-0.2703274190425873,
-0.10828433185815811,
0.06460211426019669,
0.053056810051202774,
0.018282337114214897,
-0.08513756841421127,
-0.03675011917948723,
-0.01985086500644684,
-0.029674379155039787,
0.10646709054708481,
-0.0006029274663887918,
0.10106900334358215,
-0.020298950374126434,
0.029662277549505234,
0.05962751805782318,
-0.05437247455120087,
0.16475984454154968,
0.012586016207933426,
0.055623725056648254,
-0.05113716796040535,
0.010556007735431194,
0.022397341206669807,
-0.10312497615814209,
0.08456224948167801,
-0.01638146862387657,
0.020627792924642563,
-0.13671447336673737,
-0.03582446277141571,
-0.08398524671792984,
-0.029304571449756622,
-0.022506963461637497,
-0.027447057887911797,
-0.061517611145973206,
0.1264183521270752,
0.11498302966356277,
-0.010851344093680382,
-0.048479869961738586,
-0.039011213928461075,
-0.03952757269144058,
0.15753398835659027,
0.03722357749938965,
0.015826445072889328,
-0.09354016184806824,
0.051968853920698166,
0.015955615788698196,
0.00424604956060648,
-0.03337514400482178,
0.031297583132982254,
0.12938949465751648,
-0.013879490084946156,
0.1461741030216217,
0.03047821670770645,
-0.14123518764972687,
-0.04276444762945175,
0.07583215832710266,
-0.10484082251787186,
-0.11854095757007599,
0.03030405379831791,
-0.011533890850841999,
-0.11456415802240372,
-0.0926152765750885,
0.15482372045516968,
0.02670719474554062,
0.0070959413424134254,
0.05229939520359039,
0.06833408027887344,
-0.044381480664014816,
0.14219892024993896,
0.009374831803143024,
0.06385008990764618,
-0.06895537674427032,
0.04654134437441826,
0.10043230652809143,
-0.09165425598621368,
0.07326073199510574,
0.13411061465740204,
-0.09000447392463684,
-0.02752840332686901,
0.025385508313775063,
0.08268590271472931,
0.12034860998392105,
-0.00200815312564373,
-0.034689344465732574,
-0.08875961601734161,
0.07770463824272156,
0.1282687783241272,
-0.019861457869410515,
0.04298778623342514,
-0.02127700299024582,
-0.007786707952618599,
-0.11457916349172592,
0.13333341479301453,
0.03252876177430153,
-0.0018191308481618762,
-0.047433797270059586,
0.04282520338892937,
-0.041616011410951614,
-0.0053627886809408665,
0.0022836404386907816,
0.007673162501305342,
-0.15715783834457397,
-0.003416076535359025,
-0.08000480383634567,
0.020360123366117477,
-0.07766830921173096,
-0.017877621576189995,
0.024654220789670944,
0.04313625022768974,
-0.004720277618616819,
0.015801172703504562,
-0.056765440851449966,
-0.08851588517427444,
0.011647907085716724,
0.06963251531124115,
-0.12881462275981903,
-0.05425069481134415,
0.028769034892320633,
-0.15498225390911102,
0.0943991169333458,
0.024920135736465454,
0.00273777823895216,
-0.0006615730817429721,
-0.03691764920949936,
-0.04303257167339325,
0.022683965042233467,
0.08343273401260376,
0.059204377233982086,
-0.18284784257411957,
0.007040279917418957,
-0.06512129306793213,
-0.019879724830389023,
0.048708975315093994,
0.10658368468284607,
-0.06381439417600632,
-0.020133255049586296,
-0.01806361973285675,
-0.016405155882239342,
-0.07013002038002014,
0.03706038370728493,
0.10351497679948807,
0.023804709315299988,
0.09809992462396622,
-0.0763312503695488,
0.06464944779872894,
-0.16751380264759064,
-0.027246082201600075,
0.014533882029354572,
0.015917951241135597,
0.05451631918549538,
-0.025724569335579872,
0.11657287925481796,
-0.025478534400463104,
0.0699169710278511,
-0.10492710769176483,
-0.11521734297275543,
0.019748462364077568,
-0.061254557222127914,
-0.08220412582159042,
0.004754520021378994,
0.03472086787223816,
0.036936089396476746,
-0.020988833159208298,
0.02095431461930275,
-0.019927673041820526,
0.033566806465387344,
0.06282256543636322,
0.14942583441734314,
0.14131274819374084,
0.10466687381267548,
0.03945642709732056,
0.02211008034646511,
-0.1346844881772995,
-0.015766140073537827,
0.2374095320701599,
-0.11967217922210693,
0.08608147501945496,
-0.09828045219182968,
0.13336940109729767,
0.028252484276890755,
-0.15607765316963196,
0.06254124641418457,
-0.029565611854195595,
-0.10967257618904114,
-0.07794532924890518,
-0.14101722836494446,
-0.04774704575538635,
-0.06849708408117294,
-0.006158621981739998,
-0.0903724730014801,
0.012225772254168987,
0.05632952228188515,
0.06790425628423691,
0.021240290254354477,
0.11403699964284897,
-0.1277414709329605,
0.0065734172239899635,
0.07384288311004639,
0.04471031576395035,
-0.045783597975969315,
-0.04217031970620155,
-0.0076060728169977665,
0.025112492963671684,
0.032876331359148026,
0.08342092484235764,
0.03565484285354614,
0.031947605311870575,
-0.004241249058395624,
0.009826376102864742,
-0.08783093839883804,
0.023065559566020966,
-0.004427249077707529,
-0.0010761007433757186,
0.18512554466724396,
0.0764879584312439,
-0.026273399591445923,
-0.06882477551698685,
0.11865051090717316,
-0.05781419202685356,
-0.10496708750724792,
-0.18067805469036102,
0.09334738552570343,
-0.002297281986102462,
-0.023904362693428993,
0.00825403444468975,
-0.0983147844672203,
-0.03798777982592583,
0.16918513178825378,
0.11357124894857407,
-0.12932872772216797,
-0.011434581130743027,
0.052234530448913574,
-0.0028279381804168224,
-0.07032763957977295,
0.08742307126522064,
0.07618393748998642,
0.10478639602661133,
-0.014258568175137043,
0.047350604087114334,
-0.02722625620663166,
-0.06865320354700089,
-0.02548782154917717,
0.027975814417004585,
-0.0064770826138556,
0.03916367515921593,
-0.03480399772524834,
0.054131779819726944,
-0.05369611456990242,
-0.19486962258815765,
0.1496739387512207,
-0.14194144308567047,
-0.1648445874452591,
-0.04420594871044159,
-0.04204878956079483,
-0.03786400705575943,
0.02306375466287136,
0.003476861398667097,
-0.0024969014339149,
0.20031508803367615,
-0.03408070653676987,
-0.028082309290766716,
-0.06704887002706528,
0.04529393091797829,
0.002995241666212678,
0.17600975930690765,
0.014476871117949486,
0.07118150591850281,
0.13888958096504211,
-0.030955687165260315,
-0.17163431644439697,
0.03743770346045494,
0.04499713331460953,
-0.1364956498146057,
0.01129812654107809,
0.1604674905538559,
0.001463254215195775,
0.13830606639385223,
0.07294608652591705,
-0.09007079899311066,
-0.02087016962468624,
-0.058219488710165024,
-0.033921800553798676,
-0.1122155413031578,
0.02972199022769928,
-0.08549699932336807,
0.12921713292598724,
0.1999424397945404,
-0.035412974655628204,
0.035595741122961044,
-0.021928196772933006,
0.029311802238225937,
0.004655726719647646,
0.11112838238477707,
-0.035621821880340576,
-0.18690890073776245,
0.10724461823701859,
-0.02879040315747261,
0.08424296975135803,
-0.22667109966278076,
-0.12988105416297913,
-0.005054569337517023,
-0.03243540972471237,
-0.06955333054065704,
0.16408130526542664,
-0.041977617889642715,
0.025692544877529144,
-0.030341878533363342,
-0.10884720832109451,
-0.003066739533096552,
0.09920985996723175,
-0.10498026013374329,
-0.03433593735098839
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
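
Since this section of the card is still a placeholder, the snippet below is only a minimal sketch of the standard PEFT adapter-loading flow, assuming the base model (`meta-llama/Llama-2-7b-chat-hf`) and the adapter ID given in this card's metadata; the prompt is purely illustrative and is not taken from the adapter's documentation.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-chat-hf"  # base model from the card metadata
adapter_id = ("shivanikerai/Llama-2-7b-chat-hf-adapter-"
              "sku-title-description-ner-generation-marico-v1.0")

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
# Attach the PEFT adapter on top of the base chat model.
model = PeftModel.from_pretrained(model, adapter_id)

# Illustrative prompt only; the expected input format is not documented in this card.
prompt = "Generate a title, description and named entities for this SKU: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```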
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "meta-llama/Llama-2-7b-chat-hf"} | null | shivanikerai/Llama-2-7b-chat-hf-adapter-sku-title-description-ner-generation-marico-v1.0 | [
"peft",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | 2024-02-12T11:02:29+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-7b-chat-hf #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-7b-chat-hf #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
38,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-7b-chat-hf #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.1097489595413208,
0.19965529441833496,
-0.0029093523044139147,
0.02977496199309826,
0.08865993469953537,
0.020992767065763474,
0.04617491737008095,
0.13436155021190643,
-0.0122890155762434,
0.10603273659944534,
0.06528570502996445,
0.09982994943857193,
0.11414647847414017,
0.22117121517658234,
0.008661055937409401,
-0.19818119704723358,
0.02392975240945816,
-0.09021910279989243,
-0.008825909346342087,
0.1210189089179039,
0.14740028977394104,
-0.09894569218158722,
0.08424650132656097,
-0.0056873951107263565,
-0.008893657475709915,
-0.02980463020503521,
-0.07571642100811005,
-0.021988803520798683,
0.04101024195551872,
0.04730468988418579,
0.05011952668428421,
-0.0026592575013637543,
0.0872035101056099,
-0.26955920457839966,
0.019151655957102776,
0.04484740272164345,
-0.0026050545275211334,
0.08793988078832626,
0.09100331366062164,
-0.04279746115207672,
0.13107092678546906,
-0.029642820358276367,
0.13622359931468964,
0.08729755878448486,
-0.08290641754865646,
-0.22245174646377563,
-0.0685657411813736,
0.08323489874601364,
0.1859087347984314,
0.07741431891918182,
-0.040737878531217575,
0.12529872357845306,
-0.08601926267147064,
0.01631336659193039,
0.04629611223936081,
-0.08685805648565292,
-0.06553229689598083,
0.062460605055093765,
0.10471820086240768,
0.061145562678575516,
-0.12969349324703217,
-0.030036436393857002,
0.02531454712152481,
0.033760916441679,
0.0762089416384697,
0.011855230666697025,
0.16021670401096344,
0.033228375017642975,
-0.1405784636735916,
-0.04224565625190735,
0.14612790942192078,
0.033758267760276794,
-0.03398217633366585,
-0.22321653366088867,
-0.0009301623213104904,
-0.09518437832593918,
-0.02987043373286724,
-0.04406297579407692,
0.0417029894888401,
0.002315347082912922,
0.1102258637547493,
-0.03279596567153931,
-0.08844900876283646,
-0.016932649537920952,
0.09914511442184448,
0.045378677546978,
0.02553815394639969,
-0.016274455934762955,
0.0037991050630807877,
0.1283528357744217,
0.06785524636507034,
-0.13458992540836334,
-0.06278920918703079,
-0.07116561383008957,
-0.045561533421278,
-0.0355088971555233,
0.03829069435596466,
0.04880223795771599,
0.05905542150139809,
0.24367274343967438,
-0.02556382119655609,
0.06690357625484467,
0.07187432795763016,
0.019574804231524467,
0.051900845021009445,
0.09590231627225876,
-0.057793986052274704,
-0.16486790776252747,
-0.012440260499715805,
0.0971127599477768,
-0.006702732294797897,
-0.02692808210849762,
-0.06152992323040962,
0.04885540530085564,
0.029513226822018623,
0.10595010221004486,
0.09877003729343414,
-0.011269476264715195,
-0.07271049171686172,
-0.06290774792432785,
0.20190829038619995,
-0.15416783094406128,
0.04069993644952774,
0.020708607509732246,
-0.02069385163486004,
-0.045518483966588974,
0.010804135352373123,
0.01757807843387127,
-0.030719280242919922,
0.08147570490837097,
-0.07056427747011185,
-0.03961678594350815,
-0.1222657561302185,
-0.02327624335885048,
0.028196869418025017,
0.009746973402798176,
-0.03046281822025776,
-0.031196700409054756,
-0.06462333351373672,
-0.09444823861122131,
0.10479193180799484,
-0.06643617898225784,
-0.061557602137327194,
-0.030483780428767204,
-0.08981305360794067,
0.02254730835556984,
0.027911558747291565,
0.09077779948711395,
-0.027895735576748848,
0.040625639259815216,
-0.011112388223409653,
0.06572747975587845,
0.07461882382631302,
0.03578711673617363,
-0.06424850225448608,
0.06015384569764137,
-0.20406599342823029,
0.08556332439184189,
-0.08446065336465836,
0.03385736048221588,
-0.16098789870738983,
-0.01247160229831934,
0.014834500849246979,
0.02343825064599514,
0.030182762071490288,
0.16115155816078186,
-0.2115187644958496,
-0.03635507822036743,
0.1532590687274933,
-0.09581614285707474,
-0.11948860436677933,
0.03439079225063324,
-0.048357971012592316,
0.16117459535598755,
0.017020463943481445,
0.0018450876232236624,
0.0983242467045784,
-0.15128687024116516,
-0.0230529997497797,
-0.015843115746974945,
-0.0012368750758469105,
0.09137727320194244,
0.08664927631616592,
-0.08640901744365692,
0.03284556791186333,
0.01722603663802147,
-0.0544295534491539,
-0.027559028938412666,
-0.04327577352523804,
-0.10873787850141525,
0.006965435575693846,
-0.07952671498060226,
0.013697277754545212,
-0.01072197500616312,
-0.08107749372720718,
-0.00446817884221673,
-0.16061486303806305,
-0.03408057615160942,
0.09041638672351837,
0.007928465493023396,
-0.020917540416121483,
-0.1060028225183487,
0.046736665070056915,
-0.026493346318602562,
-0.021115737035870552,
-0.14343948662281036,
-0.013705371879041195,
0.018003713339567184,
-0.13926094770431519,
0.0067591541446745396,
-0.10391131043434143,
0.06531371921300888,
0.006667348090559244,
-0.055276401340961456,
-0.03745187819004059,
-0.008435043506324291,
0.008067243732511997,
-0.05036483332514763,
-0.24700452387332916,
-0.028853783383965492,
-0.0472220778465271,
0.1697845607995987,
-0.22070062160491943,
0.03759501501917839,
0.05085914582014084,
0.13595159351825714,
-0.0016047356184571981,
-0.061770617961883545,
0.026718933135271072,
-0.07498997449874878,
-0.02612743154168129,
-0.07308053225278854,
-0.005071202293038368,
-0.004502609837800264,
-0.04442371800541878,
0.012331030331552029,
-0.11311253905296326,
-0.04569253697991371,
0.10320332646369934,
0.06468506157398224,
-0.146511510014534,
-0.008327248506247997,
-0.04162632301449776,
-0.06364759057760239,
-0.07115332782268524,
-0.06655067205429077,
0.11369676142930984,
0.05197574570775032,
0.0431116484105587,
-0.07517135888338089,
-0.07446738332509995,
0.010255836881697178,
-0.020570721477270126,
-0.01626063883304596,
0.11025681346654892,
0.08404304832220078,
-0.1041274294257164,
0.0926150381565094,
0.07018421590328217,
0.03671332448720932,
0.09441360831260681,
-0.02397226169705391,
-0.10423600673675537,
-0.030812280252575874,
0.04195296764373779,
0.004009140655398369,
0.1705813854932785,
-0.07354769110679626,
0.04992767795920372,
0.04659350588917732,
-0.037093956023454666,
0.05276673287153244,
-0.09705978631973267,
0.014151694253087044,
0.008510625921189785,
-0.0136459581553936,
0.01807168684899807,
-0.021475235000252724,
0.006767760030925274,
0.08053372800350189,
0.059816546738147736,
0.03201870992779732,
0.021526606753468513,
-0.03682904690504074,
-0.13491664826869965,
0.18162168562412262,
-0.10188733041286469,
-0.2443610280752182,
-0.15931478142738342,
0.05819355323910713,
0.049542199820280075,
-0.020695745944976807,
0.019119199365377426,
-0.06112532317638397,
-0.10424990206956863,
-0.08117005974054337,
0.002776210894808173,
0.02195224165916443,
-0.0610133558511734,
-0.061887603253126144,
0.045107848942279816,
0.044492244720458984,
-0.12340037524700165,
0.03238305076956749,
0.05671203136444092,
-0.012632269412279129,
-0.004414911847561598,
0.05694727599620819,
0.08675510436296463,
0.1874821037054062,
-0.006445154082030058,
0.007426074240356684,
0.05649397894740105,
0.2790212035179138,
-0.16323049366474152,
0.11844439059495926,
0.12372992187738419,
-0.06020679324865341,
0.07730602473020554,
0.18820282816886902,
0.03437932953238487,
-0.09829609096050262,
0.025189749896526337,
0.03178888559341431,
-0.022859500721096992,
-0.26027607917785645,
-0.05554875358939171,
-0.01645888015627861,
-0.09643355756998062,
0.07367592304944992,
0.0906422883272171,
0.08419600874185562,
0.03131236881017685,
-0.06533831357955933,
-0.0881643146276474,
0.02824743278324604,
0.10229384154081345,
-0.02348904497921467,
0.005101914517581463,
0.08225834369659424,
-0.03695062920451164,
0.013857926242053509,
0.09725916385650635,
-0.009007931686937809,
0.1615152209997177,
0.05508911609649658,
0.11773016303777695,
0.08667030930519104,
0.09202395379543304,
-0.003566388040781021,
0.020574092864990234,
0.01455873902887106,
0.02242422103881836,
0.013324055820703506,
-0.08327095955610275,
0.02621372602880001,
0.11398548632860184,
0.04665733501315117,
0.02912866696715355,
0.01468511763960123,
-0.039022818207740784,
0.045901842415332794,
0.18915611505508423,
0.012414890341460705,
-0.20079661905765533,
-0.07266959547996521,
0.06361795961856842,
-0.07976381480693817,
-0.13955058157444,
-0.013478885404765606,
0.025797680020332336,
-0.16800275444984436,
0.02203844115138054,
-0.03507455438375473,
0.10170629620552063,
-0.0963946059346199,
-0.039566002786159515,
0.10248400270938873,
0.0665711835026741,
-0.020160404965281487,
0.05552557855844498,
-0.18503813445568085,
0.12085454165935516,
0.02827446348965168,
0.06710166484117508,
-0.08878343552350998,
0.10236646980047226,
0.004695627372711897,
-0.002138222334906459,
0.1606006920337677,
0.00798854324966669,
-0.051763866096735,
-0.07134003192186356,
-0.08979557454586029,
-0.010677219368517399,
0.09291231632232666,
-0.14273858070373535,
0.07039275765419006,
-0.022995779290795326,
-0.02993251569569111,
-0.005642946343868971,
-0.08615931123495102,
-0.12289456278085709,
-0.1725243479013443,
0.06079187989234924,
-0.09906207025051117,
0.02511128969490528,
-0.08947616070508957,
-0.05932797119021416,
0.006897508632391691,
0.18469759821891785,
-0.21570178866386414,
-0.10304705053567886,
-0.15054449439048767,
-0.0936024934053421,
0.1552099734544754,
-0.04413881152868271,
0.08562310039997101,
0.0017082891426980495,
0.1672871708869934,
0.017176339402794838,
-0.016635054722428322,
0.10156692564487457,
-0.08906082808971405,
-0.18433070182800293,
-0.05445864051580429,
0.1685963124036789,
0.13608239591121674,
0.03545503690838814,
-0.016973987221717834,
0.021124379709362984,
-0.05652422085404396,
-0.12180635333061218,
0.0269536841660738,
0.15689286589622498,
0.06437011808156967,
-0.014987948350608349,
-0.024878444150090218,
-0.08955308794975281,
-0.05765317752957344,
-0.04360170289874077,
-0.003433096455410123,
0.1908487230539322,
-0.07466883957386017,
0.16467387974262238,
0.11037430912256241,
-0.054548002779483795,
-0.2023840695619583,
0.042840443551540375,
0.05058063566684723,
0.01961439661681652,
0.035955674946308136,
-0.19901296496391296,
0.08479160815477371,
-0.010504565201699734,
-0.07431543618440628,
0.16766101121902466,
-0.16628403961658478,
-0.13823777437210083,
0.1015063226222992,
0.032590609043836594,
-0.21843241155147552,
-0.13565467298030853,
-0.10244499146938324,
-0.02490033023059368,
-0.14416609704494476,
0.049558479338884354,
0.0006803516880609095,
0.011386794969439507,
0.020660055801272392,
0.021814515814185143,
0.021355489268898964,
-0.04512013494968414,
0.20669199526309967,
-0.021750332787632942,
0.006546253804117441,
-0.04992818832397461,
-0.08849974721670151,
0.02558918669819832,
-0.0519903302192688,
0.10638050734996796,
-0.004647671245038509,
0.02836514823138714,
-0.17432881891727448,
-0.03721484914422035,
-0.058030031621456146,
0.026985708624124527,
-0.0952608585357666,
-0.08798448741436005,
-0.04866350069642067,
0.09186452627182007,
0.09572658687829971,
-0.02544824220240116,
-0.00004692322909249924,
-0.09164057672023773,
0.05423513054847717,
0.2070705145597458,
0.19299735128879547,
0.052031077444553375,
-0.07143436372280121,
0.016188301146030426,
-0.02803553082048893,
0.04441770166158676,
-0.23758257925510406,
0.04161182418465614,
0.058910369873046875,
0.02422342449426651,
0.08394542336463928,
-0.012012011371552944,
-0.16020891070365906,
-0.07254844158887863,
0.0852367952466011,
-0.05064064636826515,
-0.16870680451393127,
-0.0331687405705452,
0.026366785168647766,
-0.20051728188991547,
-0.039656393229961395,
0.026078378781676292,
-0.015614881180226803,
-0.03962672874331474,
0.02537040039896965,
0.07639287412166595,
-0.022939560934901237,
0.10037108510732651,
0.08623708039522171,
0.09555447101593018,
-0.10854125022888184,
0.07222291827201843,
0.0721302255988121,
-0.03215806186199188,
0.03032229095697403,
0.11419452726840973,
-0.053388405591249466,
-0.0324053093791008,
0.0738874301314354,
0.1004129946231842,
0.0194260086864233,
-0.055149152874946594,
0.005042869132012129,
-0.05898541584610939,
0.05889400094747543,
0.09808851778507233,
0.030880333855748177,
-0.006825966760516167,
0.05613933131098747,
0.03107989951968193,
-0.08853210508823395,
0.10866532474756241,
0.05046829953789711,
0.013064395636320114,
-0.04929133132100105,
-0.04452117159962654,
-0.002970898523926735,
-0.010758851654827595,
-0.01955058053135872,
-0.01199736725538969,
-0.08564981073141098,
-0.0059140753000974655,
-0.10399674624204636,
0.016365695744752884,
-0.07241548597812653,
0.008978740312159061,
0.02920009195804596,
-0.050707753747701645,
-0.0015031982911750674,
0.006290242541581392,
-0.0772068202495575,
-0.0534459687769413,
-0.014710417948663235,
0.08307627588510513,
-0.12379390001296997,
0.04395909979939461,
0.07218582183122635,
-0.10520237684249878,
0.07459963113069534,
-0.0038973672781139612,
0.011330110020935535,
0.009173562750220299,
-0.13834594190120697,
0.05256360024213791,
-0.025771914049983025,
-0.009634209796786308,
0.02815556339919567,
-0.20430852472782135,
-0.008868485689163208,
-0.0473669096827507,
-0.057277146726846695,
0.004087900277227163,
-0.022652771323919296,
-0.1210695132613182,
0.09218170493841171,
-0.005038459785282612,
-0.06111753359436989,
-0.024025723338127136,
0.0451849028468132,
0.10360851138830185,
-0.020232100039720535,
0.13148805499076843,
-0.016950950026512146,
0.06813012063503265,
-0.17686088383197784,
-0.008940344676375389,
-0.0117637375369668,
0.046239178627729416,
-0.01858733594417572,
-0.03316918760538101,
0.059893541038036346,
-0.025310030207037926,
0.18254873156547546,
-0.0161010529845953,
0.07041553407907486,
0.054922621697187424,
0.017255321145057678,
0.019025981426239014,
0.07829860597848892,
0.05666811019182205,
-0.005336637608706951,
0.004061167594045401,
0.041410814970731735,
-0.005901503376662731,
-0.03938421607017517,
-0.15817397832870483,
0.06680605560541153,
0.14928972721099854,
0.058281898498535156,
0.027325185015797615,
0.03197052329778671,
-0.11885952204465866,
-0.08157291263341904,
0.13254015147686005,
-0.020477067679166794,
-0.027409963309764862,
-0.06893298029899597,
0.17479558289051056,
0.143619567155838,
-0.20190387964248657,
0.07251779735088348,
-0.05340872332453728,
-0.05151306837797165,
-0.1334860920906067,
-0.1659441590309143,
-0.059017378836870193,
-0.06145646050572395,
-0.02472650445997715,
-0.06262028217315674,
0.05266156792640686,
0.053667254745960236,
0.005791811738163233,
-0.01900913380086422,
0.10502754151821136,
0.012417243793606758,
-0.03177746385335922,
0.04707982763648033,
0.06342339515686035,
0.0324389673769474,
-0.09790628403425217,
0.010163860395550728,
-0.001273071626201272,
0.015008065849542618,
0.06558454036712646,
0.014757347293198109,
-0.05895645171403885,
0.019310571253299713,
-0.015444929711520672,
-0.1163446307182312,
0.0407673716545105,
-0.01765078492462635,
-0.03799813240766525,
0.15219756960868835,
0.03260631859302521,
0.006804205477237701,
-0.023361939936876297,
0.22725367546081543,
-0.08163497596979141,
-0.06626982986927032,
-0.1492985486984253,
0.06571583449840546,
-0.06286054849624634,
0.030812766402959824,
0.03342539072036743,
-0.12286258488893509,
0.005743655376136303,
0.17193713784217834,
0.13066774606704712,
-0.01748792454600334,
0.009805599227547646,
0.04607410728931427,
0.005078371614217758,
-0.03783397376537323,
0.020511096343398094,
0.051410648971796036,
0.15321633219718933,
-0.06997452676296234,
0.06351571530103683,
-0.011043943464756012,
-0.0881529375910759,
-0.013664931058883667,
0.10772715508937836,
0.0014034134801477194,
0.0007117211353033781,
-0.06336770951747894,
0.13644009828567505,
-0.07988499104976654,
-0.22675208747386932,
0.06008664518594742,
-0.07122340798377991,
-0.14581744372844696,
-0.04729337617754936,
0.025740813463926315,
-0.016615169122815132,
0.00811750814318657,
0.0723295584321022,
-0.05156058445572853,
0.1941734254360199,
0.04136710986495018,
-0.058017972856760025,
-0.09357237070798874,
0.06208472698926926,
-0.16663874685764313,
0.2724353075027466,
0.015191740356385708,
0.04635656997561455,
0.1060401126742363,
-0.014362643472850323,
-0.13888666033744812,
0.010941687040030956,
0.10760833323001862,
-0.07241661101579666,
0.053875286132097244,
0.17876289784908295,
0.004598530475050211,
0.12946905195713043,
0.05905318632721901,
-0.054642051458358765,
0.034602828323841095,
-0.10552660375833511,
-0.04506244510412216,
-0.1109640896320343,
0.08033160120248795,
-0.08631961792707443,
0.15878845751285553,
0.12487447261810303,
-0.06972363591194153,
-0.005138404667377472,
-0.019111502915620804,
0.08445312827825546,
0.007957316935062408,
0.11301423609256744,
0.011437082663178444,
-0.18568097054958344,
0.03820236027240753,
0.005357298534363508,
0.09878119826316833,
-0.19602061808109283,
-0.057720545679330826,
0.044161323457956314,
-0.02059127390384674,
-0.07218626141548157,
0.12508058547973633,
0.04109282046556473,
0.03746681660413742,
-0.04023266211152077,
-0.04551305994391441,
0.0047440179623663425,
0.14461630582809448,
-0.11838681995868683,
-0.00870958436280489
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
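Since the usage section above is still a placeholder, the snippet below is only a hedged sketch of how a VITS-based MMS-TTS checkpoint such as this one is typically loaded with 🤗 Transformers. The repository id is taken from this record, but the `VitsModel` usage, the Turkish sample sentence, and the reliance on `model.config.sampling_rate` are assumptions inferred from the `vits` tag, not instructions confirmed by this card.

```python
# Hedged sketch only: VitsModel usage, sample text, and sampling-rate handling are assumptions.
import torch
from transformers import AutoTokenizer, VitsModel

repo_id = "mertllc/mms-tts-tur-inkilap_modified_50"  # id taken from this record

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = VitsModel.from_pretrained(repo_id)

text = "Merhaba, bu bir deneme cümlesidir."  # hypothetical Turkish sample sentence
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    waveform = model(**inputs).waveform  # shape: (batch, num_samples)

# MMS-TTS checkpoints usually expose their output rate on the config.
print(waveform.shape, model.config.sampling_rate)
```

The resulting waveform can then be written to disk (for example with `scipy.io.wavfile.write`) at `model.config.sampling_rate`, provided that attribute is present on this checkpoint.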
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | mertllc/mms-tts-tur-inkilap_modified_50 | [
"transformers",
"safetensors",
"vits",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:05:08+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #vits #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #vits #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
34,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #vits #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.054659612476825714,
0.21414990723133087,
-0.0031807427294552326,
0.026865221560001373,
0.1250888854265213,
0.00032571866177022457,
0.04081440716981888,
0.12862813472747803,
-0.02167222462594509,
0.11129128932952881,
0.03218022361397743,
0.09727001935243607,
0.10339263826608658,
0.16586677730083466,
0.03691011667251587,
-0.21517004072666168,
0.009132993407547474,
-0.09292528033256531,
0.018077509477734566,
0.10867427289485931,
0.13162045180797577,
-0.10489460080862045,
0.07603627443313599,
-0.03790099918842316,
-0.017673974856734276,
-0.0003223843814339489,
-0.0923151820898056,
-0.070840984582901,
0.06550594419240952,
0.06909013539552689,
0.06122942641377449,
0.009997012093663216,
0.10145736485719681,
-0.29726552963256836,
0.01687687449157238,
0.08279260247945786,
-0.004506718832999468,
0.06148726865649223,
0.0646374449133873,
-0.08339887112379074,
0.1029256209731102,
-0.08559336513280869,
0.13652671873569489,
0.08214850723743439,
-0.06937385350465775,
-0.21391066908836365,
-0.06977995485067368,
0.0987061932682991,
0.12011658400297165,
0.06274435669183731,
-0.02326560579240322,
0.1522950381040573,
-0.06972704082727432,
0.012022249400615692,
0.1361677050590515,
-0.09713108092546463,
-0.05137801170349121,
0.049987345933914185,
0.11240657418966293,
0.10166463255882263,
-0.1353231519460678,
0.007596791721880436,
0.04457303136587143,
0.023097742348909378,
0.09194746613502502,
0.020738936960697174,
0.0916183590888977,
0.04564107209444046,
-0.13860996067523956,
-0.03957565128803253,
0.10889606922864914,
0.03478158637881279,
-0.05796414613723755,
-0.21188515424728394,
-0.0026691502425819635,
-0.026535477489233017,
-0.023307178169488907,
-0.05803702771663666,
0.045833978801965714,
-0.03317271173000336,
0.067923404276371,
-0.042256616055965424,
-0.10016343742609024,
-0.03838508576154709,
0.0836847797036171,
0.06997206062078476,
0.013808192685246468,
-0.026154542341828346,
0.03861820325255394,
0.11874474585056305,
0.037009406834840775,
-0.10824361443519592,
-0.0663856491446495,
-0.06518013030290604,
-0.09711762517690659,
-0.04532422870397568,
0.04776853322982788,
0.01869308575987816,
0.030892416834831238,
0.20719914138317108,
-0.0024066849146038294,
0.040300752967596054,
0.01544452179223299,
0.00820851232856512,
0.05608583986759186,
0.09020276367664337,
-0.057233426719903946,
-0.13989022374153137,
-0.04616677761077881,
0.08976847678422928,
-0.00493787182494998,
-0.03551584109663963,
-0.04997507110238075,
0.048379965126514435,
0.05169600620865822,
0.1267518699169159,
0.08646857738494873,
-0.012898874469101429,
-0.05273304134607315,
-0.025197435170412064,
0.22986702620983124,
-0.14503952860832214,
0.04801303148269653,
-0.016220765188336372,
-0.026413746178150177,
-0.04562145099043846,
0.037146687507629395,
0.02893291600048542,
-0.0071297562681138515,
0.09902069717645645,
-0.055000074207782745,
-0.03897455707192421,
-0.10056453198194504,
-0.03981734439730644,
0.04000834375619888,
-0.0014343701768666506,
-0.011925416998565197,
-0.07901987433433533,
-0.1033727377653122,
-0.04151687026023865,
0.0622556135058403,
-0.06062569096684456,
-0.03672588989138603,
0.014433487318456173,
-0.0646335631608963,
-0.011868113651871681,
-0.0046113538555800915,
0.10713792592287064,
-0.03111988678574562,
0.041085705161094666,
-0.03385680913925171,
0.05467362701892853,
0.10134078562259674,
0.03396330401301384,
-0.0692443996667862,
0.05283360555768013,
-0.2253323644399643,
0.0846395194530487,
-0.1103181466460228,
0.040045637637376785,
-0.1649162620306015,
-0.04362662881612778,
0.01545786950737238,
0.01223697792738676,
0.010682502761483192,
0.11813149601221085,
-0.18765069544315338,
-0.02040630392730236,
0.13456352055072784,
-0.09486816823482513,
-0.10925174504518509,
0.07470420002937317,
-0.04261988773941994,
0.14796192944049835,
0.04623936489224434,
-0.017894135788083076,
0.07337126135826111,
-0.16546636819839478,
-0.06534566730260849,
-0.015944186598062515,
-0.01140376552939415,
0.13805019855499268,
0.06177884340286255,
-0.05833873897790909,
0.06357681751251221,
0.02317901886999607,
-0.022351879626512527,
-0.04479735344648361,
-0.05049646645784378,
-0.10716529190540314,
-0.006589649710804224,
-0.0877491682767868,
0.049144841730594635,
-0.008710972033441067,
-0.07987060397863388,
-0.032660458236932755,
-0.18162156641483307,
0.03565994277596474,
0.08912748098373413,
0.006954456213861704,
-0.008257697336375713,
-0.07709750533103943,
0.012575463391840458,
-0.027584582567214966,
-0.010441360995173454,
-0.16807158291339874,
-0.045059818774461746,
0.045085642486810684,
-0.1683385670185089,
0.03666726127266884,
-0.05383622646331787,
0.057435907423496246,
0.04089425876736641,
-0.0608406662940979,
-0.012410139665007591,
-0.020455263555049896,
0.02037479542195797,
-0.03554835915565491,
-0.19715940952301025,
-0.04920884966850281,
-0.033720988780260086,
0.15323609113693237,
-0.2512565553188324,
0.03701164573431015,
0.04283377155661583,
0.1445688009262085,
-0.004499740432947874,
-0.041343484073877335,
0.021006079390645027,
-0.05124713480472565,
-0.04886976629495621,
-0.064845971763134,
-0.003489583032205701,
-0.029771825298666954,
-0.04689984768629074,
0.014419492334127426,
-0.17416127026081085,
-0.03588438406586647,
0.09719391912221909,
0.1012604832649231,
-0.15479636192321777,
-0.018018238246440887,
-0.046819429844617844,
-0.06501296907663345,
-0.08719377964735031,
-0.0634685754776001,
0.12365260720252991,
0.04887883737683296,
0.044603388756513596,
-0.07642911374568939,
-0.06516730040311813,
0.02209198847413063,
0.00037755590165033937,
-0.03342745080590248,
0.07709765434265137,
0.06420876830816269,
-0.09495706856250763,
0.07597044855356216,
0.0879693329334259,
0.07397416979074478,
0.09690815210342407,
0.017737112939357758,
-0.10766889899969101,
-0.025353191420435905,
0.025884538888931274,
0.02590569481253624,
0.14766225218772888,
-0.052133310586214066,
0.03766921907663345,
0.047928281128406525,
-0.048178963363170624,
0.018924955278635025,
-0.09172655642032623,
0.02477680705487728,
0.03108147345483303,
-0.0051895990036427975,
0.04569429159164429,
-0.04261132329702377,
0.0015583503991365433,
0.07553404569625854,
0.0439009927213192,
0.054722823202610016,
0.004550157580524683,
-0.014615098014473915,
-0.09760808199644089,
0.16303586959838867,
-0.09686829894781113,
-0.2844827473163605,
-0.15191766619682312,
0.025421515107154846,
0.038875505328178406,
-0.02202117070555687,
0.031196635216474533,
-0.0685606598854065,
-0.10619828850030899,
-0.10253546386957169,
-0.0007893215515650809,
0.021664658561348915,
-0.07999464124441147,
-0.07771245390176773,
0.07423610240221024,
0.04034431278705597,
-0.14601534605026245,
0.03843066841363907,
0.05174413323402405,
-0.05686575174331665,
-0.020990731194615364,
0.08788161724805832,
0.11919383704662323,
0.15064425766468048,
-0.01956579089164734,
-0.029653063043951988,
0.02179299294948578,
0.18913501501083374,
-0.13056331872940063,
0.10870491713285446,
0.1331699639558792,
-0.0433298796415329,
0.08741360157728195,
0.17486868798732758,
0.02946310304105282,
-0.08184187114238739,
0.04125521704554558,
0.04271497204899788,
-0.0446363128721714,
-0.2628204822540283,
-0.0587831549346447,
0.013565518893301487,
-0.07289978116750717,
0.09574431926012039,
0.09441626816987991,
0.13101495802402496,
0.03733300045132637,
-0.07704862952232361,
-0.042284153401851654,
-0.0007691121427342296,
0.11566338688135147,
-0.04729871824383736,
-0.00864650122821331,
0.08112052828073502,
-0.04204992949962616,
0.0042695761658251286,
0.101866215467453,
0.024085933342576027,
0.18680992722511292,
0.02045324817299843,
0.1325864940881729,
0.06266885250806808,
0.07362587004899979,
-0.00304698059335351,
0.021530818194150925,
0.04571235924959183,
0.016793522983789444,
-0.004352389834821224,
-0.10109587758779526,
0.004940509796142578,
0.14031140506267548,
0.044244058430194855,
0.029351718723773956,
0.0023038540966808796,
-0.025745723396539688,
0.059172797948122025,
0.16894783079624176,
-0.014623390510678291,
-0.20305828750133514,
-0.07212355732917786,
0.07476779818534851,
-0.05524183437228203,
-0.12190999835729599,
-0.03604535013437271,
0.03974858298897743,
-0.17753031849861145,
0.03411399945616722,
-0.020660564303398132,
0.09808827936649323,
-0.0960298478603363,
-0.025731271132826805,
0.017328539863228798,
0.08463997393846512,
-0.017630890011787415,
0.09686511754989624,
-0.15011048316955566,
0.12523487210273743,
0.03229980170726776,
0.0898485779762268,
-0.11468798667192459,
0.08304145932197571,
-0.009098101407289505,
0.016468055546283722,
0.18883956968784332,
-0.00914006493985653,
-0.043279051780700684,
-0.0765409916639328,
-0.09724772721529007,
-0.016675574705004692,
0.12457696348428726,
-0.11865599453449249,
0.08336363732814789,
-0.006434252485632896,
-0.05090279504656792,
0.010499227792024612,
-0.11436042934656143,
-0.17895425856113434,
-0.19684189558029175,
0.061690423637628555,
-0.10233647376298904,
0.01922602578997612,
-0.1105671152472496,
-0.06737665832042694,
-0.029828263446688652,
0.2358294576406479,
-0.14021140336990356,
-0.07348582148551941,
-0.1486395299434662,
-0.049397800117731094,
0.1688835471868515,
-0.039627790451049805,
0.07352027297019958,
-0.014237076044082642,
0.21156272292137146,
-0.0005727469106204808,
-0.0019497170578688383,
0.0662601962685585,
-0.09127254039049149,
-0.17042554914951324,
-0.0796523243188858,
0.1408538520336151,
0.1185344010591507,
0.05187511071562767,
-0.00005241960025159642,
0.008437353186309338,
-0.01933823712170124,
-0.11107131093740463,
-0.005973829887807369,
0.13854430615901947,
0.06674695014953613,
0.03547331318259239,
-0.05006469413638115,
-0.10860110819339752,
-0.06920936703681946,
-0.058358483016490936,
0.05175930634140968,
0.18184207379817963,
-0.1009909063577652,
0.17350798845291138,
0.15878215432167053,
-0.07211574912071228,
-0.21567314863204956,
0.039191193878650665,
0.04846473038196564,
-0.014512532390654087,
0.04614531248807907,
-0.1829945594072342,
0.09505120664834976,
0.015141540206968784,
-0.052736036479473114,
0.12199369817972183,
-0.15728448331356049,
-0.15639621019363403,
0.06087431684136391,
0.04970995709300041,
-0.23623821139335632,
-0.1441342532634735,
-0.08822641521692276,
-0.06784138828516006,
-0.14815589785575867,
0.07915012538433075,
-0.019972164183855057,
0.011897586286067963,
0.04091079905629158,
0.013740893453359604,
0.023185279220342636,
-0.055776987224817276,
0.18284909427165985,
-0.0035617330577224493,
0.014864614233374596,
-0.06912479549646378,
-0.058035630732774734,
0.0975092425942421,
-0.05838471278548241,
0.1184525191783905,
-0.003918026573956013,
0.013672815635800362,
-0.08212041109800339,
-0.05343952775001526,
-0.046617619693279266,
0.05752236396074295,
-0.08050531893968582,
-0.11092408001422882,
-0.04487094283103943,
0.08938708156347275,
0.07764840126037598,
-0.033286161720752716,
-0.010930746793746948,
-0.07634644955396652,
0.10063119232654572,
0.19033774733543396,
0.17030654847621918,
0.018113715574145317,
-0.07677590847015381,
0.015532949939370155,
-0.03924742713570595,
0.04019718989729881,
-0.2505480647087097,
0.03877655416727066,
0.0529145747423172,
0.0354921817779541,
0.1059221550822258,
-0.02500346675515175,
-0.17749741673469543,
-0.0438142865896225,
0.06573881208896637,
-0.045354213565588,
-0.22390563786029816,
-0.009726951830089092,
0.09943331032991409,
-0.1914641559123993,
-0.015451330691576004,
0.02838914282619953,
-0.04480560123920441,
-0.02868090756237507,
0.0007889526314102113,
0.0600614957511425,
0.015805870294570923,
0.09190283715724945,
0.07423794269561768,
0.09749054163694382,
-0.08805927634239197,
0.09811163693666458,
0.10723351687192917,
-0.09035424888134003,
0.03553062304854393,
0.06695880740880966,
-0.0467107780277729,
-0.04594837874174118,
0.05199020728468895,
0.04819667339324951,
0.01212578546255827,
-0.0561964213848114,
0.010319532826542854,
-0.04872706159949303,
0.04633839800953865,
0.10621411353349686,
0.028242740780115128,
-0.03058992512524128,
0.06704547256231308,
0.03252853453159332,
-0.1153404489159584,
0.09847725927829742,
0.012868257239460945,
0.03807265684008598,
-0.06272068619728088,
-0.015808504074811935,
0.04865187034010887,
0.027409857138991356,
-0.01764598675072193,
-0.025427930057048798,
-0.035527609288692474,
-0.015147317200899124,
-0.15422900021076202,
-0.012660279870033264,
-0.07294544577598572,
0.007333413697779179,
0.006807927042245865,
-0.03955657035112381,
-0.0043836915865540504,
0.029364487156271935,
-0.07081043720245361,
-0.06899864971637726,
-0.0017123379511758685,
0.10014908015727997,
-0.16123399138450623,
0.0016520773060619831,
0.07378670573234558,
-0.10700937360525131,
0.06776659190654755,
-0.009028629399836063,
0.006400149781256914,
0.021102426573634148,
-0.1615109145641327,
0.05426544323563576,
-0.010029333643615246,
0.02013414539396763,
0.032934170216321945,
-0.16248436272144318,
0.0024488656781613827,
-0.047329291701316833,
-0.022390197962522507,
-0.004845738876610994,
-0.04656189680099487,
-0.11974798142910004,
0.07715073227882385,
-0.01184067688882351,
-0.05094744265079498,
-0.01612357795238495,
0.05293868109583855,
0.08231643587350845,
-0.03882661834359169,
0.09632368385791779,
-0.005011113826185465,
0.05959545075893402,
-0.17253276705741882,
-0.02932477742433548,
-0.0432354174554348,
0.014331330545246601,
0.01743181422352791,
-0.009555062279105186,
0.03874485567212105,
-0.00935265514999628,
0.22544825077056885,
-0.03915993124246597,
0.16461394727230072,
0.055936723947525024,
-0.0032888432033360004,
0.0007776605198159814,
0.06758615374565125,
0.05568486079573631,
0.03412187471985817,
0.00899792555719614,
0.02200561948120594,
-0.023325180634856224,
-0.006471368949860334,
-0.1553903967142105,
0.02697177603840828,
0.14716137945652008,
0.0745159387588501,
0.006664956454187632,
0.07025619596242905,
-0.1267581284046173,
-0.11370917409658432,
0.09592846781015396,
-0.02568071521818638,
0.008476621471345425,
-0.07835444062948227,
0.12778781354427338,
0.14673273265361786,
-0.14686504006385803,
0.06517019122838974,
-0.053687721490859985,
-0.05600763112306595,
-0.09034380316734314,
-0.10879118740558624,
-0.06126067787408829,
-0.04308179020881653,
0.004678911529481411,
-0.042684826999902725,
0.055097613483667374,
0.04954573139548302,
-0.014461824670433998,
0.004931987728923559,
0.12391652166843414,
-0.006120255216956139,
0.001201988779939711,
0.03766126185655594,
0.03769403696060181,
0.024755796417593956,
-0.059261444956064224,
0.030717262998223305,
0.021477915346622467,
0.034908585250377655,
0.059853747487068176,
0.037230484187603,
-0.045039307326078415,
0.028804119676351547,
0.0020213082898408175,
-0.10957802832126617,
0.023749636486172676,
-0.012328135780990124,
-0.06936221569776535,
0.12969832122325897,
0.03471869230270386,
0.009512413293123245,
-0.037131089717149734,
0.23728759586811066,
-0.062090300023555756,
-0.08014962822198868,
-0.12913139164447784,
0.09616934508085251,
-0.013530191034078598,
0.057892732322216034,
0.03356536477804184,
-0.12210189551115036,
0.0036616562865674496,
0.13605539500713348,
0.11633196473121643,
-0.0003361425769980997,
0.012180927209556103,
0.044184453785419464,
0.004239979665726423,
-0.06263455748558044,
0.044461920857429504,
0.06619330495595932,
0.12273700535297394,
-0.07938622683286667,
0.07410858571529388,
0.00435013510286808,
-0.08385829627513885,
-0.0399140790104866,
0.1140546128153801,
-0.03326992690563202,
0.03303933143615723,
-0.041518088430166245,
0.10997304320335388,
-0.059399381279945374,
-0.3032641112804413,
0.03540288656949997,
-0.10066618025302887,
-0.1533578634262085,
-0.01690032333135605,
0.06605888903141022,
-0.02134985849261284,
0.01722477562725544,
0.06963877379894257,
-0.058587364852428436,
0.1905425637960434,
0.03258530795574188,
-0.07860512286424637,
-0.059183377772569656,
0.05133861303329468,
-0.0791650041937828,
0.302468478679657,
0.00626079086214304,
0.03169599175453186,
0.10508318990468979,
-0.028644336387515068,
-0.16361252963542938,
0.02362491562962532,
0.1140698790550232,
-0.08390003442764282,
0.08627496659755707,
0.19878266751766205,
-0.019539451226592064,
0.11435621976852417,
0.05704843997955322,
-0.06186779960989952,
0.0524447038769722,
-0.03936922550201416,
-0.052163589745759964,
-0.09776037186384201,
0.06190723925828934,
-0.06178663671016693,
0.15432539582252502,
0.09593082964420319,
-0.05059736222028732,
-0.006600235588848591,
-0.05587591603398323,
0.04507772624492645,
0.018967149779200554,
0.12800532579421997,
0.012484090402722359,
-0.17696550488471985,
0.032744914293289185,
0.0010579711524769664,
0.11208613961935043,
-0.24666742980480194,
-0.08353681117296219,
0.09015431255102158,
-0.019416818395256996,
-0.05258401483297348,
0.09870866686105728,
0.0722413882613182,
0.04240792989730835,
-0.04463369399309158,
-0.10492048412561417,
-0.019366342574357986,
0.1493324637413025,
-0.14043603837490082,
-0.014699541963636875
] |
null | null | transformers |
chinese-hubert-base from https://huggingface.co/lj1995/GPT-SoVITS
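The card gives no usage snippet, so the sketch below is only a hedged illustration of pulling frame-level speech features from a chinese-hubert-base checkpoint with 🤗 Transformers. The repository id, the 16 kHz mono input, and the assumption that the repo ships a feature-extractor config are not documented here and may need adjusting.

```python
# Hedged sketch only: repo id, 16 kHz input, and feature-extractor config are assumptions.
import numpy as np
import torch
from transformers import HubertModel, Wav2Vec2FeatureExtractor

repo_id = "shibing624/parrots-chinese-hubert-base"  # id taken from this record

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained(repo_id)
model = HubertModel.from_pretrained(repo_id)

# One second of silence as dummy 16 kHz mono audio; replace with real speech.
speech = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    features = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)

print(features.shape)
```

These hidden states are the kind of acoustic features a GPT-SoVITS-style pipeline consumes; the exact integration is described in the parrots repository linked below.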
pretrained models used in https://github.com/shibing624/parrots | {"language": ["zh"], "license": "apache-2.0", "pipeline_tag": "text-to-speech"} | text-to-speech | shibing624/parrots-chinese-hubert-base | [
"transformers",
"pytorch",
"hubert",
"feature-extraction",
"text-to-speech",
"zh",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:06:01+00:00 | [] | [
"zh"
] | TAGS
#transformers #pytorch #hubert #feature-extraction #text-to-speech #zh #license-apache-2.0 #endpoints_compatible #region-us
|
chinese-hubert-base from URL
pretrained models used in URL | [] | [
"TAGS\n#transformers #pytorch #hubert #feature-extraction #text-to-speech #zh #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
47
] | [
"passage: TAGS\n#transformers #pytorch #hubert #feature-extraction #text-to-speech #zh #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
-0.08026564121246338,
-0.017752207815647125,
-0.006073329597711563,
-0.013631004840135574,
0.054151974618434906,
-0.006421172060072422,
0.06632080674171448,
0.12720826268196106,
0.044387951493263245,
-0.05425240099430084,
0.09110653400421143,
0.19298431277275085,
0.009689713828265667,
-0.017541086301207542,
-0.05041944608092308,
-0.2410622537136078,
0.09613976627588272,
0.056798774749040604,
-0.017371179535984993,
0.07385673373937607,
0.11228188127279282,
-0.032343290746212006,
0.028485432267189026,
0.04887662082910538,
-0.017015237361192703,
0.023122413083910942,
0.01729453355073929,
-0.08097999542951584,
0.09823872894048691,
0.03624771907925606,
-0.0011320383055135608,
0.07155657559633255,
-0.04329175874590874,
-0.25930747389793396,
0.025157615542411804,
-0.031083418056368828,
-0.06854991614818573,
0.002129067899659276,
0.01829136535525322,
-0.05148698762059212,
0.14546021819114685,
-0.0027520048897713423,
-0.06607566028833389,
0.09280483424663544,
-0.02691018395125866,
-0.20156654715538025,
-0.09632573276758194,
0.1525724232196808,
0.02564949169754982,
0.09316989779472351,
0.014589808881282806,
0.10502129048109055,
-0.09794529527425766,
0.04018022119998932,
0.2326459288597107,
-0.37113624811172485,
0.013494853861629963,
0.014249518513679504,
0.14504584670066833,
0.046194374561309814,
-0.039940524846315384,
0.06494445353746414,
0.016620561480522156,
-0.013095790520310402,
-0.045352041721343994,
-0.08170109987258911,
0.0015788536984473467,
0.0490957647562027,
-0.08548591285943985,
-0.04499557986855507,
0.2375544011592865,
0.010960784740746021,
0.027428435161709785,
-0.04489786550402641,
-0.04133596643805504,
0.05428728088736534,
-0.04312004894018173,
0.0710584968328476,
-0.011085371486842632,
0.10765261948108673,
0.06556407362222672,
-0.1103842705488205,
-0.14155922830104828,
-0.0028655859641730785,
-0.13958866894245148,
0.14844869077205658,
0.05242855101823807,
0.06782630831003189,
-0.14262841641902924,
0.046399910002946854,
0.049558836966753006,
-0.134429931640625,
-0.04776206612586975,
-0.08716043084859848,
0.11839801073074341,
0.09710855782032013,
-0.05996926128864288,
0.03172881156206131,
0.18504180014133453,
0.17167878150939941,
0.0002875775098800659,
0.05346653610467911,
-0.04918232560157776,
0.12187153846025467,
-0.03751309588551521,
0.09020734578371048,
0.04622778668999672,
-0.052955109626054764,
0.1039002388715744,
-0.10759611427783966,
0.06506329774856567,
-0.029772169888019562,
-0.1548275351524353,
-0.051583752036094666,
-0.01655367948114872,
0.12178537249565125,
0.05071861296892166,
0.028714830055832863,
0.0038593069184571505,
0.005740097723901272,
0.08224235475063324,
-0.0398641973733902,
-0.02893987111747265,
0.01791892573237419,
0.03766386955976486,
0.1321684867143631,
0.047677721828222275,
0.024815047159790993,
-0.08888401836156845,
0.03257079795002937,
-0.01740948297083378,
0.024745885282754898,
0.0006986108492128551,
0.02862693928182125,
0.06349257379770279,
-0.12502789497375488,
0.049701008945703506,
-0.13228069245815277,
-0.10393035411834717,
0.03306075558066368,
0.023857401683926582,
0.03908369690179825,
-0.03674088791012764,
0.010654022917151451,
-0.004689543042331934,
0.02262810431420803,
-0.0835757702589035,
-0.03569190204143524,
-0.09108549356460571,
0.0930425152182579,
-0.0743749737739563,
0.011257713660597801,
-0.19154085218906403,
0.05717051774263382,
-0.06490384042263031,
0.012768158689141273,
0.03216349333524704,
-0.016438409686088562,
-0.08556456863880157,
0.1609390527009964,
-0.03029962070286274,
-0.05641937255859375,
-0.029358260333538055,
0.009940322488546371,
-0.0634215697646141,
0.12152117490768433,
-0.08576290309429169,
-0.06522056460380554,
0.23668326437473297,
-0.10808713734149933,
-0.20913870632648468,
0.07016266882419586,
0.017108844593167305,
-0.009549295529723167,
0.06678259372711182,
0.2586437165737152,
0.08776181936264038,
-0.10723169893026352,
0.052339017391204834,
0.16331776976585388,
-0.06278815120458603,
-0.153131902217865,
0.06258109211921692,
-0.060806214809417725,
-0.01010085828602314,
0.049505677074193954,
0.0017779124900698662,
0.10514261573553085,
0.009066483937203884,
-0.0664874017238617,
-0.049333881586790085,
-0.06374891847372055,
0.008660963736474514,
-0.00672964658588171,
0.06843428313732147,
-0.05750051140785217,
-0.0017844432732090354,
-0.03858949616551399,
0.06963454186916351,
0.007429170422255993,
0.10299769788980484,
-0.08780471235513687,
0.09549584984779358,
0.08735524117946625,
0.03842424228787422,
-0.13254013657569885,
0.11435011029243469,
-0.047956522554159164,
0.04997740685939789,
0.05548819899559021,
0.0853748619556427,
0.026136960834264755,
-0.089061439037323,
-0.009695326909422874,
-0.0066501544788479805,
0.07705419510602951,
0.057412080466747284,
0.012939827516674995,
-0.1201220229268074,
0.027708355337381363,
-0.012513411231338978,
0.04473824426531792,
0.0020024292171001434,
0.0023679318837821484,
0.07723695784807205,
0.07861556112766266,
-0.07358089834451675,
0.08459828048944473,
-0.03499135375022888,
-0.026714757084846497,
-0.03630027174949646,
0.02589409053325653,
0.09030338376760483,
0.05742396041750908,
-0.12475793808698654,
0.20126087963581085,
-0.04463006928563118,
0.1819305419921875,
0.20272180438041687,
-0.17942354083061218,
0.15248934924602509,
-0.022703630849719048,
-0.0229434035718441,
-0.0024876969400793314,
0.08132945001125336,
-0.027038317173719406,
0.08449912816286087,
0.014605456963181496,
0.14650870859622955,
-0.06290451437234879,
-0.03543785214424133,
-0.01248489785939455,
-0.07321024686098099,
0.02267187088727951,
0.05439172685146332,
0.10237465798854828,
-0.18907137215137482,
0.12236834317445755,
0.2566530406475067,
-0.003336464986205101,
0.1889931559562683,
-0.12221028655767441,
0.005289413966238499,
0.051406361162662506,
0.00821703765541315,
-0.03853438422083855,
0.08228980749845505,
-0.24921424686908722,
-0.03988279029726982,
0.07750775665044785,
0.02512463554739952,
0.11053130030632019,
-0.13200786709785461,
-0.04242733120918274,
-0.014208664186298847,
-0.05526990815997124,
-0.0612538643181324,
0.07991886138916016,
-0.015309590846300125,
0.06531025469303131,
-0.05294018238782883,
-0.052894629538059235,
0.08448585122823715,
-0.01648722216486931,
-0.09260962903499603,
0.1024026870727539,
-0.1656116545200348,
-0.2071152925491333,
-0.10405334830284119,
-0.13545754551887512,
-0.07574670761823654,
-0.026912715286016464,
0.16264934837818146,
-0.03221018612384796,
-0.004421376623213291,
0.027778444811701775,
-0.021021602675318718,
-0.07210855185985565,
-0.008635523729026318,
-0.03011620230972767,
0.03098493441939354,
-0.040138449519872665,
-0.13271017372608185,
-0.06551088392734528,
-0.03651642054319382,
-0.010814626701176167,
0.056231338530778885,
-0.11886383593082428,
0.0845593586564064,
0.09244921803474426,
0.07811843603849411,
0.08339844644069672,
-0.037208881229162216,
0.10475423187017441,
-0.03574607893824577,
-0.00986185111105442,
0.27383551001548767,
-0.006034091580659151,
0.08197898417711258,
0.07958295196294785,
0.024435678496956825,
-0.022503385320305824,
-0.021314293146133423,
-0.056033965200185776,
-0.08212511986494064,
-0.22833757102489471,
-0.1283397525548935,
-0.15862244367599487,
0.006197760347276926,
-0.026291096583008766,
0.07703061401844025,
0.16110998392105103,
0.026213087141513824,
-0.05417909845709801,
-0.04445961117744446,
0.014529362320899963,
0.04650307074189186,
0.245873361825943,
-0.033499062061309814,
0.07770233601331711,
-0.11469482630491257,
-0.03768778219819069,
0.11196491867303848,
0.04909868910908699,
0.18388409912586212,
0.11271174997091293,
0.1058456227183342,
0.09532282501459122,
0.21592749655246735,
0.0764557346701622,
0.10415472835302353,
0.0019122002413496375,
0.0013399542076513171,
-0.0302265714854002,
-0.04609760642051697,
-0.03608473017811775,
0.04349637031555176,
0.08478269726037979,
-0.1502629667520523,
-0.022980161011219025,
-0.14870810508728027,
0.09186756610870361,
0.14098544418811798,
-0.014722444117069244,
0.011059746146202087,
0.043319571763277054,
0.07222539186477661,
-0.014345301315188408,
-0.04905139282345772,
0.10503681749105453,
-0.05855686217546463,
-0.11694197356700897,
0.15121017396450043,
0.00910191610455513,
0.12250985205173492,
0.03876522555947304,
0.05324457958340645,
-0.09687681496143341,
-0.15022993087768555,
0.0797412097454071,
0.13510482013225555,
-0.2515071928501129,
0.1811792403459549,
-0.024180112406611443,
-0.007142605260014534,
-0.10924816876649857,
-0.011420242488384247,
0.09500329941511154,
0.16305139660835266,
0.12151521444320679,
0.02096864953637123,
-0.0400356724858284,
0.024293988943099976,
-0.015019983984529972,
0.04219086840748787,
0.07573212683200836,
0.006041058339178562,
-0.059281233698129654,
-0.054282236844301224,
-0.02483675256371498,
-0.026409508660435677,
0.09278304129838943,
-0.033491384238004684,
-0.1531166136264801,
0.06131106615066528,
0.08275818079710007,
0.01010111067444086,
-0.06778295338153839,
0.013605492189526558,
-0.0648779645562172,
0.10140524804592133,
-0.11605115979909897,
-0.09432517737150192,
-0.0849766805768013,
-0.15227970480918884,
0.058230411261320114,
-0.06507323682308197,
0.06631141155958176,
-0.036780599504709244,
-0.019414419308304787,
-0.12292264401912689,
-0.1853807419538498,
0.06876328587532043,
-0.1117929220199585,
-0.014704877510666847,
0.006786230951547623,
0.23968742787837982,
-0.06895827502012253,
0.034114133566617966,
0.056296881288290024,
-0.03993875905871391,
-0.12225935608148575,
-0.1469525396823883,
-0.06982672214508057,
-0.003494669683277607,
-0.010689226910471916,
-0.036632239818573,
-0.058170661330223083,
-0.023961203172802925,
-0.03257015720009804,
-0.07000485062599182,
0.25440192222595215,
0.1507226526737213,
-0.057432856410741806,
0.1971820592880249,
0.1616307497024536,
-0.05862129107117653,
-0.2751815617084503,
-0.21787559986114502,
-0.14513933658599854,
-0.06314421445131302,
-0.06330873817205429,
-0.11893925815820694,
0.10505316406488419,
0.004446357488632202,
-0.08125533163547516,
0.13498666882514954,
-0.22446505725383759,
-0.09224165976047516,
0.1765730232000351,
-0.041379354894161224,
0.3433792293071747,
-0.13964129984378815,
-0.09970434755086899,
-0.08182869106531143,
-0.17039543390274048,
0.11888638883829117,
-0.13213244080543518,
0.07833942770957947,
0.001988879404962063,
-0.004650409799069166,
-0.006651333533227444,
-0.03487115725874901,
0.1381368488073349,
0.007492275908589363,
-0.025866102427244186,
-0.08093538135290146,
-0.007725649047642946,
0.0884842574596405,
-0.02098005637526512,
0.03657812625169754,
-0.2192690372467041,
0.010942697525024414,
-0.11975350975990295,
-0.021032197400927544,
-0.11476592719554901,
0.07198026776313782,
0.004502355586737394,
-0.038989435881376266,
-0.06981903314590454,
-0.042242780327796936,
0.054654549807310104,
0.03410566225647926,
0.2597770690917969,
-0.043666597455739975,
-0.021100211888551712,
0.06732356548309326,
0.07853607833385468,
-0.22768044471740723,
-0.0766446590423584,
-0.13321290910243988,
-0.05845148116350174,
0.06262459605932236,
-0.1242290511727333,
0.04108157753944397,
0.09468163549900055,
-0.008908318355679512,
0.04969261959195137,
0.05757800117135048,
-0.04846206307411194,
-0.010709559544920921,
0.10717350989580154,
-0.12453235685825348,
-0.04559247940778732,
-0.07755692303180695,
0.11200973391532898,
0.09359148144721985,
0.07425546646118164,
0.12602126598358154,
0.027470145374536514,
-0.020253023132681847,
-0.01974477246403694,
-0.010941542685031891,
-0.12371385097503662,
0.03531353920698166,
0.06858857721090317,
0.02242744155228138,
-0.1506962776184082,
0.044964369386434555,
-0.05111897736787796,
-0.05571495369076729,
-0.037485916167497635,
0.1369684934616089,
-0.13992345333099365,
-0.11532915383577347,
-0.06362226605415344,
0.05050458014011383,
-0.1284322291612625,
-0.12064368277788162,
-0.05460720881819725,
-0.16186699271202087,
0.02718403935432434,
0.1925668716430664,
0.09090525656938553,
0.04192260280251503,
-0.026093855500221252,
-0.06949948519468307,
0.06034420058131218,
-0.04406462982296944,
-0.0407007671892643,
-0.0016451460542157292,
-0.08030107617378235,
-0.07838885486125946,
0.03503579646348953,
0.14817816019058228,
-0.049042072147130966,
-0.031178638339042664,
-0.07461428642272949,
0.05549251288175583,
-0.20481456816196442,
0.013067834079265594,
-0.120116226375103,
-0.03361887484788895,
0.008780489675700665,
-0.07366769760847092,
-0.024339502677321434,
0.04292701184749603,
-0.11616436392068863,
-0.01950869895517826,
-0.02646695449948311,
0.05116003006696701,
-0.0943492129445076,
-0.022697243839502335,
0.09442976117134094,
-0.046898115426301956,
0.1191806048154831,
0.16783474385738373,
-0.08513312041759491,
0.1451592743396759,
-0.11237127333879471,
-0.13850073516368866,
0.08292138576507568,
0.07481701672077179,
0.006833728402853012,
0.043043818324804306,
0.007361477706581354,
0.10268937796354294,
-0.02194422297179699,
0.010936463251709938,
-0.06305796653032303,
-0.12983012199401855,
-0.049794312566518784,
-0.0730682834982872,
-0.06801009923219681,
-0.002114989561960101,
-0.09158603101968765,
0.12544848024845123,
0.0805128887295723,
0.11542613804340363,
-0.00554802268743515,
0.050853852182626724,
-0.029939046129584312,
0.0270913727581501,
-0.01305240299552679,
-0.15766069293022156,
-0.028323078528046608,
-0.11119436472654343,
-0.04113372042775154,
-0.0014121198328211904,
0.22717143595218658,
-0.04567673057317734,
-0.05124102532863617,
0.03471451997756958,
0.11290016025304794,
0.01733183115720749,
-0.009230410680174828,
0.35112178325653076,
0.05192628875374794,
0.015861226245760918,
-0.07745657861232758,
-0.008946530520915985,
0.0037451153621077538,
-0.01899203285574913,
0.07681941241025925,
0.12012375146150589,
0.11893614381551743,
0.08286476135253906,
0.07413427531719208,
0.057788003236055374,
-0.09911981970071793,
-0.15559066832065582,
0.05076856538653374,
0.030768580734729767,
-0.0049784304574131966,
0.12199995666742325,
0.16096985340118408,
-0.05806657299399376,
0.025126079097390175,
0.027445083484053612,
-0.009075450710952282,
-0.12307929992675781,
-0.11253830045461655,
-0.035014472901821136,
-0.11753497272729874,
0.01503756269812584,
-0.03924454376101494,
0.013670837506651878,
0.09247907251119614,
0.06020009145140648,
-0.03545643761754036,
0.026794711127877235,
-0.02843908779323101,
-0.1208965852856636,
0.06472872197628021,
-0.047474607825279236,
-0.024794232100248337,
0.007853658869862556,
-0.011169195175170898,
-0.03490825369954109,
-0.09677230566740036,
-0.020360426977276802,
0.04704097658395767,
-0.04585934802889824,
0.03715931996703148,
-0.10804367065429688,
-0.05395694822072983,
-0.03300059214234352,
0.015826471149921417,
0.055975623428821564,
0.25471535325050354,
0.04643632471561432,
-0.011525340378284454,
0.05596932768821716,
0.12413408607244492,
-0.0536452941596508,
-0.21524640917778015,
-0.00253367330878973,
0.109442338347435,
0.09989847242832184,
0.02023141458630562,
0.015983493998646736,
0.02369743026793003,
-0.021883288398385048,
0.3232622444629669,
0.2524143159389496,
-0.04001496732234955,
0.048007652163505554,
-0.006226977799087763,
0.04191695898771286,
0.056144408881664276,
0.05626441165804863,
0.12212467193603516,
0.25059372186660767,
-0.04393121227622032,
-0.029009846970438957,
-0.06565113365650177,
-0.028191765770316124,
-0.1715874969959259,
0.0674651488661766,
0.003058692207559943,
-0.14065992832183838,
0.01390386838465929,
0.0992671400308609,
-0.16638171672821045,
0.07356959581375122,
-0.010308632627129555,
-0.11179487407207489,
-0.014068159274756908,
-0.004929869901388884,
0.1822381317615509,
0.07107631117105484,
0.019221963360905647,
-0.024874048307538033,
-0.06721246242523193,
0.08127664029598236,
0.013397011905908585,
-0.22377797961235046,
-0.013615122064948082,
0.11976145952939987,
-0.07893993705511093,
0.06770916283130646,
-0.02152561955153942,
0.03838202357292175,
0.061599407345056534,
0.11480335146188736,
-0.010465205647051334,
0.12714914977550507,
0.03410403057932854,
-0.09869271516799927,
-0.042371395975351334,
-0.04936403036117554,
-0.020262673497200012,
-0.08051027357578278,
0.018766943365335464,
-0.08093328028917313,
0.055093906819820404,
0.10231208801269531,
-0.05216087028384209,
-0.022550130262970924,
0.05867413431406021,
-0.10464810580015182,
0.06582805514335632,
-0.042124856263399124,
-0.036728113889694214,
-0.06839348375797272,
-0.05138879269361496,
-0.023816769942641258,
0.040644366294145584,
-0.14322204887866974,
-0.09543617814779282,
-0.010471748188138008,
-0.0716601088643074,
0.020941071212291718,
-0.0028891211841255426,
-0.046771176159381866,
-0.04036440700292587,
-0.06320753693580627,
0.03954038396477699,
-0.13833433389663696,
0.026349259540438652,
0.07941968739032745,
-0.006418921053409576,
0.0335477739572525,
-0.08137256652116776,
0.03269212320446968,
0.009146193973720074,
-0.10344815999269485,
-0.07442187517881393
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
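In the absence of an official snippet, here is a minimal sketch of loading the checkpoint with the 🤗 `pipeline` API. The repo id comes from this card's metadata (`kabir5297/deberta_v3_ner_v1`); the example sentence and the `aggregation_strategy` setting are illustrative assumptions, not documented choices of the model author.

```python
from transformers import pipeline

# Hypothetical usage sketch: the repo id is taken from this card's metadata;
# the input text and aggregation_strategy are placeholder assumptions.
ner = pipeline(
    "token-classification",
    model="kabir5297/deberta_v3_ner_v1",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Hugging Face is based in New York City."))
# Each item contains the predicted entity group, score, and character offsets.
```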
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | token-classification | kabir5297/deberta_v3_ner_v1 | [
"transformers",
"safetensors",
"deberta-v2",
"token-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:07:01+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
52,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #deberta-v2 #token-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06466816365718842,
0.12197275459766388,
-0.004202458541840315,
0.029149644076824188,
0.12036502361297607,
0.006935111712664366,
0.06423568725585938,
0.1065516322851181,
-0.02678372710943222,
0.11724407225847244,
0.019794756546616554,
0.1101258173584938,
0.10480418801307678,
0.1741989552974701,
-0.005228097550570965,
-0.21802054345607758,
0.044925298541784286,
-0.1332864612340927,
-0.024949483573436737,
0.1159522756934166,
0.13522779941558838,
-0.12204837054014206,
0.06930451840162277,
-0.04209818318486214,
-0.007912974804639816,
-0.0322565995156765,
-0.05880237743258476,
-0.04924030974507332,
0.06812424957752228,
0.06659761071205139,
0.0681338757276535,
0.012976576574146748,
0.10400646179914474,
-0.2772457003593445,
0.020696397870779037,
0.08373123407363892,
0.005089117679744959,
0.06848078966140747,
0.05912717059254646,
-0.0766395553946495,
0.07375232130289078,
-0.07348829507827759,
0.15079154074192047,
0.07845799624919891,
-0.09164275974035263,
-0.20210139453411102,
-0.08833131939172745,
0.09044230729341507,
0.19723249971866608,
0.056620024144649506,
-0.028050832450389862,
0.12256447225809097,
-0.07247937470674515,
0.015507095493376255,
0.06527657806873322,
-0.07301480323076248,
-0.05207262933254242,
0.06885527074337006,
0.06803692132234573,
0.10003920644521713,
-0.13086174428462982,
-0.007739242631942034,
0.030164694413542747,
0.013562135398387909,
0.10788191854953766,
0.018514400348067284,
0.12156243622303009,
0.04128511995077133,
-0.14408166706562042,
-0.03994642570614815,
0.09361754357814789,
0.0397270992398262,
-0.05522630363702774,
-0.2456887662410736,
-0.01887490786612034,
-0.03032955899834633,
-0.033128105103969574,
-0.054186753928661346,
0.05287313833832741,
-0.021889202296733856,
0.08110702037811279,
-0.008305862545967102,
-0.07696685940027237,
-0.04674658551812172,
0.08184117078781128,
0.06834331154823303,
0.026918675750494003,
-0.02635417878627777,
0.008553886786103249,
0.11673150211572647,
0.10659375786781311,
-0.11640386283397675,
-0.04757297784090042,
-0.059655386954545975,
-0.08255314826965332,
-0.051134269684553146,
0.02667141705751419,
0.02667512185871601,
0.04563198238611221,
0.21625038981437683,
0.0014771533897146583,
0.04591803625226021,
0.02884340099990368,
0.01423268485814333,
0.06688534468412399,
0.09831439703702927,
-0.05904420465230942,
-0.11931392550468445,
-0.02217327058315277,
0.10633781552314758,
0.0066702705807983875,
-0.03250111639499664,
-0.05059574916958809,
0.06834481656551361,
0.024872468784451485,
0.12410221993923187,
0.06858504563570023,
0.012844047509133816,
-0.07673057168722153,
-0.06398788094520569,
0.17097899317741394,
-0.16335350275039673,
0.03551461920142174,
0.02393929846584797,
-0.053198374807834625,
-0.014516751281917095,
0.020755909383296967,
0.025373658165335655,
-0.011161229573190212,
0.0990719124674797,
-0.05368759483098984,
-0.03255419060587883,
-0.11457451432943344,
-0.053361907601356506,
0.023612400516867638,
0.02046036161482334,
-0.02989322878420353,
-0.04349198937416077,
-0.10500442981719971,
-0.07200396806001663,
0.08398986607789993,
-0.06797605007886887,
-0.04464494064450264,
-0.03520826995372772,
-0.07841871678829193,
0.014052528887987137,
0.00574632128700614,
0.11038210988044739,
-0.023612231016159058,
0.04836830496788025,
-0.05386055260896683,
0.07064847648143768,
0.13456082344055176,
0.02942483313381672,
-0.05367830768227577,
0.05159740895032883,
-0.2461555302143097,
0.10376992076635361,
-0.07264269888401031,
0.04777098074555397,
-0.16307511925697327,
-0.0198860764503479,
0.043219488114118576,
0.023323146626353264,
-0.005688810721039772,
0.1300518661737442,
-0.2077709287405014,
-0.035600125789642334,
0.17244622111320496,
-0.10709037631750107,
-0.08368578553199768,
0.05450167506933212,
-0.05821350961923599,
0.12177194654941559,
0.05322835594415665,
-0.01946827955543995,
0.025913206860423088,
-0.14479942619800568,
-0.010151791386306286,
-0.05895674601197243,
-0.02808297798037529,
0.15971976518630981,
0.05844848230481148,
-0.05183469131588936,
0.06317830085754395,
0.016611700877547264,
-0.01677791401743889,
-0.04923691228032112,
-0.032810237258672714,
-0.09724095463752747,
0.013582640327513218,
-0.07040367275476456,
0.02299460582435131,
-0.03090890683233738,
-0.08987399935722351,
-0.0339132584631443,
-0.15792976319789886,
0.02252473123371601,
0.09328845143318176,
-0.002892867662012577,
-0.021098729223012924,
-0.10148956626653671,
-0.015995139256119728,
0.02491818368434906,
0.0005505141452886164,
-0.14534035325050354,
-0.054077308624982834,
0.019842583686113358,
-0.1618579775094986,
0.03220152109861374,
-0.027691856026649475,
0.04522697255015373,
0.04093098267912865,
-0.0447295606136322,
-0.03263319656252861,
0.012876871041953564,
0.019164353609085083,
-0.0156744085252285,
-0.2759932279586792,
-0.016470206901431084,
-0.03563441336154938,
0.16242848336696625,
-0.2516782283782959,
0.04283948242664337,
0.05301886796951294,
0.13206687569618225,
0.01450103335082531,
-0.027684900909662247,
0.01769263669848442,
-0.06721992790699005,
-0.03368232399225235,
-0.06463972479104996,
-0.013003064319491386,
-0.038854893296957016,
-0.046517930924892426,
0.033844344317913055,
-0.16243807971477509,
-0.04162684455513954,
0.11188272386789322,
0.03858201205730438,
-0.15623866021633148,
-0.03822393715381622,
-0.04364866018295288,
-0.054222021251916885,
-0.07046913355588913,
-0.052408888936042786,
0.1032525822520256,
0.052487947046756744,
0.058352477848529816,
-0.06068374589085579,
-0.0675206333398819,
0.007080634590238333,
-0.023098701611161232,
-0.019511302933096886,
0.0780739039182663,
0.06551919132471085,
-0.12371252477169037,
0.09357328712940216,
0.09488903731107712,
0.08224551379680634,
0.09777180850505829,
0.004860724322497845,
-0.08961226046085358,
-0.03159978613257408,
0.026874035596847534,
0.015613501891493797,
0.15007883310317993,
-0.023757725954055786,
0.04783238098025322,
0.03627374395728111,
-0.0069470712915062904,
0.005104671698063612,
-0.09547439962625504,
0.03146127611398697,
0.027004197239875793,
-0.010457582771778107,
0.045607659965753555,
-0.05758532136678696,
0.016228504478931427,
0.10561399161815643,
0.042750515043735504,
0.04958893731236458,
0.008061937056481838,
-0.04876195266842842,
-0.11458532512187958,
0.17577087879180908,
-0.12034392356872559,
-0.23503929376602173,
-0.11996304988861084,
-0.010993306525051594,
0.034179072827100754,
-0.007477967068552971,
0.022685132920742035,
-0.07265906035900116,
-0.11820217967033386,
-0.09142062067985535,
0.04840506240725517,
0.05493762716650963,
-0.08353719115257263,
-0.06416553258895874,
0.0699615478515625,
0.04743001237511635,
-0.1389177143573761,
0.02380296215415001,
0.034620750695466995,
-0.0872102677822113,
0.0029819237533956766,
0.08519035577774048,
0.05888642370700836,
0.18008075654506683,
0.010788527317345142,
-0.02595219388604164,
0.01976708509027958,
0.2022116482257843,
-0.13552935421466827,
0.10456039011478424,
0.13558508455753326,
-0.06794831901788712,
0.08161590993404388,
0.20791222155094147,
0.041511692106723785,
-0.10927121341228485,
0.045256976038217545,
0.03615958243608475,
-0.022633234038949013,
-0.24851085245609283,
-0.07810220122337341,
0.009074265137314796,
-0.06641479581594467,
0.07369695603847504,
0.08135901391506195,
0.09815715998411179,
0.016307005658745766,
-0.10439245402812958,
-0.05691203474998474,
0.05359639227390289,
0.11178097128868103,
0.0005312497960403562,
-0.015559414401650429,
0.09605181962251663,
-0.019349027425050735,
0.022786341607570648,
0.09166137129068375,
0.003685855306684971,
0.18121956288814545,
0.05096123367547989,
0.14115169644355774,
0.08846595138311386,
0.05260030925273895,
0.010571611113846302,
0.00519154267385602,
0.01690458506345749,
0.022460386157035828,
-0.016591908410191536,
-0.08765360713005066,
-0.0025916658341884613,
0.1303529143333435,
0.023996995761990547,
0.04771316051483154,
0.005256648175418377,
-0.041017983108758926,
0.08706887066364288,
0.1720123142004013,
0.015851903706789017,
-0.19922499358654022,
-0.06961675733327866,
0.07131917029619217,
-0.08101397007703781,
-0.10567333549261093,
-0.03051498904824257,
0.03491778299212456,
-0.17742227017879486,
0.018296167254447937,
-0.022482281550765038,
0.10083698481321335,
-0.12198518961668015,
-0.014369131997227669,
0.050927821546792984,
0.07674054801464081,
-0.016325542703270912,
0.06606277078390121,
-0.18025541305541992,
0.13103583455085754,
0.016894888132810593,
0.07214778661727905,
-0.0892268717288971,
0.08784275501966476,
0.0024716476909816265,
0.0012896760599687696,
0.14359614253044128,
0.002044856082648039,
-0.05432117357850075,
-0.10964708775281906,
-0.08345004916191101,
-0.013369085267186165,
0.12828469276428223,
-0.1321774125099182,
0.0964808538556099,
-0.018887361511588097,
-0.046898361295461655,
0.005365802440792322,
-0.12184431403875351,
-0.1419176608324051,
-0.1701805591583252,
0.04523906111717224,
-0.1329699605703354,
0.04354427382349968,
-0.10502105951309204,
-0.04792432487010956,
-0.04497060924768448,
0.19979332387447357,
-0.2204522341489792,
-0.07042528688907623,
-0.15197555720806122,
-0.05513730272650719,
0.11936050653457642,
-0.045758746564388275,
0.08609742671251297,
0.014394130557775497,
0.18831388652324677,
0.01490827463567257,
-0.012921185232698917,
0.11221019923686981,
-0.10288592427968979,
-0.20936010777950287,
-0.10631230473518372,
0.13907743990421295,
0.14045150578022003,
0.03718477860093117,
0.00017439395014662296,
0.030241772532463074,
-0.008310341276228428,
-0.11130935698747635,
0.02226727083325386,
0.1818859577178955,
0.11698096990585327,
0.034827422350645065,
-0.04558476433157921,
-0.1282479166984558,
-0.08240164071321487,
-0.04210341349244118,
0.013022612780332565,
0.19483055174350739,
-0.07285630702972412,
0.1734362244606018,
0.15553328394889832,
-0.058381255716085434,
-0.19954784214496613,
0.03224672004580498,
0.04052870720624924,
0.0018155946163460612,
0.05836709216237068,
-0.20120330154895782,
0.09613421559333801,
0.003965679090470076,
-0.05700134113430977,
0.12291120737791061,
-0.1794978678226471,
-0.1494377702474594,
0.05977441370487213,
0.06504373997449875,
-0.18471670150756836,
-0.12030193209648132,
-0.0906098261475563,
-0.04907683655619621,
-0.11491015553474426,
0.08198710530996323,
-0.004478295799344778,
0.010844743810594082,
0.03223739564418793,
0.016181785613298416,
0.011378921568393707,
-0.04127343371510506,
0.18362799286842346,
-0.008517292328178883,
0.048652928322553635,
-0.08058580756187439,
-0.06042208895087242,
0.04949076101183891,
-0.06943603605031967,
0.07448729127645493,
-0.010821190662682056,
0.013506816700100899,
-0.10542158782482147,
-0.05650888383388519,
-0.02736050635576248,
0.022536376491189003,
-0.08460354059934616,
-0.10347003489732742,
-0.03973411023616791,
0.10050813108682632,
0.09168742597103119,
-0.04071033000946045,
-0.05761529505252838,
-0.08062192797660828,
0.0355151891708374,
0.19572728872299194,
0.17283788323402405,
0.05753122270107269,
-0.0621308870613575,
-0.0033672878053039312,
-0.012863550335168839,
0.0513107031583786,
-0.22106507420539856,
0.05985407903790474,
0.037558674812316895,
0.030016964301466942,
0.11966370791196823,
-0.024740351364016533,
-0.16162925958633423,
-0.055835455656051636,
0.055034637451171875,
-0.07074437290430069,
-0.16294218599796295,
0.012024333700537682,
0.0750519260764122,
-0.1531451791524887,
-0.023071492090821266,
0.047226645052433014,
-0.022888783365488052,
-0.03552026301622391,
0.008166993036866188,
0.07943081855773926,
0.011047901585698128,
0.08480099588632584,
0.05661538988351822,
0.09406866878271103,
-0.09947272390127182,
0.06839016079902649,
0.08378785848617554,
-0.09536738693714142,
0.036548908799886703,
0.06979306042194366,
-0.06812794506549835,
-0.036443546414375305,
0.04483252763748169,
0.09306086599826813,
0.03461608663201332,
-0.0496140718460083,
0.009975533001124859,
-0.0991714745759964,
0.05783834680914879,
0.11240030825138092,
0.04345378279685974,
0.005570659413933754,
0.03424717113375664,
0.04788278043270111,
-0.09421747177839279,
0.12488389015197754,
0.03327573090791702,
0.028144897893071175,
-0.04884033277630806,
-0.030886849388480186,
0.03401049226522446,
-0.025599602609872818,
-0.013930507935583591,
-0.03926628828048706,
-0.06888161599636078,
-0.013594170100986958,
-0.17240136861801147,
-0.0029535088688135147,
-0.0383978933095932,
0.006674154195934534,
0.018208200111985207,
-0.03393350914120674,
0.010948311537504196,
0.016729488968849182,
-0.0701742097735405,
-0.056464649736881256,
-0.010333591140806675,
0.10393849015235901,
-0.1704758256673813,
0.009352225810289383,
0.07282818108797073,
-0.12330053001642227,
0.0882670134305954,
0.01709691248834133,
0.009928959421813488,
0.03636078163981438,
-0.12878885865211487,
0.04661022499203682,
-0.00919650960713625,
0.014418895356357098,
0.0528080016374588,
-0.21462877094745636,
-0.0046529583632946014,
-0.05383024364709854,
-0.05298306420445442,
-0.009345833212137222,
-0.03066449612379074,
-0.11840657889842987,
0.10337110608816147,
0.00554006127640605,
-0.07858934998512268,
-0.027129173278808594,
0.03534814342856407,
0.08033894002437592,
-0.02888466790318489,
0.15499885380268097,
-0.014964357949793339,
0.06925736367702484,
-0.18353688716888428,
-0.023686321452260017,
-0.017806880176067352,
0.02434595860540867,
-0.03405971825122833,
-0.01627812534570694,
0.04919828101992607,
-0.026282284408807755,
0.19468159973621368,
-0.01404520682990551,
0.053027138113975525,
0.06716758012771606,
-0.01731599122285843,
-0.025200918316841125,
0.10833217203617096,
0.05057381093502045,
0.012596861459314823,
0.03097045235335827,
0.002724191639572382,
-0.032638128846883774,
-0.0015376374358311296,
-0.16326214373111725,
0.07423728704452515,
0.1683049499988556,
0.0821618065237999,
-0.012878294102847576,
0.059728000313043594,
-0.11478280276060104,
-0.12184109538793564,
0.10090997815132141,
-0.05217669531702995,
-0.014969355426728725,
-0.058527734130620956,
0.1378937065601349,
0.14838765561580658,
-0.1930767446756363,
0.0610617995262146,
-0.06931009143590927,
-0.04964228719472885,
-0.10728566348552704,
-0.1681494265794754,
-0.057678062468767166,
-0.05907471105456352,
-0.0213418398052454,
-0.053565166890621185,
0.0680057555437088,
0.07332909107208252,
0.014450018294155598,
0.013244451954960823,
0.07624361664056778,
-0.018915481865406036,
0.008525490760803223,
0.028719795867800713,
0.06702575832605362,
0.009052124805748463,
-0.043495066463947296,
0.012300699949264526,
-0.005083650816231966,
0.03453131020069122,
0.04542510211467743,
0.034886524081230164,
-0.02928536757826805,
0.007193939294666052,
-0.027187315747141838,
-0.11160685122013092,
0.042607881128787994,
-0.024967635050415993,
-0.06782487779855728,
0.13797178864479065,
0.028327153995633125,
-0.009714162908494473,
-0.02516608126461506,
0.2606247663497925,
-0.07522029429674149,
-0.09206689894199371,
-0.1338571459054947,
0.14171090722084045,
-0.024645673111081123,
0.06760124862194061,
0.033446263521909714,
-0.11445143073797226,
0.026036731898784637,
0.1324138194322586,
0.14793062210083008,
-0.05245824158191681,
0.017998777329921722,
0.021558376029133797,
0.0024578277952969074,
-0.040562503039836884,
0.050375621765851974,
0.07319269329309464,
0.12713441252708435,
-0.052478544414043427,
0.08453772962093353,
-0.005550839006900787,
-0.09648649394512177,
-0.02982695773243904,
0.11937443912029266,
-0.006796215195208788,
0.0190102681517601,
-0.06431698054075241,
0.12723416090011597,
-0.0369122140109539,
-0.2703137695789337,
0.06783135235309601,
-0.06748911738395691,
-0.14737756550312042,
-0.02443784847855568,
0.05718240514397621,
-0.01098309550434351,
0.0282635148614645,
0.06607423722743988,
-0.07097610086202621,
0.19599688053131104,
0.035155706107616425,
-0.04580609127879143,
-0.06696486473083496,
0.07266953587532043,
-0.10395272821187973,
0.29104846715927124,
0.008048768155276775,
0.058571070432662964,
0.09949593245983124,
-0.0245056115090847,
-0.13227766752243042,
0.028358159586787224,
0.0850125253200531,
-0.07332444190979004,
0.05352538824081421,
0.21757395565509796,
-0.013260896317660809,
0.11338580399751663,
0.07414449006319046,
-0.10197935998439789,
0.050293050706386566,
-0.10382803529500961,
-0.09831703454256058,
-0.08538446575403214,
0.09536542743444443,
-0.0577712245285511,
0.14520715177059174,
0.1225363090634346,
-0.0463496632874012,
0.021422745659947395,
-0.023490076884627342,
0.046754222363233566,
0.010136643424630165,
0.12518633902072906,
0.014053969644010067,
-0.19339033961296082,
0.027584588155150414,
0.00011195550177944824,
0.10043127089738846,
-0.20675040781497955,
-0.10115297138690948,
0.053753241896629333,
0.0020328264217823744,
-0.06096027418971062,
0.12357921153306961,
0.05461353063583374,
0.04136013612151146,
-0.04750092700123787,
-0.030941203236579895,
-0.010107891634106636,
0.1610364019870758,
-0.10999637842178345,
-0.003804087173193693
] |
null | null | null |
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
import gymnasium as gym

# load_from_hub is the helper from the Hugging Face Deep RL course that downloads and unpickles the Q-table dict.
model = load_from_hub(repo_id="shazzz/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"], is_slippery=False)
```
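A short evaluation sketch follows. It assumes the Gymnasium API (`reset()` returning `(state, info)`, `step()` returning five values) and that the pickled dict stores the table under a `qtable` key, as in the Deep RL course convention — adjust the key if your checkpoint differs.

```python
import numpy as np

# Greedy rollout with the loaded Q-table (assumes model["qtable"] and the env created above).
qtable = np.array(model["qtable"])
state, _ = env.reset()
done, total_reward = False, 0.0
while not done:
    action = int(np.argmax(qtable[state]))  # pick the highest-value action for this state
    state, reward, terminated, truncated, _ = env.step(action)
    total_reward += reward
    done = terminated or truncated
print("episode return:", total_reward)
```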
| {"tags": ["FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-FrozenLake-v1-4x4-noSlippery", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "FrozenLake-v1-4x4-no_slippery", "type": "FrozenLake-v1-4x4-no_slippery"}, "metrics": [{"type": "mean_reward", "value": "1.00 +/- 0.00", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | shazzz/q-FrozenLake-v1-4x4-noSlippery | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T11:09:49+00:00 | [] | [] | TAGS
#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing1 FrozenLake-v1
This is a trained model of a Q-Learning agent playing FrozenLake-v1 .
## Usage
| [
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
"TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
40,
39
] | [
"passage: TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
0.04578453302383423,
-0.08074592798948288,
-0.00430759321898222,
0.10720831900835037,
0.05034215748310089,
-0.040469273924827576,
0.11997015029191971,
0.018999949097633362,
0.20601962506771088,
-0.010012076236307621,
0.1455274522304535,
0.007022971753031015,
-0.006192410364747047,
0.1867983490228653,
0.04572829231619835,
-0.26324528455734253,
0.01831899583339691,
-0.09495259821414948,
-0.07281816750764847,
0.11870454251766205,
0.05470194295048714,
-0.01901467889547348,
-0.0007633853238075972,
0.056141503155231476,
-0.0673527717590332,
0.0007737681735306978,
0.031996939331293106,
-0.012976245954632759,
0.19804789125919342,
-0.02254498563706875,
0.06641989201307297,
0.054705578833818436,
0.0758768692612648,
-0.1998077929019928,
0.0358855277299881,
-0.04215473681688309,
-0.09439758956432343,
-0.03934839740395546,
-0.018780618906021118,
0.05878105387091637,
0.053356342017650604,
0.03858819976449013,
0.058354366570711136,
0.09384993463754654,
-0.0773480236530304,
0.04328357055783272,
0.04280758649110794,
0.024811049923300743,
0.04589218273758888,
-0.0237203948199749,
-0.027002155780792236,
0.08246652781963348,
-0.22182892262935638,
0.10318073630332947,
-0.010159241035580635,
-0.5270710587501526,
-0.00633762264624238,
0.24088262021541595,
0.11517096310853958,
0.05707438662648201,
-0.06903956830501556,
0.10566288232803345,
0.03913382440805435,
-0.007209456991404295,
0.03210983797907829,
0.02150118350982666,
0.12817370891571045,
0.06009242683649063,
-0.09581366181373596,
0.040699947625398636,
0.13722525537014008,
0.012822695076465607,
0.020306183025240898,
-0.08888901025056839,
0.0410032719373703,
-0.03461858257651329,
-0.007679527159780264,
-0.09758518636226654,
0.05478060990571976,
0.012466507963836193,
-0.0934976264834404,
-0.09247440844774246,
-0.04236573353409767,
-0.06708304584026337,
0.11252415925264359,
0.046419668942689896,
-0.0874939113855362,
0.03884070739150047,
-0.06760413944721222,
0.05918780341744423,
-0.16863860189914703,
0.02074250765144825,
-0.06627868115901947,
-0.09376336634159088,
-0.11799788475036621,
-0.01683047041296959,
-0.07946427166461945,
0.009092256426811218,
0.056664444506168365,
0.1447116881608963,
0.22076484560966492,
0.06690320372581482,
0.09728849679231644,
0.07456006109714508,
0.06531001627445221,
0.1538129299879074,
0.10918238013982773,
0.019075315445661545,
-0.015266558155417442,
0.0948706716299057,
-0.06445580720901489,
-0.1351388692855835,
-0.15579092502593994,
0.005488025024533272,
0.0983937531709671,
0.08871900290250778,
-0.044080477207899094,
-0.006702381651848555,
-0.024641724303364754,
0.08566431701183319,
-0.11314457654953003,
-0.024612564593553543,
-0.002267979085445404,
0.06882024556398392,
-0.024801667779684067,
0.020378148183226585,
-0.06242705136537552,
0.12715265154838562,
0.04222423583269119,
-0.059924717992544174,
-0.055308472365140915,
-0.03053177334368229,
-0.014276440255343914,
-0.027539284899830818,
0.02446848154067993,
-0.07659092545509338,
0.04767750948667526,
-0.16766095161437988,
-0.042871296405792236,
-0.04784649610519409,
0.025697942823171616,
-0.03907240927219391,
-0.13557587563991547,
-0.17699143290519714,
-0.048906855285167694,
-0.022438718006014824,
0.03549358621239662,
-0.038111843168735504,
0.006551501806825399,
-0.006318534724414349,
-0.1583600640296936,
0.09783563017845154,
0.09784027189016342,
-0.03643378987908363,
-0.02749447710812092,
0.056263517588377,
-0.07194498926401138,
0.1561182290315628,
-0.21054518222808838,
-0.054014235734939575,
-0.044764336198568344,
-0.06595750898122787,
0.19673264026641846,
0.012690845876932144,
-0.01202624011784792,
0.19873127341270447,
-0.29073721170425415,
-0.06078760325908661,
0.12533614039421082,
-0.07834373414516449,
-0.0936407670378685,
0.06941844522953033,
-0.04206686094403267,
0.023345354944467545,
0.046047765761613846,
0.36345911026000977,
-0.02069227211177349,
-0.16197136044502258,
-0.021782705560326576,
0.13971707224845886,
-0.1184760183095932,
0.059895481914281845,
0.04240793362259865,
0.12543781101703644,
-0.04250509291887283,
-0.018672896549105644,
-0.09023164212703705,
0.05999075248837471,
-0.05241934582591057,
-0.09016361832618713,
-0.03393383324146271,
-0.07645075023174286,
0.13294468820095062,
-0.0629684180021286,
0.05601520463824272,
-0.03255095332860947,
-0.07133250683546066,
-0.050324998795986176,
-0.016492370516061783,
0.04460815340280533,
0.05951254442334175,
-0.12794871628284454,
0.11029167473316193,
0.13025271892547607,
-0.0006193425506353378,
-0.07498852163553238,
-0.17872096598148346,
0.003240168560296297,
0.009576505981385708,
0.039837226271629333,
0.17141658067703247,
0.12209978699684143,
0.033295199275016785,
0.008770671673119068,
-0.06389404833316803,
-0.18276847898960114,
0.058129217475652695,
-0.056212130934000015,
-0.14230976998806,
-0.052409034222364426,
-0.0728459507226944,
0.017381802201271057,
-0.0859743058681488,
-0.017379917204380035,
0.021926190704107285,
0.006908397190272808,
0.02990424446761608,
-0.026645656675100327,
-0.049561817198991776,
0.021254703402519226,
0.06490101665258408,
-0.0037617047782987356,
0.12023693323135376,
0.008277264423668385,
-0.18308481574058533,
0.07930773496627808,
0.08478537946939468,
0.09196605533361435,
0.013250201940536499,
0.02685922384262085,
-0.021522263064980507,
-0.08061408251523972,
-0.054420311003923416,
0.02957955375313759,
0.11417073011398315,
0.1317172348499298,
0.2361993044614792,
0.08753683418035507,
0.04697408527135849,
-0.02164587564766407,
-0.016415923833847046,
0.002810494042932987,
-0.06318057328462601,
-0.029935607686638832,
0.10614971816539764,
0.05865858122706413,
-0.067733034491539,
-0.04576427489519119,
0.09590928256511688,
0.02732124738395214,
0.21205885708332062,
-0.03342745825648308,
0.01286078616976738,
-0.10957037657499313,
-0.06550975888967514,
-0.031982194632291794,
0.09201868623495102,
0.09498392790555954,
0.009755023755133152,
-0.022056059911847115,
-0.04259001836180687,
0.0012916827108711004,
-0.1334889680147171,
-0.10375088453292847,
0.026475343853235245,
0.013400445692241192,
-0.11206940561532974,
0.11674030870199203,
-0.11352457851171494,
0.039504457265138626,
0.06024791672825813,
-0.13837239146232605,
0.04428480193018913,
-0.029713207855820656,
-0.07886212319135666,
0.16866780817508698,
-0.11075661331415176,
-0.094340018928051,
-0.08831550180912018,
0.004082420375198126,
0.0075836325995624065,
-0.03922267258167267,
-0.009283260442316532,
-0.19952571392059326,
-0.005375816952437162,
-0.03544965013861656,
0.013616434298455715,
-0.06988783925771713,
-0.11287739872932434,
-0.010957922786474228,
0.07084179669618607,
-0.043388739228248596,
-0.07803605496883392,
0.007967432029545307,
-0.08923084288835526,
-0.10623309016227722,
0.028189711272716522,
0.019765101373195648,
-0.022883659228682518,
0.16152891516685486,
0.01816628873348236,
0.05626589432358742,
-0.03298520669341087,
0.30665266513824463,
-0.038163769990205765,
0.08371731638908386,
-0.02993497997522354,
-0.07433546334505081,
0.06130730360746384,
-0.022327827289700508,
0.06086638569831848,
-0.020221687853336334,
-0.02362890914082527,
0.0077952733263373375,
-0.08579335361719131,
-0.18365982174873352,
-0.05417544022202492,
0.03724347800016403,
0.195254847407341,
0.031118987128138542,
0.01910330168902874,
-0.0488768145442009,
-0.010547760874032974,
0.1665220558643341,
-0.10005921125411987,
0.04030545800924301,
-0.05366240441799164,
0.11506262421607971,
-0.08640182018280029,
0.06195629760622978,
0.020486772060394287,
0.04266135022044182,
-0.04877188801765442,
0.09486009180545807,
0.0826394334435463,
0.1121082529425621,
-0.02206910029053688,
0.046257395297288895,
0.019012698903679848,
0.07383184134960175,
0.11073657125234604,
0.0368414968252182,
-0.0729052945971489,
0.001982470043003559,
-0.006313489284366369,
-0.039427030831575394,
0.11933320760726929,
0.17963355779647827,
-0.11991413682699203,
-0.05106910318136215,
0.27167606353759766,
0.0031242913100868464,
0.19481229782104492,
-0.01315275114029646,
0.043591804802417755,
-0.04484925419092178,
0.04572054371237755,
-0.05338600277900696,
-0.04086209088563919,
0.2094656229019165,
0.08045925945043564,
-0.17165091633796692,
-0.08549032360315323,
-0.05912299454212189,
0.07081323862075806,
0.10728751868009567,
0.0013539529172703624,
-0.04156802222132683,
0.0004610282776411623,
0.0014198932331055403,
0.08339415490627289,
-0.14520122110843658,
0.11816094070672989,
-0.03172019124031067,
0.05612684786319733,
0.017555562779307365,
-0.045326150953769684,
0.04264266416430473,
0.07474290579557419,
0.26618310809135437,
0.0904107540845871,
-0.040318213403224945,
-0.0892091691493988,
-0.12260187417268753,
0.010461576282978058,
0.029102616012096405,
-0.03534553572535515,
0.0037547778338193893,
-0.020087555050849915,
0.0318896509706974,
0.008264793083071709,
0.016230624169111252,
-0.08987458795309067,
-0.03175399824976921,
-0.027736429125070572,
-0.023839212954044342,
0.10733365267515182,
-0.09495144337415695,
-0.1444292515516281,
-0.15713949501514435,
0.04191131144762039,
-0.0766405463218689,
-0.056593164801597595,
-0.054507751017808914,
-0.05239389091730118,
-0.0311186034232378,
-0.03773957118391991,
0.09099467098712921,
-0.0021037792321294546,
0.14807306230068207,
-0.1920108050107956,
-0.04220759496092796,
0.051812779158353806,
-0.07607918977737427,
-0.08729588985443115,
0.03410962224006653,
0.12136995792388916,
0.05116051807999611,
0.11504370719194412,
0.013609255664050579,
0.09567681699991226,
0.0045484392903745174,
-0.06713183224201202,
0.15302421152591705,
-0.14069625735282898,
-0.27875974774360657,
-0.03836318850517273,
0.016946332529187202,
0.1615200787782669,
-0.05613167956471443,
0.031766023486852646,
0.3335736393928528,
0.27782970666885376,
-0.1428707242012024,
0.25916144251823425,
0.019178593531250954,
0.004398873541504145,
-0.19130495190620422,
-0.10125631093978882,
0.025324683636426926,
0.04740457236766815,
0.12032642960548401,
-0.14564448595046997,
-0.010732659138739109,
-0.04543145373463631,
-0.025908485054969788,
0.10386138409376144,
-0.12300799041986465,
-0.07263197749853134,
0.07765276730060577,
0.039809420704841614,
0.1808302253484726,
0.03932500258088112,
0.0014799144119024277,
0.13626977801322937,
0.06612244248390198,
0.019124457612633705,
0.05216038227081299,
0.08028066903352737,
-0.018944554030895233,
0.14207926392555237,
0.05448179319500923,
-0.02551644667983055,
0.052681710571050644,
-0.0054580713622272015,
-0.03219012916088104,
0.015605825930833817,
-0.183198019862175,
-0.10147556662559509,
-0.0561356320977211,
-0.10798973590135574,
-0.04978342354297638,
0.056853994727134705,
-0.12395523488521576,
-0.007896827533841133,
-0.03841273859143257,
0.03718273714184761,
-0.07831971347332001,
-0.09360362589359283,
-0.036494381725788116,
0.1351792961359024,
0.07210618257522583,
0.04471297934651375,
0.035655103623867035,
-0.07390819489955902,
0.07097936421632767,
0.21671734750270844,
0.08159157633781433,
0.028919655829668045,
-0.19545674324035645,
-0.024042490869760513,
-0.0803457647562027,
0.06306298077106476,
-0.08856996893882751,
-0.016788700595498085,
0.11923003196716309,
0.08616556972265244,
0.05413002520799637,
0.09640096127986908,
-0.045083072036504745,
0.021686913445591927,
0.02684609219431877,
-0.15131035447120667,
-0.18501274287700653,
-0.08534606546163559,
-0.03519878163933754,
0.11561143398284912,
-0.06398691236972809,
0.10897188633680344,
-0.13615410029888153,
0.010051886551082134,
-0.006060056854039431,
0.02693452313542366,
-0.03596206381917,
-0.11251141875982285,
0.15348562598228455,
0.11999429017305374,
-0.06767056882381439,
0.03127254918217659,
-0.09527092427015305,
-0.04423454403877258,
0.12686803936958313,
-0.013623855076730251,
-0.0371493324637413,
-0.054547641426324844,
-0.03628576174378395,
0.15247689187526703,
-0.03436964750289917,
0.008244883269071579,
-0.041229065507650375,
-0.18217355012893677,
0.0798322781920433,
0.09045056998729706,
0.019827889278531075,
-0.031874191015958786,
-0.09797266125679016,
-0.010231015272438526,
-0.0011165260802954435,
0.11730700731277466,
-0.10696814209222794,
-0.10933240503072739,
-0.15144047141075134,
0.06713984161615372,
-0.0007159380475059152,
0.18502596020698547,
-0.06394898891448975,
-0.08904669433832169,
-0.12429379671812057,
0.02344517596065998,
-0.0027384376153349876,
-0.042264558374881744,
0.01618490368127823,
0.07992301136255264,
-0.04095321521162987,
0.02075677551329136,
-0.06651144474744797,
0.06372585147619247,
-0.11786920577287674,
0.09625071287155151,
0.01063506118953228,
0.016993753612041473,
-0.0417880080640316,
-0.01618220843374729,
0.039470795542001724,
-0.057925306260585785,
0.07921463251113892,
0.011758086271584034,
0.0010938759660348296,
0.10196787863969803,
-0.0034960443153977394,
0.06409632414579391,
-0.05372481048107147,
-0.023290161043405533,
0.06578411161899567,
-0.05874887853860855,
-0.03370826691389084,
-0.1573946475982666,
-0.0709633082151413,
0.020051732659339905,
-0.04775108024477959,
0.002077929675579071,
0.03673801198601723,
0.062159497290849686,
-0.06937079131603241,
-0.12125655263662338,
-0.043812792748212814,
-0.028638383373618126,
0.021301284432411194,
0.10829301923513412,
-0.07526551932096481,
0.1547859013080597,
-0.052787959575653076,
-0.00020603960729204118,
0.07437096536159515,
0.04048224538564682,
0.01393822580575943,
-0.10422444343566895,
-0.04698587954044342,
-0.11035211384296417,
0.1502903699874878,
-0.007902312092483044,
-0.03533121198415756,
0.03719403222203255,
-0.11946307867765427,
-0.1572723090648651,
0.03418220207095146,
0.10199101269245148,
0.0448341928422451,
0.025807438418269157,
0.027079269289970398,
-0.04042419046163559,
-0.021270349621772766,
-0.07034418731927872,
0.0882953479886055,
-0.12085357308387756,
-0.09669415652751923,
0.09555385261774063,
0.12178351730108261,
-0.0036850625183433294,
-0.07441367954015732,
0.11554073542356491,
-0.021787192672491074,
0.05525410920381546,
-0.02971339225769043,
0.10308072715997696,
0.0796005055308342,
-0.12273547053337097,
0.005693064536899328,
-0.036891788244247437,
-0.0741485133767128,
-0.12975730001926422,
0.019545545801520348,
-0.061916105449199677,
-0.13383042812347412,
0.12179028987884521,
-0.09376577287912369,
0.030037038028240204,
-0.10506992787122726,
0.021338803693652153,
0.01864001713693142,
0.061665527522563934,
-0.10988292098045349,
0.08575301617383957,
0.13424484431743622,
-0.043199893087148666,
-0.07184189558029175,
-0.12455986440181732,
-0.05022053420543671,
-0.04231856390833855,
-0.13957437872886658,
-0.11600435525178909,
0.0100301094353199,
-0.023418782278895378,
-0.05818291753530502,
0.0015462689334526658,
-0.03659068048000336,
0.008594646118581295,
0.021907730028033257,
0.04032021388411522,
-0.02693161368370056,
0.05134565755724907,
-0.057569269090890884,
-0.052510857582092285,
0.11489357799291611,
0.04113486409187317,
-0.03561042994260788,
-0.052359987050294876,
0.12997733056545258,
-0.11959461867809296,
0.07662346214056015,
-0.020313527435064316,
0.017129231244325638,
-0.06435854732990265,
0.17131924629211426,
0.11673715710639954,
-0.1367570012807846,
-0.005008010193705559,
-0.08210669457912445,
0.020409544929862022,
0.023555370047688484,
0.13693512976169586,
-0.03411718085408211,
-0.0012358218664303422,
-0.1580323874950409,
0.018575575202703476,
-0.18557456135749817,
-0.03716109320521355,
0.04671547934412956,
0.09917585551738739,
0.15293832123279572,
-0.0034432117827236652,
-0.1263325810432434,
0.10424192249774933,
-0.2118520885705948,
0.0907607227563858,
0.05121984705328941,
-0.11874113976955414,
-0.06765396893024445,
-0.06795281916856766,
0.1198519766330719,
0.009196433238685131,
0.2040700763463974,
-0.013615905307233334,
-0.09132910519838333,
-0.07060808688402176,
-0.01980910450220108,
-0.030524181202054024,
0.09714830666780472,
0.041414931416511536,
0.04653804749250412,
0.12821412086486816,
0.00368314771912992,
0.07533777505159378,
0.060310911387205124,
0.02759413793683052,
-0.012300663627684116,
0.04076618701219559,
0.08261215686798096,
-0.14588621258735657,
-0.1659701019525528,
0.1326720416545868,
0.025149408727884293,
0.11792458593845367,
0.03658788278698921,
-0.1549617499113083,
0.06687124073505402,
0.2523096203804016,
-0.11147607117891312,
0.02505038119852543,
0.12737524509429932,
-0.0366884209215641,
0.0672016367316246,
0.1144871786236763,
-0.02633814327418804,
-0.05217865854501724,
-0.011363590136170387,
0.10233135521411896,
0.028660254552960396,
-0.04646271467208862,
-0.02340836264193058,
-0.03373933956027031,
-0.019070526584982872,
-0.011738128960132599,
-0.0909019410610199,
-0.1543993502855301,
-0.10471053421497345,
-0.16619662940502167,
0.04399140924215317,
-0.04626438021659851,
0.13418889045715332,
0.09469578415155411,
-0.012723101302981377,
0.04568437114357948,
0.028575526550412178,
0.07275456190109253,
0.07916246354579926,
-0.02939477376639843,
-0.036159269511699677
] |
null | null | transformers |
## Exllama v2 Quantizations of Pasta-Sea-7b-128k
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.13">turboderp's ExLlamaV2 v0.0.13</a> for quantization.
<b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b>
Each branch contains an individual bits-per-weight quantization, with the main branch containing only the measurement.json for further conversions.
Original model: https://huggingface.co/Test157t/Pasta-Sea-7b-128k
| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
| ----- | ---- | ------- | ------ | ------ | ------ | ------------ |
| [8_0](https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2/tree/8_0) | 8.0 | 8.0 | 8.4 GB | 9.8 GB | 11.8 GB | Maximum quality that ExLlamaV2 can produce, near unquantized performance. |
| [6_5](https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2/tree/6_5) | 6.5 | 8.0 | 7.2 GB | 8.6 GB | 10.6 GB | Very similar to 8.0, good tradeoff of size vs performance, **recommended**. |
| [5_0](https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2/tree/5_0) | 5.0 | 6.0 | 6.0 GB | 7.4 GB | 9.4 GB | Slightly lower quality vs 6.5, but usable on 8GB cards. |
| [4_25](https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2/tree/4_25) | 4.25 | 6.0 | 5.3 GB | 6.7 GB | 8.7 GB | GPTQ equivalent bits per weight, slightly higher quality. |
| [3_5](https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2/tree/3_5) | 3.5 | 6.0 | 4.7 GB | 6.1 GB | 8.1 GB | Lower quality, only use if you have to. |
## Download instructions
With git:
```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Pasta-Sea-7b-128k-exl2 Pasta-Sea-7b-128k-exl2-6_5
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` (only useful if you only care about measurement.json) branch to a folder called `Pasta-Sea-7b-128k-exl2`:
```shell
mkdir Pasta-Sea-7b-128k-exl2
huggingface-cli download bartowski/Pasta-Sea-7b-128k-exl2 --local-dir Pasta-Sea-7b-128k-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
Linux:
```shell
mkdir Pasta-Sea-7b-128k-exl2-6_5
huggingface-cli download bartowski/Pasta-Sea-7b-128k-exl2 --revision 6_5 --local-dir Pasta-Sea-7b-128k-exl2-6_5 --local-dir-use-symlinks False
```
Windows (which apparently doesn't like _ in folders sometimes?):
```shell
mkdir Pasta-Sea-7b-128k-exl2-6.5
huggingface-cli download bartowski/Pasta-Sea-7b-128k-exl2 --revision 6_5 --local-dir Pasta-Sea-7b-128k-exl2-6.5 --local-dir-use-symlinks False
```
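If you prefer doing this from Python rather than the CLI, a minimal sketch using `huggingface_hub.snapshot_download` is below; the folder name is an arbitrary choice, and `local_dir_use_symlinks=False` mirrors the CLI flag above.
```python
from huggingface_hub import snapshot_download

# Download the 6_5 branch into a local folder (pick any branch from the table above as `revision`).
snapshot_download(
    repo_id="bartowski/Pasta-Sea-7b-128k-exl2",
    revision="6_5",
    local_dir="Pasta-Sea-7b-128k-exl2-6_5",
    local_dir_use_symlinks=False,
)
```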
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski | {"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["Test157t/Kunocchini-7b-128k-test", "Test157t/Pasta-Lake-7b"], "quantized_by": "bartowski", "pipeline_tag": "text-generation"} | text-generation | bartowski/Pasta-Sea-7b-128k-exl2 | [
"transformers",
"mergekit",
"merge",
"text-generation",
"base_model:Test157t/Kunocchini-7b-128k-test",
"base_model:Test157t/Pasta-Lake-7b",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:15:12+00:00 | [] | [] | TAGS
#transformers #mergekit #merge #text-generation #base_model-Test157t/Kunocchini-7b-128k-test #base_model-Test157t/Pasta-Lake-7b #endpoints_compatible #region-us
| Exllama v2 Quantizations of Pasta-Sea-7b-128k
---------------------------------------------
Using <a href="URL ExLlamaV2 v0.0.13 for quantization.
**The "main" branch only contains the URL, download one of the other branches for the model (see below)**
Each branch contains an individual bits per weight, with the main one containing only the URL for further conversions.
Original model: URL
Download instructions
---------------------
With git:
With huggingface hub (credit to TheBloke for instructions):
To download the 'main' (only useful if you only care about URL) branch to a folder called 'Pasta-Sea-7b-128k-exl2':
To download from a different branch, add the '--revision' parameter:
Linux:
Windows (which apparently doesn't like \_ in folders sometimes?):
Want to support my work? Visit my ko-fi page here: URL
| [] | [
"TAGS\n#transformers #mergekit #merge #text-generation #base_model-Test157t/Kunocchini-7b-128k-test #base_model-Test157t/Pasta-Lake-7b #endpoints_compatible #region-us \n"
] | [
65
] | [
"passage: TAGS\n#transformers #mergekit #merge #text-generation #base_model-Test157t/Kunocchini-7b-128k-test #base_model-Test157t/Pasta-Lake-7b #endpoints_compatible #region-us \n"
] | [
    (768-dimensional embedding vector omitted)
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mixtral-rap-finetune_0.6_test_by_part_themes_replace
This model is a fine-tuned version of [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5372
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1
- num_epochs: 1.0
- mixed_precision_training: Native AMP
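For orientation, the settings above correspond roughly to the following Transformers `TrainingArguments`; this is only a sketch, and the output directory, the fp16 flag, and the trailing optimizer note are assumptions rather than values taken from this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mixtral-rap-finetune",   # assumed name, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,       # 8 x 2 = total train batch size of 16
    lr_scheduler_type="linear",
    warmup_steps=1,
    num_train_epochs=1.0,
    fp16=True,                           # "Native AMP" mixed precision
    # The Trainer's default optimizer corresponds to the Adam settings listed above
    # (betas=(0.9, 0.999), epsilon=1e-08).
)
```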
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.6279 | 0.14 | 1000 | 1.6172 |
| 1.5675 | 0.27 | 2000 | 1.5896 |
| 1.5138 | 0.41 | 3000 | 1.5719 |
| 1.5578 | 0.55 | 4000 | 1.5593 |
| 1.5244 | 0.68 | 5000 | 1.5492 |
| 1.5893 | 0.82 | 6000 | 1.5417 |
| 1.5488 | 0.96 | 7000 | 1.5372 |
### Framework versions
- PEFT 0.8.2
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1 | {"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "mistralai/Mixtral-8x7B-Instruct-v0.1", "model-index": [{"name": "mixtral-rap-finetune_0.6_test_by_part_themes_replace", "results": []}]} | null | JulesGo/mixtral-rap-finetune_0.6_test_by_part_themes_replace | [
"peft",
"safetensors",
"generated_from_trainer",
"base_model:mistralai/Mixtral-8x7B-Instruct-v0.1",
"license:apache-2.0",
"region:us"
] | 2024-02-12T11:15:34+00:00 | [] | [] | TAGS
#peft #safetensors #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-apache-2.0 #region-us
| mixtral-rap-finetune\_0.6\_test\_by\_part\_themes\_replace
==========================================================
This model is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 1.5372
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1
* num\_epochs: 1.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.8.2
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu121
* Datasets 2.16.1
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
"TAGS\n#peft #safetensors #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-apache-2.0 #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
51,
159,
4,
44
] | [
"passage: TAGS\n#peft #safetensors #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-apache-2.0 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
    (768-dimensional embedding vector omitted)
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_billsum_model
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5803
- Rouge1: 0.1414
- Rouge2: 0.0501
- Rougel: 0.1176
- Rougelsum: 0.1176
- Gen Len: 19.0
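As a rough illustration of how ROUGE figures like these are produced, the snippet below runs the model through the summarization pipeline and scores one prediction with the `evaluate` library; the sample document and reference are made-up placeholders, not part of the evaluation data:

```python
from transformers import pipeline
import evaluate

summarizer = pipeline("summarization", model="trtd56/practical_nlp_course_6")
rouge = evaluate.load("rouge")

document = "The bill amends the Internal Revenue Code to ..."   # hypothetical input
reference = "Amends the Internal Revenue Code ..."              # hypothetical reference summary

prediction = summarizer(document)[0]["summary_text"]
scores = rouge.compute(predictions=[prediction], references=[reference])
print(scores)   # dict with rouge1, rouge2, rougeL, rougeLsum
```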
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 200
- mixed_precision_training: Native AMP
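These options map onto a `Seq2SeqTrainingArguments` object in roughly the following way; the output directory and the `predict_with_generate` flag are assumptions, not values stated in this card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="my_billsum_model",   # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=200,                   # "training_steps: 200"
    fp16=True,                       # "Native AMP" mixed precision
    predict_with_generate=True,      # assumed, needed to compute ROUGE during evaluation
)
```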
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 3.9137 | 0.65 | 40 | 3.0404 | 0.1351 | 0.044 | 0.1138 | 0.114 | 19.0 |
| 3.0852 | 1.29 | 80 | 2.7349 | 0.1363 | 0.0453 | 0.1143 | 0.1144 | 19.0 |
| 2.9298 | 1.94 | 120 | 2.6341 | 0.1405 | 0.0471 | 0.1162 | 0.1164 | 19.0 |
| 2.8389 | 2.58 | 160 | 2.5929 | 0.1413 | 0.049 | 0.1176 | 0.118 | 19.0 |
| 2.8414 | 3.23 | 200 | 2.5803 | 0.1414 | 0.0501 | 0.1176 | 0.1176 | 19.0 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "t5-small", "model-index": [{"name": "my_billsum_model", "results": []}]} | text2text-generation | trtd56/practical_nlp_course_6 | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:18:21+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| my\_billsum\_model
==================
This model is a fine-tuned version of t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 2.5803
* Rouge1: 0.1414
* Rouge2: 0.0501
* Rougel: 0.1176
* Rougelsum: 0.1176
* Gen Len: 19.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* training\_steps: 200
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
77,
112,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
    (768-dimensional embedding vector omitted)
] |
null | null | null |
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym

# `load_from_hub` is a helper (not a standard import) that downloads and unpickles the saved dict from the Hub
model = load_from_hub(repo_id="shazzz/taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
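Continuing from the snippet above, the sketch below shows one way to roll the loaded agent out greedily. It assumes the pickled dictionary stores the learned table under a `"qtable"` key and that `env` follows the newer `reset()`/`step()` API (observation plus info, five return values); both are assumptions to check against the actual file, and the unpacking should be adapted for older `gym` versions.

```python
import numpy as np

qtable = model["qtable"]  # assumed key name; inspect model.keys() if it differs
state, info = env.reset()
done, total_reward = False, 0
while not done:
    action = int(np.argmax(qtable[state]))  # greedy action from the Q-table
    state, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated
    total_reward += reward
print("Episode reward:", total_reward)
```

Averaging this over many evaluation episodes is how the reported mean reward of 7.56 +/- 2.71 would typically be reproduced.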
| {"tags": ["Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "taxi-v3", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Taxi-v3", "type": "Taxi-v3"}, "metrics": [{"type": "mean_reward", "value": "7.56 +/- 2.71", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | shazzz/taxi-v3 | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T11:21:58+00:00 | [] | [] | TAGS
#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing Taxi-v3
This is a trained model of a Q-Learning agent playing Taxi-v3.
## Usage
| [
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
"TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
32,
33
] | [
"passage: TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
0.048862796276807785,
-0.16549694538116455,
-0.005485367961227894,
0.02960980497300625,
0.1345081776380539,
-0.01784728653728962,
0.11895976960659027,
0.07759871333837509,
-0.07461097836494446,
-0.055395450443029404,
0.1418241262435913,
0.09088201075792313,
0.055222880095243454,
0.05699880048632622,
0.09511256217956543,
-0.27440664172172546,
0.048217080533504486,
-0.02918700873851776,
0.05621987581253052,
0.11878681182861328,
0.0670095682144165,
-0.040441032499074936,
0.061956584453582764,
0.11818158626556396,
-0.1018151044845581,
-0.007344264071434736,
0.035402704030275345,
-0.09440053254365921,
0.17413531243801117,
0.07204403728246689,
0.12337774783372879,
0.05132639780640602,
0.179361954331398,
-0.12762396037578583,
0.024310702458024025,
-0.0010275895474478602,
-0.10138072073459625,
-0.03909514099359512,
-0.012415820732712746,
-0.08349097520112991,
0.03230205550789833,
0.23522862792015076,
0.07199250161647797,
0.06632792949676514,
-0.17707863450050354,
-0.06584878265857697,
-0.04375573247671127,
0.069611094892025,
0.14951466023921967,
0.03758616745471954,
-0.033800311386585236,
0.1684885323047638,
-0.2564343810081482,
0.05066783353686333,
0.037275806069374084,
-0.42313119769096375,
0.017119819298386574,
0.1507398933172226,
0.15090937912464142,
0.06909667700529099,
-0.10573802888393402,
0.013512322679162025,
0.051325585693120956,
-0.0005318621988408267,
0.024325110018253326,
0.006554204970598221,
0.15601307153701782,
0.08537693321704865,
-0.1487821787595749,
-0.058576688170433044,
0.17441977560520172,
-0.03788546845316887,
-0.02613203600049019,
-0.039745692163705826,
0.0067160045728087425,
-0.06427708268165588,
-0.004067842848598957,
-0.1777995079755783,
0.00734262028709054,
0.06666424125432968,
-0.014348524622619152,
0.014901017770171165,
-0.035522811114788055,
-0.0966939702630043,
-0.023098144680261612,
-0.08592145889997482,
0.01677769608795643,
-0.006319406442344189,
-0.10187895596027374,
0.05002119392156601,
-0.061138734221458435,
0.0014382408699020743,
-0.05123179033398628,
-0.15047866106033325,
-0.049055423587560654,
-0.03481535613536835,
0.1474713832139969,
-0.0044205985032022,
-0.01873963139951229,
-0.03164304047822952,
0.15474793314933777,
0.049551334232091904,
-0.05370146036148071,
0.05625450983643532,
0.07605006545782089,
0.23867930471897125,
0.10401605814695358,
0.10196955502033234,
-0.06798075139522552,
0.10180158913135529,
-0.12330973148345947,
-0.08915644884109497,
-0.17508824169635773,
0.11820860952138901,
0.00015364694991149008,
0.1317785084247589,
-0.12023144960403442,
0.07898581773042679,
-0.067511186003685,
0.013453764840960503,
0.01636839471757412,
0.0820009782910347,
-0.012399360537528992,
0.10676060616970062,
-0.005061192903667688,
-0.06941985338926315,
0.014177112840116024,
0.05935845896601677,
0.03754841163754463,
-0.038601722568273544,
-0.03192409873008728,
-0.05762290954589844,
-0.05065649375319481,
-0.10128600150346756,
-0.06447898596525192,
0.018573462963104248,
-0.007677143905311823,
-0.1833900660276413,
-0.06407523155212402,
0.00897200871258974,
0.015712225809693336,
-0.03988850116729736,
-0.05148044601082802,
-0.15265507996082306,
-0.042461175471544266,
-0.015450406819581985,
-0.03500641882419586,
-0.06214277446269989,
-0.0383245050907135,
0.046435944736003876,
-0.07560601085424423,
0.013364278711378574,
0.023342855274677277,
0.05405820533633232,
-0.025881100445985794,
0.06068144738674164,
-0.08357544988393784,
0.09493788331747055,
-0.1540430635213852,
-0.03271956741809845,
-0.025445878505706787,
-0.041183918714523315,
0.1752462536096573,
0.06099751964211464,
-0.015994304791092873,
0.15260063111782074,
-0.17141541838645935,
-0.058121129870414734,
0.15596486628055573,
0.008629098534584045,
-0.09967197477817535,
-0.003560945624485612,
-0.09397093951702118,
0.1428760588169098,
0.08571921288967133,
0.2478504776954651,
0.12005335837602615,
-0.22748184204101562,
0.055358242243528366,
0.12515293061733246,
-0.14365963637828827,
0.10365243256092072,
0.07344598323106766,
0.005470725707709789,
-0.18886831402778625,
-0.06843198090791702,
-0.06121627986431122,
0.1053021252155304,
-0.08522345870733261,
-0.0776243582367897,
0.09323626756668091,
-0.05086790770292282,
0.24641476571559906,
-0.028281206265091896,
0.06174173951148987,
-0.026681531220674515,
-0.1389324963092804,
-0.01723906397819519,
0.060955192893743515,
0.05258452147245407,
-0.024835573509335518,
-0.25895482301712036,
0.13646544516086578,
0.048650871962308884,
0.025074828416109085,
0.004106190986931324,
-0.05691491439938545,
0.016934165731072426,
0.1511998474597931,
0.020012924447655678,
0.13717477023601532,
0.027723990380764008,
0.0706823319196701,
-0.006239562761038542,
-0.10560829937458038,
-0.04169593006372452,
0.061916545033454895,
-0.08518962562084198,
-0.06641357392072678,
0.011197872459888458,
-0.06935211271047592,
-0.11783787608146667,
-0.12166737765073776,
-0.026334572583436966,
-0.02980303019285202,
-0.07444227486848831,
0.02368103712797165,
0.06536602973937988,
-0.06702698022127151,
-0.0023908785078674555,
0.007125476840883493,
-0.011537045240402222,
0.16434046626091003,
0.011393417604267597,
-0.007796820718795061,
0.1328643560409546,
-0.11533161997795105,
0.12461213022470474,
0.049438029527664185,
-0.024806302040815353,
-0.04662557691335678,
0.0014137453399598598,
-0.057529181241989136,
0.029044216498732567,
-0.04390640929341316,
0.02774495631456375,
0.20111067593097687,
0.02772962674498558,
0.11389166116714478,
-0.0656520202755928,
0.04385066404938698,
-0.007961965166032314,
-0.009693224914371967,
0.018563594669103622,
0.07608018070459366,
0.07813210040330887,
-0.1324140727519989,
0.02262016013264656,
0.22455167770385742,
0.1385764330625534,
0.18313980102539062,
-0.010877152904868126,
0.06325667351484299,
-0.04875868931412697,
0.027505528181791306,
0.024100203067064285,
0.10314226150512695,
-0.10732068121433258,
-0.0322517491877079,
-0.025407759472727776,
0.023599207401275635,
-0.08197105675935745,
-0.1055799350142479,
-0.090115025639534,
0.01222382951527834,
-0.03125503659248352,
-0.15570329129695892,
0.13300658762454987,
-0.10451057553291321,
0.01802753657102585,
0.04692702740430832,
-0.22163605690002441,
0.11530312895774841,
0.014291439205408096,
-0.10303618758916855,
0.11281087249517441,
-0.12051989883184433,
-0.08699832111597061,
-0.05777236074209213,
-0.18658851087093353,
0.05280197039246559,
0.04673841595649719,
0.05166793242096901,
-0.18521739542484283,
0.024835903197526932,
0.05545609071850777,
0.13426995277404785,
-0.09743253141641617,
-0.07142634689807892,
-0.15038461983203888,
0.016068490222096443,
-0.033661190420389175,
-0.16029728949069977,
-0.005609163548797369,
-0.032781440764665604,
-0.18849676847457886,
-0.04539939761161804,
-0.15086813271045685,
-0.034627582877874374,
0.20464378595352173,
0.026907702907919884,
0.09480511397123337,
-0.07926445454359055,
0.3802889585494995,
-0.042039383202791214,
-0.06146497279405594,
-0.01321389526128769,
-0.07072482258081436,
0.02512686513364315,
0.13271741569042206,
0.0036099457647651434,
-0.017886579036712646,
-0.0037857077550143003,
0.0024592927657067776,
-0.06234965845942497,
-0.13400450348854065,
0.0028710351325571537,
0.03905198723077774,
0.1874423623085022,
0.004639793653041124,
0.06659388542175293,
0.03133883699774742,
0.057546284049749374,
0.07748064398765564,
0.030926106497645378,
0.0011591583024710417,
-0.01591806672513485,
0.06604493409395218,
-0.11684755235910416,
0.042466625571250916,
-0.030429253354668617,
-0.10143838077783585,
-0.013183288276195526,
0.07950251549482346,
0.12755028903484344,
0.17849206924438477,
-0.04790908098220825,
0.17489230632781982,
0.13580141961574554,
0.16576050221920013,
0.049315933138132095,
-0.020801831036806107,
-0.08773037046194077,
-0.06118565797805786,
0.004774159751832485,
-0.031952597200870514,
0.04869702458381653,
0.3231290578842163,
0.037619613111019135,
-0.09036035090684891,
0.11149907857179642,
0.009480619803071022,
0.05359881371259689,
0.022797370329499245,
-0.11162138730287552,
0.11170321702957153,
0.07968773692846298,
-0.06341761350631714,
-0.07602835446596146,
0.16758501529693604,
-0.1109386757016182,
-0.26646625995635986,
-0.11410990357398987,
-0.012305386364459991,
0.07903840392827988,
0.005651174578815699,
0.05498376116156578,
-0.11829282343387604,
-0.16034497320652008,
-0.034191906452178955,
0.1335442066192627,
-0.3077351450920105,
0.2065143585205078,
-0.0198091771453619,
0.06707923114299774,
-0.039657969027757645,
-0.07026876509189606,
0.09694647043943405,
0.13174086809158325,
0.29124146699905396,
0.01396956667304039,
0.04841272905468941,
-0.15176129341125488,
-0.0976925864815712,
0.0018439020495861769,
0.015482662245631218,
-0.02563396655023098,
0.028520405292510986,
-0.0540912002325058,
0.008404579944908619,
-0.018086453899741173,
0.2102297693490982,
-0.11316607892513275,
0.004344627261161804,
-0.06968966871500015,
-0.11707738786935806,
0.19409789144992828,
-0.07178345322608948,
-0.04543264955282211,
-0.14959357678890228,
-0.15512511134147644,
-0.004174166824668646,
-0.02413962036371231,
-0.019664527848362923,
-0.17603960633277893,
-0.18804074823856354,
-0.05204557999968529,
-0.005645004566758871,
-0.003464865731075406,
0.05867868289351463,
-0.07517234236001968,
-0.04805335775017738,
0.1009904220700264,
-0.07743175327777863,
-0.056063808500766754,
-0.1103200614452362,
0.1391381323337555,
0.06248528137803078,
0.16743235290050507,
0.05907081440091133,
0.0006117874872870743,
0.11471151560544968,
-0.02913086675107479,
0.11103474348783493,
-0.11291708797216415,
-0.17145049571990967,
-0.08334989100694656,
-0.018775060772895813,
0.09519003331661224,
-0.04789286106824875,
0.0028788831550627947,
0.2550160884857178,
0.14880181849002838,
-0.0897710770368576,
0.27680760622024536,
0.04414956644177437,
-0.09375058114528656,
-0.18432219326496124,
-0.15961645543575287,
0.03759992495179176,
0.060025621205568314,
0.13095876574516296,
-0.057205069810152054,
-0.08483537286520004,
-0.08492398262023926,
-0.07478608191013336,
-0.13140805065631866,
-0.24232175946235657,
-0.030598774552345276,
0.22874866425991058,
0.08656918257474899,
0.08219650387763977,
-0.012482990510761738,
-0.01186054851859808,
0.00526038184762001,
0.02680150233209133,
0.12018456310033798,
-0.13341329991817474,
0.11107480525970459,
0.022198403254151344,
0.044267985969781876,
0.009712530300021172,
0.07929777354001999,
0.03375575691461563,
-0.003218587953597307,
-0.0006439819699153304,
-0.0988350659608841,
-0.2596651017665863,
0.0816885456442833,
-0.01623627357184887,
-0.09960969537496567,
0.014988959766924381,
0.02061903104186058,
-0.2089255303144455,
0.011128270998597145,
-0.019883770495653152,
-0.03150356933474541,
-0.06483490765094757,
-0.10664787143468857,
-0.056551624089479446,
0.04928823933005333,
0.10853826254606247,
0.011660109274089336,
0.05354316532611847,
-0.0404130220413208,
0.07917837053537369,
0.0826287642121315,
0.15132710337638855,
0.06795957684516907,
-0.190711110830307,
-0.10953907668590546,
-0.0414445661008358,
0.12121522426605225,
-0.12505418062210083,
0.036917757242918015,
0.053161121904850006,
-0.016534561291337013,
0.14621229469776154,
0.1070784479379654,
-0.07452095299959183,
0.11915595084428787,
0.08904775977134705,
-0.04094788804650307,
-0.23367151618003845,
-0.07120766490697861,
0.11133213341236115,
0.07195597887039185,
-0.03961895406246185,
0.018120890483260155,
-0.04960581287741661,
-0.013980977237224579,
0.048759616911411285,
-0.0538676381111145,
-0.07230538129806519,
0.004421027842909098,
0.1247575581073761,
0.1029362753033638,
-0.04655474051833153,
0.01296416949480772,
0.037371400743722916,
0.003788623260334134,
0.04730486497282982,
0.0407949760556221,
-0.08269952982664108,
-0.04124005511403084,
0.02782733179628849,
0.37552911043167114,
-0.010165480896830559,
-0.020456433296203613,
0.018555615097284317,
-0.19949445128440857,
0.09135842323303223,
0.13205479085445404,
0.04697350412607193,
0.004247748292982578,
-0.08139242231845856,
0.026877427473664284,
-0.010625290684401989,
0.09936143457889557,
-0.07806670665740967,
-0.05493134260177612,
-0.21631066501140594,
-0.025010565295815468,
0.017490221187472343,
0.24077683687210083,
-0.08458559215068817,
-0.12801732122898102,
-0.20628872513771057,
0.13128381967544556,
-0.11333390325307846,
-0.03695881739258766,
-0.024473199620842934,
0.03926658630371094,
-0.01989821158349514,
0.06291737407445908,
-0.0710630789399147,
0.006373001262545586,
-0.11024709790945053,
0.055267609655857086,
0.04204455390572548,
0.1229788213968277,
0.014207782223820686,
0.02016810141503811,
0.05822525918483734,
-0.01837925612926483,
0.07173580676317215,
-0.06203491613268852,
-0.04550490900874138,
0.14224006235599518,
-0.020255116745829582,
-0.04152837023139,
-0.0483345128595829,
-0.036874305456876755,
0.11981741338968277,
-0.05059147998690605,
-0.007141099311411381,
-0.054929375648498535,
-0.06906463205814362,
0.03462086617946625,
-0.009175732731819153,
-0.008798843249678612,
0.06801853328943253,
0.04024988040328026,
-0.026994358748197556,
0.005263668950647116,
0.03447828069329262,
-0.10330043733119965,
-0.04955084249377251,
0.16955432295799255,
-0.0749620869755745,
0.10274054110050201,
-0.031069839373230934,
0.018015999346971512,
0.005847334861755371,
-0.022399673238396645,
-0.015360680408775806,
-0.1457086056470871,
-0.06137600541114807,
-0.09489979594945908,
0.11565322428941727,
0.08146517723798752,
0.03358805552124977,
0.04274565726518631,
0.019532648846507072,
-0.04414922371506691,
-0.038583990186452866,
0.12961317598819733,
0.08133101463317871,
0.012996876612305641,
0.01137041300535202,
0.01941833831369877,
-0.020302120596170425,
0.0028480992186814547,
-0.01250747125595808,
-0.07239153981208801,
-0.05874783173203468,
0.09400010108947754,
0.1600283533334732,
-0.06127211079001427,
-0.13325586915016174,
-0.020593497902154922,
0.04988488554954529,
0.0014717020094394684,
-0.08777432143688202,
0.04833676666021347,
0.15805292129516602,
-0.05623878911137581,
0.03216489031910896,
-0.09984751045703888,
-0.07263360917568207,
-0.16060975193977356,
-0.10029061883687973,
-0.06092562898993492,
-0.28350353240966797,
0.09752398729324341,
0.006392303854227066,
-0.014731393195688725,
0.059529416263103485,
0.051305368542671204,
-0.052508849650621414,
0.07068239152431488,
-0.18146829307079315,
-0.007054794579744339,
0.03497592359781265,
-0.13212306797504425,
0.02475893869996071,
-0.2378365397453308,
0.10198072344064713,
-0.04623803123831749,
-0.1519704908132553,
-0.04004510119557381,
0.0641569048166275,
-0.09540136158466339,
-0.01822364516556263,
-0.0475153923034668,
-0.01922670193016529,
0.01624443754553795,
-0.009348669089376926,
-0.031147832050919533,
0.13716529309749603,
0.02827494591474533,
-0.03268734738230705,
0.005254602525383234,
0.0223685409873724,
0.03955082967877388,
-0.0969657450914383,
-0.05986930429935455,
0.08311155438423157,
-0.031056145206093788,
0.14728976786136627,
0.000341245875461027,
0.04181376099586487,
-0.06758682429790497,
0.2593761384487152,
0.2023983597755432,
-0.12479214370250702,
0.008118697442114353,
-0.021801479160785675,
0.012670028023421764,
-0.041751839220523834,
0.13110700249671936,
0.013386172242462635,
0.12186761200428009,
-0.17513342201709747,
-0.01036517322063446,
-0.0818324014544487,
-0.04501292482018471,
0.06702108681201935,
0.14714950323104858,
0.15742522478103638,
0.03436789661645889,
-0.07328428328037262,
0.06722653657197952,
-0.30119743943214417,
0.20540550351142883,
-0.1346001923084259,
-0.01498429011553526,
-0.040251150727272034,
-0.058389630168676376,
0.061147745698690414,
0.11309876292943954,
0.10832664370536804,
-0.021150551736354828,
-0.0905047357082367,
-0.04486766457557678,
-0.039378076791763306,
-0.13019338250160217,
-0.02718670479953289,
0.1654091775417328,
0.06799814850091934,
0.31520840525627136,
-0.017577875405550003,
0.07702425122261047,
0.034410297870635986,
0.06451138854026794,
0.004519328009337187,
0.09537279605865479,
0.07960964739322662,
-0.06345855444669724,
-0.07373003661632538,
-0.001637450186535716,
0.05033271387219429,
0.14567798376083374,
-0.03826142102479935,
-0.18691548705101013,
0.15858715772628784,
0.07192251086235046,
-0.13762691617012024,
-0.05777517706155777,
0.08409425616264343,
-0.0739973932504654,
0.0550808347761631,
0.08115427941083908,
0.015876613557338715,
-0.017793258652091026,
-0.004664506763219833,
0.06074233725667,
0.024694660678505898,
-0.02343848906457424,
0.003570882137864828,
-0.08337053656578064,
-0.04151543974876404,
0.07267895340919495,
-0.0844460055232048,
-0.20546193420886993,
-0.0957019031047821,
-0.07551700621843338,
0.030557552352547646,
-0.0649830624461174,
0.12575586140155792,
0.1717868149280548,
0.0593598335981369,
-0.03307248651981354,
-0.10721943527460098,
-0.035562749952077866,
0.07602505385875702,
-0.044773899018764496,
-0.09409699589014053
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
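Until the authors provide their own snippet, the following is a minimal sketch of one way to try the checkpoint. It assumes the repository `SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-test4-with-checkpoints` (the id from this card's metadata) is a standard Wav2Vec2 CTC fine-tune that works with the `transformers` ASR pipeline; the audio filename is a placeholder.

```python
from transformers import pipeline

# Assumes a standard CTC checkpoint; replace the path with a real (ideally 16 kHz) clip.
asr = pipeline(
    "automatic-speech-recognition",
    model="SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-test4-with-checkpoints",
)
print(asr("sample_clip.wav")["text"])
```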
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | automatic-speech-recognition | SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-test4-with-checkpoints | [
"transformers",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:24:52+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
51,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06918960809707642,
0.13210147619247437,
-0.0040207370184361935,
0.023134203627705574,
0.11738458275794983,
0.003100133500993252,
0.06489233672618866,
0.1062328964471817,
-0.018454808741807938,
0.11934409290552139,
0.02399194799363613,
0.10645237565040588,
0.10633884370326996,
0.1783033311367035,
-0.006676932331174612,
-0.20753470063209534,
0.05159076303243637,
-0.1328369528055191,
-0.006210802122950554,
0.12206359207630157,
0.12859149277210236,
-0.12210913747549057,
0.0661126896739006,
-0.03582390025258064,
-0.006673390511423349,
-0.036393795162439346,
-0.05692959576845169,
-0.05386972799897194,
0.06668701767921448,
0.062350641936063766,
0.060644831508398056,
0.018570519983768463,
0.09333337843418121,
-0.2811316251754761,
0.022820472717285156,
0.08144836127758026,
0.006955916993319988,
0.06573166698217392,
0.07077054679393768,
-0.07532189786434174,
0.07820203900337219,
-0.06897217780351639,
0.14961306750774384,
0.07984381169080734,
-0.09012829512357712,
-0.1924186497926712,
-0.08871616423130035,
0.0939040556550026,
0.18705229461193085,
0.05621805414557457,
-0.031164970248937607,
0.13427454233169556,
-0.06805921345949173,
0.01878679171204567,
0.0681581050157547,
-0.07620836049318314,
-0.052718792110681534,
0.06274411827325821,
0.07032535970211029,
0.09331566095352173,
-0.13174302875995636,
-0.00581010989844799,
0.02852778322994709,
0.010611804202198982,
0.10465876758098602,
0.019570648670196533,
0.12078016996383667,
0.03880659490823746,
-0.14213255047798157,
-0.04347489774227142,
0.08553854376077652,
0.040430404245853424,
-0.053023193031549454,
-0.25508078932762146,
-0.01728249341249466,
-0.03535711020231247,
-0.03508080542087555,
-0.050225600600242615,
0.04455358535051346,
-0.02446228265762329,
0.07527779787778854,
-0.005772874690592289,
-0.07288429886102676,
-0.049391333013772964,
0.07827333360910416,
0.07604440301656723,
0.027127955108880997,
-0.02607012540102005,
0.012724175117909908,
0.11518751829862595,
0.11186537146568298,
-0.11170512437820435,
-0.052296292036771774,
-0.06167195737361908,
-0.09369548410177231,
-0.047245271503925323,
0.03096519224345684,
0.04075217619538307,
0.05507682263851166,
0.21005180478096008,
0.004100160673260689,
0.05025365948677063,
0.030208947136998177,
0.013425402343273163,
0.06431768089532852,
0.09155748784542084,
-0.0652301162481308,
-0.12225554138422012,
-0.02715214155614376,
0.10966562479734421,
0.009606653824448586,
-0.03282571956515312,
-0.04075070470571518,
0.0665077269077301,
0.030208082869648933,
0.12366250902414322,
0.0723525807261467,
0.018685176968574524,
-0.07855737954378128,
-0.06267400830984116,
0.1677972972393036,
-0.1649521440267563,
0.03285328298807144,
0.02912791818380356,
-0.050073519349098206,
-0.008440917357802391,
0.01682254858314991,
0.021022414788603783,
-0.018704243004322052,
0.08882031589746475,
-0.054653100669384,
-0.03264474496245384,
-0.11321555823087692,
-0.05006399378180504,
0.028676055371761322,
0.006981914862990379,
-0.03174450621008873,
-0.04053306579589844,
-0.10819326341152191,
-0.07601769268512726,
0.07845603674650192,
-0.06794282793998718,
-0.04567456990480423,
-0.03693155571818352,
-0.077850341796875,
0.013987138867378235,
-0.001372430007904768,
0.11866221576929092,
-0.028359893709421158,
0.049781348556280136,
-0.06040623039007187,
0.07331450283527374,
0.1427365392446518,
0.027582714334130287,
-0.05536656826734543,
0.05209227278828621,
-0.22961750626564026,
0.10650996118783951,
-0.0820845440030098,
0.039568543434143066,
-0.16523221135139465,
-0.01437871903181076,
0.04151884838938713,
0.02703598327934742,
-0.011580551974475384,
0.13367699086666107,
-0.20120634138584137,
-0.03629620373249054,
0.17902998626232147,
-0.11463885754346848,
-0.08275967836380005,
0.05660289525985718,
-0.05534304678440094,
0.12154120951890945,
0.04968025162816048,
-0.015457268804311752,
0.02872299961745739,
-0.14586561918258667,
-0.015341621823608875,
-0.06385710090398788,
-0.031775522977113724,
0.15648432075977325,
0.058627333492040634,
-0.05283202603459358,
0.06168147549033165,
0.01965263858437538,
-0.018219612538814545,
-0.04959159716963768,
-0.03271770104765892,
-0.09723224490880966,
0.011255990713834763,
-0.0728980302810669,
0.023943135514855385,
-0.031872402876615524,
-0.09092787653207779,
-0.03651702031493187,
-0.15960368514060974,
0.006672970950603485,
0.09574975073337555,
-0.005800875835120678,
-0.02275932766497135,
-0.11338774859905243,
-0.010310402140021324,
0.020829740911722183,
-0.0006964936037547886,
-0.14685183763504028,
-0.05314113572239876,
0.017828308045864105,
-0.16250769793987274,
0.031012238934636116,
-0.03655901551246643,
0.04738416150212288,
0.03556562215089798,
-0.03982981666922569,
-0.03375418856739998,
0.019630931317806244,
0.022369354963302612,
-0.010214408859610558,
-0.2756194770336151,
-0.015468244440853596,
-0.043052829802036285,
0.16435527801513672,
-0.2469322234392166,
0.04182727262377739,
0.07295827567577362,
0.1338571161031723,
0.015705497935414314,
-0.03647774085402489,
0.028713135048747063,
-0.06289805471897125,
-0.030222538858652115,
-0.06501726806163788,
-0.007188703399151564,
-0.039097823202610016,
-0.04806915298104286,
0.04462466016411781,
-0.16899824142456055,
-0.033922191709280014,
0.1186266764998436,
0.04557104408740997,
-0.15134701132774353,
-0.04948775842785835,
-0.04092395305633545,
-0.056753676384687424,
-0.06932670623064041,
-0.0517798475921154,
0.10663432627916336,
0.05747092142701149,
0.05196038633584976,
-0.05911761149764061,
-0.06484735757112503,
0.00799498613923788,
-0.01853559911251068,
-0.023748042061924934,
0.07913291454315186,
0.06702018529176712,
-0.11829525977373123,
0.09312599897384644,
0.08573136478662491,
0.07933273166418076,
0.10508506000041962,
-0.0014733473071828485,
-0.09117123484611511,
-0.025300826877355576,
0.029316658154129982,
0.016105778515338898,
0.14908336102962494,
-0.04350128397345543,
0.04314031824469566,
0.040114615112543106,
-0.01687462255358696,
0.008028145879507065,
-0.09918303042650223,
0.030367493629455566,
0.026081476360559464,
-0.012195796705782413,
0.041467417031526566,
-0.05302301421761513,
0.021834537386894226,
0.10195169597864151,
0.03181454911828041,
0.04113520681858063,
0.011278065852820873,
-0.050533477216959,
-0.11812540888786316,
0.17222443222999573,
-0.10861039906740189,
-0.2369978129863739,
-0.12320686131715775,
-0.01618431694805622,
0.02991701476275921,
-0.015134924091398716,
0.01900940015912056,
-0.06770696491003036,
-0.11834623664617538,
-0.09672471135854721,
0.04564153030514717,
0.06599046289920807,
-0.08051323890686035,
-0.055777665227651596,
0.06501153111457825,
0.048011794686317444,
-0.13664643466472626,
0.02571168728172779,
0.03327706828713417,
-0.08857693523168564,
0.00793769583106041,
0.08559047430753708,
0.06839455664157867,
0.18071474134922028,
0.01134483702480793,
-0.023087946698069572,
0.017521869391202927,
0.19720622897148132,
-0.14027054607868195,
0.10202740132808685,
0.13801661133766174,
-0.07145930081605911,
0.07873693108558655,
0.2032429575920105,
0.039016321301460266,
-0.10376140475273132,
0.039679598063230515,
0.036421533674001694,
-0.025852223858237267,
-0.24745285511016846,
-0.08099643886089325,
0.00836301501840353,
-0.0664474293589592,
0.0802333801984787,
0.08307429403066635,
0.09203000366687775,
0.023238254711031914,
-0.1043974831700325,
-0.07363210618495941,
0.05418974906206131,
0.11036353558301926,
-0.004034504294395447,
-0.011317858472466469,
0.09753942489624023,
-0.020273780450224876,
0.02676866576075554,
0.08875394612550735,
0.012205728329718113,
0.18836407363414764,
0.050518929958343506,
0.14771167933940887,
0.09208200126886368,
0.053752463310956955,
0.016467519104480743,
0.010000402107834816,
0.017887894064188004,
0.02435637265443802,
-0.014350295066833496,
-0.08589190989732742,
-0.006933859083801508,
0.1298609972000122,
0.027646880596876144,
0.04127250239253044,
0.013248836621642113,
-0.04125351831316948,
0.08765199780464172,
0.17516882717609406,
0.013442369177937508,
-0.20506484806537628,
-0.06488820165395737,
0.0686659887433052,
-0.08813467621803284,
-0.10374542325735092,
-0.021716099232435226,
0.04023343697190285,
-0.1762947142124176,
0.02770446240901947,
-0.025082001462578773,
0.0983029454946518,
-0.12493812292814255,
-0.01920684240758419,
0.0476171039044857,
0.06939635425806046,
-0.018209589645266533,
0.0625329241156578,
-0.17832936346530914,
0.13725855946540833,
0.012600419111549854,
0.07603015750646591,
-0.0920197069644928,
0.0829358845949173,
0.010243658907711506,
-0.008985995315015316,
0.14880549907684326,
-0.002428766805678606,
-0.056611087173223495,
-0.10275979340076447,
-0.09291432052850723,
-0.01180565357208252,
0.11795864999294281,
-0.11873860657215118,
0.09995509684085846,
-0.017298342660069466,
-0.043639615178108215,
0.0016699014231562614,
-0.12897762656211853,
-0.1380222588777542,
-0.17400150001049042,
0.041601065546274185,
-0.12252611666917801,
0.04249255359172821,
-0.10634490847587585,
-0.05313412845134735,
-0.058118730783462524,
0.19448153674602509,
-0.2263878583908081,
-0.07106572389602661,
-0.1503530591726303,
-0.06515897810459137,
0.11819497495889664,
-0.042735762894153595,
0.08508200198411942,
0.017862383276224136,
0.19214710593223572,
0.010283242911100388,
-0.013114631175994873,
0.10883224755525589,
-0.10211063176393509,
-0.21299202740192413,
-0.10015871375799179,
0.13945214450359344,
0.13517092168331146,
0.038856618106365204,
0.002108179498463869,
0.030881604179739952,
-0.006152692716568708,
-0.11462404578924179,
0.028862472623586655,
0.18585458397865295,
0.10306477546691895,
0.03526908904314041,
-0.03260820358991623,
-0.14471980929374695,
-0.08779244124889374,
-0.045098960399627686,
0.017435450106859207,
0.19264571368694305,
-0.07120641320943832,
0.17354503273963928,
0.15474873781204224,
-0.053835928440093994,
-0.20943360030651093,
0.03015606477856636,
0.036211419850587845,
0.0007652041967958212,
0.05587008595466614,
-0.19489167630672455,
0.0909743532538414,
0.0033501458819955587,
-0.057322751730680466,
0.12121490389108658,
-0.17501963675022125,
-0.15013514459133148,
0.07031099498271942,
0.07301220297813416,
-0.17921873927116394,
-0.12142012268304825,
-0.09439031779766083,
-0.04026462882757187,
-0.11460573226213455,
0.07970702648162842,
-0.016233494505286217,
0.010252374224364758,
0.032961323857307434,
0.018216567113995552,
0.010428756475448608,
-0.04740371182560921,
0.1864585429430008,
-0.003947122488170862,
0.04788469523191452,
-0.07597782462835312,
-0.06253167986869812,
0.045070283114910126,
-0.06455249339342117,
0.0716865211725235,
-0.00903246272355318,
0.006079745013266802,
-0.1052967831492424,
-0.06088602915406227,
-0.03328738734126091,
0.02272024378180504,
-0.07930614799261093,
-0.09432698786258698,
-0.03726235777139664,
0.10006307810544968,
0.09058371931314468,
-0.03892482817173004,
-0.06462740153074265,
-0.08978539705276489,
0.028800709173083305,
0.21877005696296692,
0.177296444773674,
0.05685123801231384,
-0.066028892993927,
-0.00540707865729928,
-0.01588953658938408,
0.053271859884262085,
-0.2026120126247406,
0.0566285103559494,
0.035300228744745255,
0.033545590937137604,
0.11711569130420685,
-0.026464059948921204,
-0.16407892107963562,
-0.048686347901821136,
0.05304291099309921,
-0.07358507066965103,
-0.17289869487285614,
0.014132710173726082,
0.07088939845561981,
-0.1477956771850586,
-0.023786291480064392,
0.04775075986981392,
-0.017420068383216858,
-0.03159533068537712,
0.006238185800611973,
0.08124099671840668,
0.01671770215034485,
0.09224288910627365,
0.053469255566596985,
0.09704500436782837,
-0.10683690756559372,
0.06699982285499573,
0.07745448499917984,
-0.10474617779254913,
0.03967198729515076,
0.0603945255279541,
-0.06895622611045837,
-0.03619396686553955,
0.033563096076250076,
0.08692663908004761,
0.04178347438573837,
-0.060071151703596115,
0.0073408023454248905,
-0.10486608743667603,
0.06092875450849533,
0.1210157498717308,
0.04285310208797455,
0.0076990588568151,
0.036018576472997665,
0.04045969620347023,
-0.09288305044174194,
0.12451037764549255,
0.04114879295229912,
0.028287222608923912,
-0.05418051406741142,
-0.028997255489230156,
0.03649618849158287,
-0.03188192844390869,
-0.01566455140709877,
-0.04152749106287956,
-0.06663620471954346,
-0.010323094204068184,
-0.16889281570911407,
0.006573607679456472,
-0.05270812287926674,
0.008401375263929367,
0.021295055747032166,
-0.03304858133196831,
0.005127503536641598,
0.019244063645601273,
-0.07131489366292953,
-0.052214257419109344,
-0.006754601374268532,
0.10161449760198593,
-0.17169132828712463,
0.014349433593451977,
0.0744767114520073,
-0.12469461560249329,
0.08815638720989227,
0.018520260229706764,
0.0005999338463880122,
0.03465453162789345,
-0.13307695090770721,
0.043367430567741394,
-0.006723123602569103,
0.011691853404045105,
0.048354603350162506,
-0.21661832928657532,
-0.0025545719545334578,
-0.04856108874082565,
-0.055710889399051666,
-0.006375120021402836,
-0.02562650851905346,
-0.11432337760925293,
0.10399775207042694,
0.010540200397372246,
-0.0755159854888916,
-0.02542583830654621,
0.037674929946660995,
0.0969945415854454,
-0.03298725560307503,
0.16065140068531036,
-0.01863807439804077,
0.06254526972770691,
-0.1797095239162445,
-0.018202031031250954,
-0.01975269988179207,
0.023043567314743996,
-0.03248249739408493,
-0.008440588600933552,
0.05180126056075096,
-0.023841936141252518,
0.20870842039585114,
-0.022057142108678818,
0.033427316695451736,
0.06674833595752716,
-0.021141132339835167,
-0.02877473458647728,
0.1086326614022255,
0.054397158324718475,
0.012029323726892471,
0.03175004944205284,
0.006914193741977215,
-0.04090225324034691,
-0.004564614500850439,
-0.1556052416563034,
0.07673801481723785,
0.17203287780284882,
0.0805397778749466,
-0.00828546192497015,
0.06094660609960556,
-0.11003988236188889,
-0.11399497091770172,
0.10722645372152328,
-0.05822233483195305,
-0.014757114462554455,
-0.05772337689995766,
0.14011409878730774,
0.15646083652973175,
-0.19130073487758636,
0.06022409349679947,
-0.06736859679222107,
-0.04819837212562561,
-0.10633485019207001,
-0.17335662245750427,
-0.061282314360141754,
-0.0583864226937294,
-0.01613355241715908,
-0.05076048895716667,
0.06713438034057617,
0.08348768949508667,
0.02054755762219429,
0.016258614137768745,
0.0817527249455452,
-0.02199946530163288,
0.007656866684556007,
0.034995537251234055,
0.06331320106983185,
0.0073803807608783245,
-0.04667557775974274,
0.009565448388457298,
0.0006085589993745089,
0.035281602293252945,
0.04957476258277893,
0.037472013384103775,
-0.026353945955634117,
0.007689491845667362,
-0.02916470356285572,
-0.11019428819417953,
0.04115133360028267,
-0.026625385507941246,
-0.06341774761676788,
0.1439228504896164,
0.031860120594501495,
-0.008713874034583569,
-0.025656426325440407,
0.25211021304130554,
-0.07529866695404053,
-0.08892348408699036,
-0.1387489140033722,
0.13557645678520203,
-0.031552400439977646,
0.06481313705444336,
0.037692490965127945,
-0.11259825527667999,
0.03179538995027542,
0.1362704634666443,
0.1458069533109665,
-0.049145035445690155,
0.019655266776680946,
0.013711978681385517,
0.0032459446229040623,
-0.04005579650402069,
0.04973040521144867,
0.06590425968170166,
0.12457112967967987,
-0.05082963407039642,
0.08012272417545319,
-0.0028764382004737854,
-0.10040896385908127,
-0.02852385863661766,
0.12230420112609863,
-0.003029873361811042,
0.019506774842739105,
-0.0761401429772377,
0.12728425860404968,
-0.043905097991228104,
-0.2665610611438751,
0.06613168120384216,
-0.0650629922747612,
-0.14912083745002747,
-0.022557994350790977,
0.05126400291919708,
-0.008650023490190506,
0.026705266907811165,
0.06785756349563599,
-0.0670214518904686,
0.18420551717281342,
0.03873218223452568,
-0.05507900193333626,
-0.058854296803474426,
0.07306438684463501,
-0.09833692759275436,
0.2929907441139221,
0.00751500902697444,
0.05993965268135071,
0.09920700639486313,
-0.029096059501171112,
-0.13847678899765015,
0.031734831631183624,
0.08172675222158432,
-0.07410130649805069,
0.055994872003793716,
0.21827135980129242,
-0.008840959519147873,
0.11804516613483429,
0.07454971224069595,
-0.09561564773321152,
0.05016838759183884,
-0.10613930225372314,
-0.09673135727643967,
-0.08329153805971146,
0.09532807767391205,
-0.05763502046465874,
0.14755868911743164,
0.1186022087931633,
-0.04606860503554344,
0.02281493879854679,
-0.018614748492836952,
0.048749152570962906,
0.0023650694638490677,
0.12439922988414764,
0.020209291949868202,
-0.19710010290145874,
0.026845410466194153,
-0.008902255445718765,
0.10291280597448349,
-0.2202581763267517,
-0.09718955308198929,
0.04764820635318756,
0.0019112902227789164,
-0.05895697697997093,
0.12370198965072632,
0.055919989943504333,
0.04170476272702217,
-0.04714735969901085,
-0.028212912380695343,
-0.002841046778485179,
0.16146929562091827,
-0.11127673834562302,
0.0008471902110613883
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
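As a stopgap, the sketch below shows one way the checkpoint could be loaded, under the assumption that `SudiptoPramanik/LLama_FineTunedModel` holds full causal-language-model weights rather than a PEFT adapter; if only adapter weights are stored, load the base model first and attach the adapter instead. The prompt is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "SudiptoPramanik/LLama_FineTunedModel"  # assumed to contain full model weights
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Briefly explain reinforcement learning from human feedback.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```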
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | SudiptoPramanik/LLama_FineTunedModel | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:31:29+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
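The card does not yet provide an official snippet. As an illustration only, the tags (Mistral architecture, text generation, 4-bit) suggest the standard 🤗 Transformers text-generation API would apply; the repository id below is taken from this card's metadata, while the prompt and generation settings are assumptions.

```python
# Minimal sketch, assuming the standard transformers text-generation API.
# The repository id comes from this card's metadata; prompt and settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hingeankit/model-quant"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The checkpoint is tagged as 4-bit, so it should load with its stored quantization config.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a 4-bit quantized language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)  # generation length is an arbitrary choice
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```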
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | hingeankit/model-quant | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | 2024-02-12T11:32:20+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
63,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04107276722788811,
0.18429043889045715,
-0.005466660484671593,
0.01707964763045311,
0.09889248013496399,
0.006426886655390263,
0.0552387572824955,
0.11628306657075882,
-0.04961670935153961,
0.12391628324985504,
0.04131103307008743,
0.10889437794685364,
0.11733400076627731,
0.14055128395557404,
-0.0011570020578801632,
-0.21881097555160522,
0.04984959214925766,
-0.10812798142433167,
-0.010629421100020409,
0.1229625791311264,
0.14664164185523987,
-0.10059566050767899,
0.0685141384601593,
-0.03650353476405144,
-0.018727488815784454,
-0.037872105836868286,
-0.06369106471538544,
-0.04232647642493248,
0.040759239345788956,
0.05543952435255051,
0.06582973897457123,
0.0008197632851079106,
0.08302918821573257,
-0.27891653776168823,
0.019616063684225082,
0.06732217967510223,
-0.001859458046965301,
0.06605353951454163,
0.0656529888510704,
-0.06341347843408585,
0.10816936939954758,
-0.05340541526675224,
0.1388159543275833,
0.0840948224067688,
-0.09252381324768066,
-0.1801718920469284,
-0.09092263877391815,
0.10972056537866592,
0.17807576060295105,
0.04927121847867966,
-0.027590133249759674,
0.10392458736896515,
-0.08213759958744049,
0.018306581303477287,
0.05128932744264603,
-0.08673480898141861,
-0.05636805668473244,
0.06892803311347961,
0.0909840315580368,
0.05535217374563217,
-0.127517431974411,
-0.033952608704566956,
0.005268120672553778,
0.017083169892430305,
0.07191447168588638,
0.02225526049733162,
0.15091390907764435,
0.03642427921295166,
-0.13347287476062775,
-0.052214302122592926,
0.10417953133583069,
0.039806388318538666,
-0.04001800715923309,
-0.2435668408870697,
-0.026749979704618454,
-0.023580020293593407,
-0.03506208956241608,
-0.04501988738775253,
0.040063899010419846,
-0.0024812635965645313,
0.0845893919467926,
-0.007499109022319317,
-0.0730346143245697,
-0.032884787768125534,
0.06573588401079178,
0.063177689909935,
0.029224421828985214,
-0.020591720938682556,
0.022940656170248985,
0.10680833458900452,
0.09384170919656754,
-0.11641748249530792,
-0.05923080071806908,
-0.06575440615415573,
-0.07281184196472168,
-0.043026648461818695,
0.03432915359735489,
0.016957378014922142,
0.0712345615029335,
0.2568737864494324,
0.015325438231229782,
0.05812793970108032,
0.030348559841513634,
0.007086529396474361,
0.051940593868494034,
0.10396376997232437,
-0.06229512766003609,
-0.11413563787937164,
-0.021461904048919678,
0.08955145627260208,
0.018870148807764053,
-0.03690025582909584,
-0.0473799966275692,
0.06586680561304092,
0.04421477019786835,
0.11019476503133774,
0.09768681228160858,
0.019824521616101265,
-0.0734618529677391,
-0.06183047220110893,
0.2012585699558258,
-0.1553279310464859,
0.0378098301589489,
0.04337082803249359,
-0.03577252849936485,
-0.02073810063302517,
0.008140976540744305,
0.026272648945450783,
-0.03220336139202118,
0.0870632603764534,
-0.05507489666342735,
-0.04573002830147743,
-0.1106005385518074,
-0.031349990516901016,
0.04460166394710541,
0.01050558965653181,
-0.034751567989587784,
-0.03554822877049446,
-0.07533694058656693,
-0.08437393605709076,
0.08658894151449203,
-0.06975018978118896,
-0.057597868144512177,
-0.027681710198521614,
-0.08278068900108337,
0.023360921069979668,
0.020822593942284584,
0.0743173137307167,
-0.025394825264811516,
0.0548558309674263,
-0.053450483828783035,
0.05534251779317856,
0.10136574506759644,
0.03449949249625206,
-0.05991470813751221,
0.05828521400690079,
-0.22980304062366486,
0.08651042729616165,
-0.0686769187450409,
0.0635417103767395,
-0.15884330868721008,
-0.025311613455414772,
0.035479720681905746,
0.0039023179560899734,
-0.005805973429232836,
0.13394178450107574,
-0.20890115201473236,
-0.024045290425419807,
0.1660108119249344,
-0.09623157232999802,
-0.07049477100372314,
0.052300989627838135,
-0.04749181121587753,
0.1015239804983139,
0.03422097489237785,
0.0028637154027819633,
0.06358838081359863,
-0.10867808759212494,
-0.009417007677257061,
-0.05510617420077324,
-0.024897700175642967,
0.14054812490940094,
0.07615227997303009,
-0.07858359813690186,
0.06282083690166473,
0.022458016872406006,
-0.023672401905059814,
-0.06433894485235214,
-0.01893056184053421,
-0.10131831467151642,
0.01600930653512478,
-0.06640354543924332,
0.014057212509214878,
-0.01763610914349556,
-0.09435714781284332,
-0.02923819050192833,
-0.16873657703399658,
-0.02738712728023529,
0.08095064759254456,
-0.0036453690845519304,
-0.014766527339816093,
-0.11192154884338379,
0.02360657975077629,
0.03575372323393822,
0.004308606963604689,
-0.13218367099761963,
-0.038643669337034225,
0.03360716253519058,
-0.15031422674655914,
0.03721858188509941,
-0.07071434706449509,
0.05296730995178223,
0.015814051032066345,
-0.028946246951818466,
-0.026068145409226418,
0.02240581624209881,
0.009327845647931099,
-0.01549700926989317,
-0.23842118680477142,
-0.024945400655269623,
-0.030812138691544533,
0.16146178543567657,
-0.2060156911611557,
0.0347215011715889,
0.08237865567207336,
0.15714368224143982,
0.004293185658752918,
-0.05037878081202507,
0.018603991717100143,
-0.06939104199409485,
-0.023690514266490936,
-0.05550501495599747,
0.00425359234213829,
-0.019322706386446953,
-0.047018662095069885,
0.02767501398921013,
-0.17775531113147736,
-0.048257146030664444,
0.09694717079401016,
0.04846562072634697,
-0.12656573951244354,
-0.027750778943300247,
-0.03659443557262421,
-0.05207927152514458,
-0.04021235555410385,
-0.062280114740133286,
0.1025606319308281,
0.062151361256837845,
0.03721151500940323,
-0.060664571821689606,
-0.07998596131801605,
-0.005254825577139854,
-0.015869658440351486,
-0.025883622467517853,
0.09547285735607147,
0.07314612716436386,
-0.13318641483783722,
0.09358029067516327,
0.08555903285741806,
0.07665112614631653,
0.09008699655532837,
-0.021513260900974274,
-0.0734030082821846,
-0.03775930777192116,
0.03842119127511978,
0.01799904927611351,
0.12194488197565079,
-0.04294387623667717,
0.042145516723394394,
0.0407242551445961,
-0.026636878028512,
0.01757904887199402,
-0.07903152704238892,
0.03391219675540924,
0.021363548934459686,
-0.015404801815748215,
0.05207730829715729,
-0.03648284450173378,
0.019491707906126976,
0.08773515373468399,
0.05895758792757988,
0.04076753929257393,
0.015477783046662807,
-0.05245675891637802,
-0.11156144738197327,
0.1588587909936905,
-0.12423332780599594,
-0.21536298096179962,
-0.1318657398223877,
0.011994147673249245,
0.027378013357520103,
-0.015752362087368965,
0.00472033629193902,
-0.06156763806939125,
-0.10836432874202728,
-0.09137389808893204,
0.006965515669435263,
0.05607304349541664,
-0.08397942781448364,
-0.057693157345056534,
0.045506685972213745,
0.03968849778175354,
-0.14271289110183716,
0.020852934569120407,
0.0429975800216198,
-0.09161601960659027,
-0.010225369594991207,
0.07823923975229263,
0.07498764246702194,
0.18603990972042084,
0.021536244079470634,
-0.019713889807462692,
0.029369689524173737,
0.22151710093021393,
-0.13627390563488007,
0.11189884692430496,
0.1317344307899475,
-0.08768384158611298,
0.0811038687825203,
0.21081633865833282,
0.04229486733675003,
-0.09574094414710999,
0.030903132632374763,
0.029754992574453354,
-0.023434791713953018,
-0.23411248624324799,
-0.07023821771144867,
-0.0005590264336206019,
-0.06695892661809921,
0.07743334025144577,
0.095066137611866,
0.07732266187667847,
0.016324089840054512,
-0.09633003175258636,
-0.0940609723329544,
0.06234998628497124,
0.10914576053619385,
0.013988446444272995,
-0.0083477683365345,
0.08871278911828995,
-0.03466309234499931,
0.014458059333264828,
0.08648307621479034,
0.004773330874741077,
0.16410057246685028,
0.04966660216450691,
0.1783807873725891,
0.08461165428161621,
0.0707983523607254,
0.0036198562011122704,
0.006051016505807638,
0.01179120410233736,
0.04098048433661461,
-0.006200188305228949,
-0.08307646214962006,
-0.024332163855433464,
0.10978074371814728,
0.06830116361379623,
0.018119553104043007,
0.01537923514842987,
-0.0483124777674675,
0.08804096281528473,
0.17867277562618256,
0.00015456941036973149,
-0.18041066825389862,
-0.05957181379199028,
0.0758126825094223,
-0.09812118113040924,
-0.10091771185398102,
-0.009177285246551037,
0.016242317855358124,
-0.16525320708751678,
0.037138622254133224,
-0.020507901906967163,
0.10839620232582092,
-0.1315004527568817,
-0.01726866327226162,
0.07468607276678085,
0.07066181302070618,
-0.0023129789624363184,
0.05676833167672157,
-0.1800561100244522,
0.10016647726297379,
0.011189276352524757,
0.0706777423620224,
-0.09628693759441376,
0.09071888029575348,
-0.008443507365882397,
-0.02712937630712986,
0.14120987057685852,
-0.003397545777261257,
-0.07548630982637405,
-0.06316021829843521,
-0.09301228821277618,
-0.011178667657077312,
0.12596110999584198,
-0.12770214676856995,
0.09133239090442657,
-0.033733878284692764,
-0.03574102371931076,
-0.010073747485876083,
-0.08690870553255081,
-0.1102650910615921,
-0.17856551706790924,
0.05966923385858536,
-0.1299193650484085,
0.03618106618523598,
-0.1058541014790535,
-0.026455020532011986,
-0.025722770020365715,
0.1780775636434555,
-0.24061058461666107,
-0.07271577417850494,
-0.14271517097949982,
-0.09457890689373016,
0.13225026428699493,
-0.04737140238285065,
0.08879992365837097,
-0.01565241813659668,
0.1602710634469986,
0.019751977175474167,
-0.019318873062729836,
0.08594216406345367,
-0.0841052308678627,
-0.19547666609287262,
-0.06833874434232712,
0.16493964195251465,
0.12200415879487991,
0.034798283129930496,
0.00001234960109286476,
0.03777633607387543,
-0.021238334476947784,
-0.11896125227212906,
0.02185136452317238,
0.15354816615581512,
0.06429044902324677,
0.010137155652046204,
-0.023354806005954742,
-0.11455096304416656,
-0.07674204558134079,
-0.028791263699531555,
0.032510749995708466,
0.1711813509464264,
-0.07065609097480774,
0.17291003465652466,
0.14641587436199188,
-0.05902004987001419,
-0.20745272934436798,
-0.00104529841337353,
0.027895038947463036,
-0.009799708612263203,
0.01052815094590187,
-0.18731199204921722,
0.08426815271377563,
-0.0027930939104408026,
-0.053573500365018845,
0.10236258804798126,
-0.16446185111999512,
-0.13778766989707947,
0.08314235508441925,
0.05050567165017128,
-0.1846107542514801,
-0.13593068718910217,
-0.0956527590751648,
-0.04017403721809387,
-0.16093777120113373,
0.094541534781456,
0.020600248128175735,
0.011286545544862747,
0.031222112476825714,
0.01729961298406124,
0.024437133222818375,
-0.04827499762177467,
0.1766505390405655,
-0.017875220626592636,
0.023337768390774727,
-0.09548802673816681,
-0.08213721215724945,
0.018314361572265625,
-0.05043570697307587,
0.07148107886314392,
-0.019170429557561874,
0.010796970687806606,
-0.10002472996711731,
-0.036158230155706406,
-0.04323146492242813,
0.01651306450366974,
-0.09900262951850891,
-0.08600834012031555,
-0.04710615426301956,
0.09495867788791656,
0.09545248001813889,
-0.02352539822459221,
-0.024527089670300484,
-0.07954668998718262,
0.057502321898937225,
0.20494122803211212,
0.18888618052005768,
0.0420912466943264,
-0.060861583799123764,
-0.0007815357530489564,
-0.01528065837919712,
0.04347316175699234,
-0.1974499374628067,
0.05885709822177887,
0.05645278841257095,
0.020958000794053078,
0.1056937575340271,
-0.019571878015995026,
-0.15761134028434753,
-0.07663667947053909,
0.06839237362146378,
-0.0623336136341095,
-0.20320051908493042,
0.01006594579666853,
0.06082194298505783,
-0.17550897598266602,
-0.03924595192074776,
0.047061655670404434,
-0.0033835566136986017,
-0.03930600732564926,
0.024218931794166565,
0.09521005302667618,
0.0040947855450212955,
0.0774473026394844,
0.06932910531759262,
0.08241982758045197,
-0.0995473861694336,
0.08199375867843628,
0.09614647924900055,
-0.07453403621912003,
0.028697360306978226,
0.10224969685077667,
-0.05786004289984703,
-0.03986092656850815,
0.03258426487445831,
0.08356176316738129,
0.026442093774676323,
-0.04274778440594673,
0.012539389543235302,
-0.0972883403301239,
0.06648266315460205,
0.10121845453977585,
0.029486022889614105,
0.017875563353300095,
0.0438971146941185,
0.046800848096609116,
-0.07319830358028412,
0.1247049942612648,
0.03236815705895424,
0.015036910772323608,
-0.039863456040620804,
-0.04264143109321594,
0.009134208783507347,
-0.032499782741069794,
-0.0050306725315749645,
-0.022684602066874504,
-0.08841663599014282,
-0.015251140110194683,
-0.13281330466270447,
-0.00814450066536665,
-0.06321382522583008,
0.014864852651953697,
0.028430765494704247,
-0.03241420537233353,
0.006226730532944202,
0.005371582228690386,
-0.07067935168743134,
-0.06628155708312988,
-0.013359855860471725,
0.09593217074871063,
-0.164642333984375,
0.025939282029867172,
0.08370168507099152,
-0.11173626035451889,
0.09951487928628922,
0.01100022904574871,
-0.007797855418175459,
0.02429228089749813,
-0.14975731074810028,
0.03672046959400177,
-0.03686909377574921,
0.00826497282832861,
0.022785359993577003,
-0.20199745893478394,
-0.00023884387337602675,
-0.034025534987449646,
-0.06921693682670593,
-0.007965313270688057,
-0.024753388017416,
-0.11103565245866776,
0.10458120703697205,
0.0002950492489617318,
-0.08280477672815323,
-0.029978938400745392,
0.031977053731679916,
0.07958480715751648,
-0.027453763410449028,
0.15224815905094147,
-0.013632738031446934,
0.06552384793758392,
-0.15841995179653168,
-0.011964929290115833,
-0.01016872376203537,
0.014539518393576145,
-0.03842446580529213,
-0.007748632226139307,
0.05015689507126808,
-0.015035315416753292,
0.1729361116886139,
-0.037935223430395126,
0.013986855745315552,
0.06666132807731628,
0.046996477991342545,
-0.03511105850338936,
0.09919431805610657,
0.05079413577914238,
0.01744815893471241,
0.008659602142870426,
0.011847072280943394,
-0.04212842136621475,
-0.038148507475852966,
-0.19158609211444855,
0.07039409875869751,
0.18760721385478973,
0.09802322834730148,
-0.02246714197099209,
0.07361312955617905,
-0.09906915575265884,
-0.0919000506401062,
0.15422968566417694,
-0.03650514408946037,
-0.007622900884598494,
-0.07365479320287704,
0.13077658414840698,
0.14880754053592682,
-0.18213480710983276,
0.06703998148441315,
-0.07155536860227585,
-0.042435649782419205,
-0.10888257622718811,
-0.19546253979206085,
-0.06245911121368408,
-0.04822525382041931,
-0.018525561317801476,
-0.047860633581876755,
0.06643376499414444,
0.05883597210049629,
0.0012075765989720821,
-0.007492885459214449,
0.06769093126058578,
-0.03296395391225815,
-0.0035727631766349077,
0.028557414188981056,
0.0605304129421711,
0.009092770516872406,
-0.03663676232099533,
0.017620893195271492,
-0.009870080277323723,
0.05314849689602852,
0.07527779787778854,
0.04976372793316841,
-0.02661348506808281,
0.018153073266148567,
-0.04167592152953148,
-0.10696607828140259,
0.049551915377378464,
-0.027341017499566078,
-0.07445981353521347,
0.15360838174819946,
0.020564887672662735,
0.004388691857457161,
-0.012038860470056534,
0.23854023218154907,
-0.06319069117307663,
-0.10509941726922989,
-0.14511722326278687,
0.07529381662607193,
-0.04123938828706741,
0.05031263083219528,
0.03968263044953346,
-0.10978218168020248,
0.026148788630962372,
0.1434495449066162,
0.1547095775604248,
-0.039126113057136536,
0.022912416607141495,
0.03374101594090462,
0.008956367149949074,
-0.023304572328925133,
0.03796272352337837,
0.06387558579444885,
0.1512088030576706,
-0.0479457788169384,
0.0797942504286766,
-0.0006020418368279934,
-0.08719385415315628,
-0.03671582415699959,
0.11214786022901535,
-0.009403618052601814,
0.0136125348508358,
-0.05995306745171547,
0.11950281262397766,
-0.07238344103097916,
-0.2178061157464981,
0.040135592222213745,
-0.07077405601739883,
-0.13199284672737122,
-0.02403392642736435,
0.07905852794647217,
-0.01147072110325098,
0.022253720089793205,
0.07827538251876831,
-0.07115045189857483,
0.18890751898288727,
0.038776300847530365,
-0.058570344001054764,
-0.05077022314071655,
0.07556261867284775,
-0.07555456459522247,
0.29114294052124023,
0.015565842390060425,
0.041523341089487076,
0.11074405163526535,
-0.015121901407837868,
-0.14093846082687378,
0.02415761910378933,
0.09602424502372742,
-0.09811476618051529,
0.05229158699512482,
0.18022818863391876,
0.0020607570186257362,
0.12836496531963348,
0.07837866246700287,
-0.0887366384267807,
0.04519768804311752,
-0.07400911301374435,
-0.07113581895828247,
-0.09759877622127533,
0.10631079226732254,
-0.0886058658361435,
0.144490584731102,
0.12301621586084366,
-0.055002361536026,
0.010657761245965958,
-0.03463399037718773,
0.04656387120485306,
-0.0036977268755435944,
0.12035106122493744,
0.009613994508981705,
-0.19130825996398926,
0.025861777365207672,
-0.026803046464920044,
0.10188493877649307,
-0.1675824373960495,
-0.08771826326847076,
0.04396218806505203,
0.011473724618554115,
-0.07196498662233353,
0.12621460855007172,
0.05924959108233452,
0.029668346047401428,
-0.047655340284109116,
-0.025236770510673523,
-0.010040738619863987,
0.1404636949300766,
-0.10325537621974945,
-0.003536330536007881
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent Trainer setup follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
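
For reference only, the hyperparameters above would typically map onto 🤗 `TrainingArguments` roughly as sketched below; the output directory, model, and dataset variables are placeholders and not part of this card. Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default, so it needs no explicit setting.

```python
# Sketch only: how the hyperparameters above would usually be expressed with the 🤗 Trainer.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="bert-finetuned-squad",   # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    fp16=True,                           # "Native AMP" mixed precision; requires a CUDA device
)

# trainer = Trainer(
#     model=model,                       # the bert-base-cased model being fine-tuned (placeholder)
#     args=training_args,
#     train_dataset=train_dataset,       # placeholder dataset objects
#     eval_dataset=eval_dataset,
# )
# trainer.train()
```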
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "bert-base-cased", "model-index": [{"name": "bert-finetuned-squad", "results": []}]} | question-answering | sophiayk/bert-finetuned-squad | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"question-answering",
"generated_from_trainer",
"base_model:bert-base-cased",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:32:45+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-finetuned-squad
This model is a fine-tuned version of bert-base-cased on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
60,
35,
6,
12,
8,
3,
103,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us \n# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.061459798365831375,
0.061447951942682266,
-0.0023345649242401123,
0.05422242730855942,
0.14727948606014252,
0.020580867305397987,
0.133071169257164,
0.1109636053442955,
-0.099789097905159,
0.0471339225769043,
0.04277966171503067,
0.027452314272522926,
0.04439014568924904,
0.10173879563808441,
-0.04047640040516853,
-0.2326717972755432,
0.0243214163929224,
0.015390816144645214,
-0.09670136868953705,
0.07859394699335098,
0.10860180854797363,
-0.12125761061906815,
0.04846896603703499,
0.009541328996419907,
-0.14627864956855774,
0.044559869915246964,
-0.017151489853858948,
-0.0479029081761837,
0.10530135035514832,
0.043245647102594376,
0.1464504599571228,
0.009714404121041298,
0.11924252659082413,
-0.23196730017662048,
0.013637912459671497,
0.08863382786512375,
0.0286929402500391,
0.06396042555570602,
0.057905957102775574,
0.024468891322612762,
0.06762804836034775,
-0.10307248681783676,
0.12413535267114639,
0.01943388395011425,
-0.07848954945802689,
-0.20726418495178223,
-0.08991456776857376,
0.021897481754422188,
0.10657299309968948,
0.06853597611188889,
-0.0014294445281848311,
0.13080278038978577,
-0.12465435266494751,
0.05454849824309349,
0.18297156691551208,
-0.2775663435459137,
-0.0742194801568985,
0.044039271771907806,
0.066988006234169,
0.05576687306165695,
-0.10548405349254608,
-0.02604071982204914,
0.04436066374182701,
0.045092348009347916,
0.09539152681827545,
-0.01563568040728569,
-0.1185605600476265,
0.012210484594106674,
-0.14206890761852264,
0.012168851681053638,
0.09484157711267471,
0.04601401463150978,
-0.04385574162006378,
-0.09959746152162552,
-0.032543595880270004,
-0.01876622810959816,
-0.04760074242949486,
-0.0338420532643795,
0.03985750675201416,
-0.032872460782527924,
-0.06725551187992096,
-0.05386665090918541,
-0.08165749907493591,
-0.08186745643615723,
-0.021592821925878525,
0.12031173706054688,
0.049023035913705826,
0.0020565874874591827,
-0.05340089648962021,
0.08596295863389969,
-0.04782279580831528,
-0.0972023606300354,
0.009081766940653324,
-0.0008378953789360821,
-0.09486114233732224,
-0.07832273095846176,
-0.0621207058429718,
-0.06785807758569717,
0.0348195806145668,
0.14466795325279236,
-0.03551895171403885,
0.08212119340896606,
-0.04245670139789581,
0.011188340373337269,
-0.023814590647816658,
0.11640627682209015,
-0.035820312798023224,
-0.01497437059879303,
-0.0031569628044962883,
0.09074471145868301,
0.003783012507483363,
0.0028517041355371475,
-0.0839226022362709,
0.00034435841371305287,
0.06287102401256561,
0.04945514351129532,
-0.05412912368774414,
0.03270968049764633,
-0.05242898687720299,
-0.014940381981432438,
-0.013021047227084637,
-0.11373132467269897,
0.050237782299518585,
0.005551122594624758,
-0.05133688449859619,
-0.03309698402881622,
0.013901998288929462,
0.02568003721535206,
0.0024044683668762445,
0.10671638697385788,
-0.06286582350730896,
0.001527723390609026,
-0.10509428381919861,
-0.10192393511533737,
0.026301341131329536,
-0.04589732736349106,
-0.0001480861392337829,
-0.07383490353822708,
-0.16255640983581543,
-0.028034863993525505,
0.0522027425467968,
-0.04300333186984062,
-0.009100627154111862,
-0.036343641579151154,
-0.03701746091246605,
0.012845677323639393,
-0.021096475422382355,
0.17049647867679596,
-0.05358185991644859,
0.06594842672348022,
-0.019995305687189102,
0.026813561096787453,
-0.013226971961557865,
0.041318267583847046,
-0.07363813370466232,
0.03031962923705578,
-0.16215889155864716,
0.05207372456789017,
-0.11184148490428925,
0.013519112020730972,
-0.12260060012340546,
-0.08264897763729095,
0.013123290613293648,
0.009542727842926979,
0.08599310368299484,
0.0929957926273346,
-0.16837601363658905,
-0.03492205590009689,
0.132857546210289,
-0.09205308556556702,
-0.06740245223045349,
0.10494925081729889,
-0.06968790292739868,
0.03958335891366005,
0.06577906012535095,
0.14707745611667633,
0.07223697751760483,
-0.14001241326332092,
0.028791893273591995,
-0.0025663531851023436,
0.09249944984912872,
0.0470452681183815,
0.04789413884282112,
-0.02686789073050022,
-0.035463228821754456,
0.000380219571525231,
-0.056678012013435364,
0.024655528366565704,
-0.08302492648363113,
-0.07098092138767242,
-0.03393018990755081,
-0.08446280658245087,
0.057038187980651855,
0.023242488503456116,
0.03703390434384346,
-0.08535196632146835,
-0.09979818761348724,
0.18490739166736603,
0.10960789024829865,
-0.06901982426643372,
0.015217185020446777,
-0.09181997179985046,
0.02399349957704544,
-0.029291432350873947,
-0.0160883329808712,
-0.1937323659658432,
-0.124361552298069,
0.029358062893152237,
-0.0499911904335022,
0.042770374566316605,
0.047171566635370255,
0.07468511909246445,
0.06069914251565933,
-0.07250207662582397,
-0.009682667441666126,
-0.08157332986593246,
0.01827055588364601,
-0.10930684208869934,
-0.21364574134349823,
-0.0561688095331192,
-0.04888840764760971,
0.14983327686786652,
-0.2371571958065033,
0.01744062267243862,
-0.0158857349306345,
0.14663065969944,
0.04917171224951744,
-0.028531232848763466,
-0.025619521737098694,
0.08447664231061935,
0.006524949800223112,
-0.07085540145635605,
0.0456293523311615,
0.0035106351133435965,
-0.10825884342193604,
-0.05158180370926857,
-0.1254018247127533,
0.05313687399029732,
0.08509916067123413,
0.010861638933420181,
-0.09549152106046677,
-0.06123816594481468,
-0.04289982467889786,
-0.03770140931010246,
-0.09153592586517334,
-0.006009100005030632,
0.19756795465946198,
0.008866924792528152,
0.13024085760116577,
-0.06384221464395523,
-0.04316077008843422,
-0.011104362085461617,
-0.003277657087892294,
0.0065968395210802555,
0.08201264590024948,
0.06290043890476227,
-0.10100670158863068,
0.08966144919395447,
0.1427372545003891,
-0.08809559792280197,
0.12802675366401672,
-0.06669536232948303,
-0.08382763713598251,
-0.0035146523732692003,
0.018052376806735992,
-0.0009520510211586952,
0.12291523069143295,
-0.11392845958471298,
0.01214613951742649,
0.020349163562059402,
0.024035392329096794,
0.033255793154239655,
-0.1866011917591095,
-0.0001348506921203807,
0.01857338659465313,
-0.031072042882442474,
-0.005707079544663429,
-0.034159861505031586,
0.04566868394613266,
0.0961286723613739,
0.026690978556871414,
-0.026047369465231895,
0.030917497351765633,
-0.017370129004120827,
-0.07888409495353699,
0.19535638391971588,
-0.134768545627594,
-0.10342710465192795,
-0.0994999036192894,
-0.01285966020077467,
-0.029272371903061867,
-0.025446409359574318,
0.0300839152187109,
-0.10529284924268723,
-0.06118984892964363,
-0.08911913633346558,
0.026114005595445633,
-0.015716243535280228,
-0.01027036365121603,
0.016980791464447975,
0.007922949269413948,
0.09388486295938492,
-0.13049335777759552,
0.014737911522388458,
-0.02808591164648533,
-0.10750939697027206,
0.006074283272027969,
0.05640389025211334,
0.07634573429822922,
0.12668631970882416,
-0.02135971561074257,
-0.012784563936293125,
-0.05513405054807663,
0.20797152817249298,
-0.05621897801756859,
-0.021660933271050453,
0.11368010938167572,
-0.02144048735499382,
0.040778279304504395,
0.14336541295051575,
0.056096289306879044,
-0.09655733406543732,
0.04084358364343643,
0.09027957916259766,
-0.007372742053121328,
-0.2477952539920807,
-0.035485442727804184,
-0.03706059232354164,
-0.09619531035423279,
0.09314313530921936,
0.04598979651927948,
-0.03920724615454674,
0.046859025955200195,
-0.010759986005723476,
0.04798964783549309,
0.004607341252267361,
0.08061698079109192,
0.13040615618228912,
0.026539716869592667,
0.10464926809072495,
-0.031242653727531433,
-0.03924533352255821,
0.06145051121711731,
0.01601589098572731,
0.2622612416744232,
0.0032494233455508947,
0.06713822484016418,
0.07083655148744583,
0.12690454721450806,
-0.006245773751288652,
0.015555835328996181,
-0.000842420500703156,
-0.019448664039373398,
-0.00855666771531105,
-0.06403692066669464,
-0.00960704404860735,
0.024987639859318733,
-0.02751791477203369,
0.03134535253047943,
-0.09673149138689041,
-0.04363711178302765,
0.01712709292769432,
0.259813129901886,
0.033559225499629974,
-0.24476152658462524,
-0.06446326524019241,
0.03345059975981712,
-0.054381150752305984,
-0.06443323194980621,
0.01147520449012518,
0.13037928938865662,
-0.1164151281118393,
0.05030885711312294,
-0.062406305223703384,
0.10151725262403488,
-0.004453735426068306,
0.0011111715575680137,
0.03996695950627327,
0.12747614085674286,
-0.012449676170945168,
0.07362329959869385,
-0.24941353499889374,
0.22383655607700348,
0.02319640852510929,
0.11503933370113373,
-0.046525727957487106,
0.03555026277899742,
0.026707086712121964,
0.06300769001245499,
0.0560712032020092,
-0.022611534222960472,
-0.07392343878746033,
-0.19212935864925385,
-0.03508710861206055,
0.059186238795518875,
0.10614626854658127,
-0.003535062773153186,
0.08273082971572876,
-0.03742257505655289,
0.018663646653294563,
0.0643555074930191,
-0.05351274833083153,
-0.19107525050640106,
-0.12351815402507782,
-0.016547195613384247,
0.020241545513272285,
-0.011910421773791313,
-0.11267317086458206,
-0.10366198420524597,
-0.051850952208042145,
0.16490136086940765,
0.01943778060376644,
-0.029776055365800858,
-0.12596632540225983,
0.0854724869132042,
0.1040196567773819,
-0.0347907580435276,
0.03230150043964386,
0.017272591590881348,
0.12235206365585327,
0.02306991070508957,
-0.08323754370212555,
0.08243215829133987,
-0.0786900743842125,
-0.15695762634277344,
-0.08197148889303207,
0.10315465182065964,
0.07422619313001633,
0.05436359718441963,
0.01108220312744379,
0.016497192904353142,
0.02324766479432583,
-0.07448835670948029,
-0.012275688350200653,
0.09931504726409912,
0.07955443859100342,
0.0918746367096901,
-0.10934382677078247,
-0.014584965072572231,
-0.018572811037302017,
-0.014820524491369724,
0.1233733668923378,
0.22609290480613708,
-0.07465904206037521,
0.0728081688284874,
0.10633830726146698,
-0.08155998587608337,
-0.1912233829498291,
0.10248123109340668,
0.1091802567243576,
0.014924203976988792,
0.0620192289352417,
-0.20946091413497925,
0.1343722641468048,
0.12071874737739563,
-0.023595836013555527,
0.02381988614797592,
-0.29926279187202454,
-0.12816034257411957,
0.1080908253788948,
0.1308019906282425,
0.029860712587833405,
-0.14037245512008667,
-0.029754674062132835,
-0.03576713800430298,
-0.1251954287290573,
0.12356105446815491,
-0.15127962827682495,
0.09242217242717743,
0.014962718822062016,
0.0752267837524414,
0.02469545230269432,
-0.03275773674249649,
0.14851243793964386,
-0.022257039323449135,
0.1015954315662384,
-0.0469214990735054,
0.052410323172807693,
0.052272725850343704,
-0.051470305770635605,
0.0027104844339191914,
-0.051835622638463974,
0.0386844128370285,
-0.1199895590543747,
-0.025339633226394653,
-0.06346026062965393,
0.058189574629068375,
-0.04288136959075928,
-0.06441446393728256,
-0.05519048497080803,
0.05625461786985397,
0.039457056671381,
-0.031044045463204384,
0.07836628705263138,
-0.018602069467306137,
0.15986160933971405,
0.05192641168832779,
0.14238160848617554,
-0.028629746288061142,
-0.08330649882555008,
0.016823383048176765,
-0.037374380975961685,
0.09144406765699387,
-0.11978122591972351,
0.040791310369968414,
0.11425328254699707,
0.02885865420103073,
0.15491695702075958,
0.04755830019712448,
-0.059327129274606705,
0.01985377073287964,
0.04295020550489426,
-0.08762049674987793,
-0.15524885058403015,
0.019345000386238098,
0.05994473397731781,
-0.14515921473503113,
0.01810130663216114,
0.11084803938865662,
-0.04176182299852371,
-0.018473336473107338,
-0.000029236023692646995,
0.011293904855847359,
-0.03950151801109314,
0.17660193145275116,
0.03044256381690502,
0.050335001200437546,
-0.08342214673757553,
0.10434763133525848,
0.08080239593982697,
-0.11423652619123459,
0.050117090344429016,
0.04423784092068672,
-0.06290752440690994,
-0.0224289707839489,
0.09095968306064606,
0.2181142121553421,
0.0026791319251060486,
-0.05532265082001686,
-0.07284814119338989,
-0.12529242038726807,
0.033083245158195496,
0.1457516849040985,
0.04752615466713905,
-0.021746931597590446,
-0.02143864706158638,
0.048404790461063385,
-0.10574515163898468,
0.07167717069387436,
0.02479260042309761,
0.07577353715896606,
-0.10183551907539368,
0.10504424571990967,
0.00980623159557581,
0.00805351510643959,
-0.017236854881048203,
0.02171151712536812,
-0.1222962886095047,
-0.00987266469746828,
-0.18757635354995728,
0.0020522947888821363,
-0.02818240597844124,
0.0168794896453619,
0.010995532386004925,
-0.04504646360874176,
-0.036076620221138,
0.030973490327596664,
-0.08395108580589294,
-0.031977470964193344,
0.022327955812215805,
0.07035724818706512,
-0.1359185129404068,
-0.015433784574270248,
0.02833874523639679,
-0.08806543052196503,
0.059240829199552536,
0.02517775446176529,
0.028964368626475334,
0.052969977259635925,
-0.17566116154193878,
-0.03748887777328491,
0.026944991201162338,
0.027665691450238228,
0.05950703099370003,
-0.08567380160093307,
-0.022085702046751976,
-0.016151318326592445,
0.05988408997654915,
0.015331972390413284,
0.05757523700594902,
-0.1131846234202385,
-0.040379103273153305,
-0.046796757727861404,
-0.0734470933675766,
-0.05157812312245369,
0.032347459346055984,
0.10132639855146408,
0.05063449591398239,
0.16778309643268585,
-0.1006341353058815,
0.033364713191986084,
-0.18055792152881622,
-0.02917565405368805,
-0.019195297732949257,
-0.022025134414434433,
-0.08018834888935089,
-0.043590616434812546,
0.06981530040502548,
-0.04999225586652756,
0.1256609410047531,
-0.020875349640846252,
0.09235033392906189,
0.045605119317770004,
-0.07377336919307709,
0.023352887481451035,
0.030738437548279762,
0.22314849495887756,
0.062051914632320404,
-0.01610133796930313,
0.044526077806949615,
-0.0056547923013567924,
0.047879233956336975,
0.08856824040412903,
0.17098933458328247,
0.18817758560180664,
-0.01588284969329834,
0.05176379159092903,
0.07569438964128494,
-0.09040294587612152,
-0.09791208058595657,
0.12116839736700058,
-0.005164319649338722,
0.10924898087978363,
-0.06450361758470535,
0.22765009105205536,
0.07122702151536942,
-0.17626233398914337,
0.04897518455982208,
-0.079558365046978,
-0.08922912180423737,
-0.11066397279500961,
-0.018690351396799088,
-0.08227329701185226,
-0.1427392214536667,
0.00919851753860712,
-0.12561637163162231,
0.02118326909840107,
0.09895186126232147,
0.014988322742283344,
0.02805721014738083,
0.14173580706119537,
-0.039292510598897934,
0.023229781538248062,
0.04869456589221954,
0.00500459223985672,
0.008690242655575275,
-0.07347477972507477,
-0.08363408595323563,
0.05529560148715973,
0.0012694550678133965,
0.04475992172956467,
-0.03477095440030098,
-0.0022524481173604727,
0.027730969712138176,
-0.006817549001425505,
-0.06992533802986145,
0.034952547401189804,
0.014332410879433155,
0.04505559802055359,
0.08219046145677567,
0.047078315168619156,
0.0041319685988128185,
-0.03671559691429138,
0.2745203673839569,
-0.07755859196186066,
-0.09643987566232681,
-0.14884765446186066,
0.22370687127113342,
-0.003403999609872699,
0.008373335003852844,
0.04428301006555557,
-0.10526137799024582,
-0.006543160416185856,
0.16100084781646729,
0.1284177154302597,
-0.07510718703269958,
-0.005658114794641733,
-0.005321638658642769,
-0.027412617579102516,
-0.08328115940093994,
0.13014888763427734,
0.0994640663266182,
0.01851866953074932,
-0.05985711142420769,
-0.03247898444533348,
0.002075428608804941,
-0.01568879559636116,
-0.0859164297580719,
0.040165092796087265,
0.026456035673618317,
-0.0037903201300650835,
-0.03225818648934364,
0.08521488308906555,
0.030575698241591454,
-0.20965440571308136,
0.03453199565410614,
-0.1313685029745102,
-0.16948415338993073,
-0.04227505996823311,
0.09327809512615204,
-0.013944268226623535,
0.04107850417494774,
-0.03446254879236221,
0.012208864092826843,
0.10743625462055206,
-0.02745700627565384,
-0.004946270026266575,
-0.14141987264156342,
0.12466928362846375,
-0.052271340042352676,
0.22071899473667145,
0.006029118318110704,
0.0742056742310524,
0.11567369103431702,
0.0317792184650898,
-0.09987157583236694,
0.050585098564624786,
0.0658205896615982,
-0.11023902893066406,
0.017630815505981445,
0.13049155473709106,
-0.055503349751234055,
0.12244829535484314,
0.04489539563655853,
-0.1413429230451584,
0.02666780725121498,
-0.08042153716087341,
-0.06361427158117294,
-0.07365655899047852,
0.010948935523629189,
-0.08452470600605011,
0.14192570745944977,
0.19942158460617065,
-0.025426695123314857,
0.0029907685238868,
-0.06473544239997864,
0.026517853140830994,
0.04611179232597351,
0.09800279140472412,
-0.04463860020041466,
-0.20993530750274658,
0.025476839393377304,
0.05283667892217636,
0.018994735553860664,
-0.2569529712200165,
-0.0952228531241417,
0.04606932774186134,
-0.020690133795142174,
-0.0612851046025753,
0.10012989491224289,
0.09256619960069656,
0.035076919943094254,
-0.04678044468164444,
-0.2243194729089737,
-0.0418829619884491,
0.14177238941192627,
-0.11487370729446411,
-0.04110102728009224
] |
null | null | ml-agents |
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Step 1: Find your model_id: slc48/ppo-SnowballTarget
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
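If you prefer to fetch the trained policy file programmatically instead of browsing the Hub UI, the snippet below (an illustrative addition, not part of the original card) downloads this repository with `huggingface_hub`:

```python
from huggingface_hub import snapshot_download

# Download the repository (including the exported .onnx policy) into a local folder
local_dir = snapshot_download(repo_id="slc48/ppo-SnowballTarget", local_dir="./ppo-SnowballTarget")
print(f"Model files downloaded to {local_dir}")
```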
| {"library_name": "ml-agents", "tags": ["SnowballTarget", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SnowballTarget"]} | reinforcement-learning | slc48/ppo-SnowballTarget | [
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] | 2024-02-12T11:32:48+00:00 | [] | [] | TAGS
#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us
|
# ppo Agent playing SnowballTarget
This is a trained model of a ppo agent playing SnowballTarget
using the Unity ML-Agents Library.
## Usage (with ML-Agents)
The Documentation: URL
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your
browser: URL
- A *longer tutorial* to understand how ML-Agents works:
URL
### Resume the training
### Watch your Agent play
You can watch your agent playing directly in your browser
1. If the environment is part of ML-Agents official environments, go to URL
2. Step 1: Find your model_id: slc48/ppo-SnowballTarget
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play
| [
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: slc48/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
"TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n",
"# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: slc48/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
50,
206
] | [
"passage: TAGS\n#ml-agents #tensorboard #onnx #SnowballTarget #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SnowballTarget #region-us \n# ppo Agent playing SnowballTarget\n This is a trained model of a ppo agent playing SnowballTarget\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: slc48/ppo-SnowballTarget\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play"
] | [
-0.04837195947766304,
0.06490949541330338,
-0.003432661760598421,
0.10945559293031693,
0.17772641777992249,
-0.012663372792303562,
0.1635570228099823,
0.09639621526002884,
0.13397426903247833,
0.07115299999713898,
0.0938170775771141,
0.0931617021560669,
0.057745132595300674,
0.1293790638446808,
0.0820702388882637,
-0.20873813331127167,
-0.046795547008514404,
-0.10698551684617996,
0.004426172003149986,
0.07925060391426086,
0.037609804421663284,
-0.03825750946998596,
0.029727403074502945,
0.04586951807141304,
-0.01697450876235962,
0.0008949954062700272,
-0.06544099748134613,
-0.034519948065280914,
0.06256689131259918,
-0.028857143595814705,
0.01743227243423462,
-0.05188261345028877,
0.11262554675340652,
-0.1651487946510315,
0.024554966017603874,
0.03870108723640442,
-0.011023025959730148,
-0.022032443434000015,
0.1398659199476242,
0.027804464101791382,
0.09751056879758835,
-0.1223103329539299,
0.09382011741399765,
0.0720333531498909,
-0.04782848432660103,
0.0031703184358775616,
-0.07446074485778809,
0.05793141946196556,
0.20630693435668945,
0.14413821697235107,
0.0017642624443396926,
0.07701148092746735,
-0.03604622930288315,
0.05926046893000603,
0.15682443976402283,
-0.27498480677604675,
-0.06526408344507217,
0.17745421826839447,
-0.054316651076078415,
0.03390716388821602,
-0.012757026590406895,
0.0470002107322216,
-0.019930237904191017,
0.0223805271089077,
-0.017929473891854286,
0.029624588787555695,
0.26738691329956055,
0.02704068459570408,
-0.09400635212659836,
-0.09218524396419525,
-0.01676248013973236,
0.02695733867585659,
-0.05273156985640526,
-0.18438585102558136,
0.006279003340750933,
0.10511137545108795,
0.0061796484515070915,
0.03764386475086212,
0.061613742262125015,
0.016805633902549744,
-0.0914018526673317,
-0.16076499223709106,
-0.03540968894958496,
-0.0709201991558075,
0.10492939502000809,
0.10736044496297836,
-0.03087068349123001,
-0.0010290861828252673,
0.04351989924907684,
0.08001308143138885,
0.12064779549837112,
-0.039983369410037994,
-0.035210296511650085,
-0.011973215267062187,
-0.15551304817199707,
-0.005830145440995693,
-0.03222871571779251,
-0.006388407200574875,
0.03562352433800697,
0.15303319692611694,
0.16262295842170715,
0.030362941324710846,
0.04318556562066078,
0.029281003400683403,
-0.0050465720705688,
0.1119031235575676,
0.04644215852022171,
-0.013867618516087532,
0.003636428853496909,
0.017178857699036598,
0.06783805787563324,
-0.0967952162027359,
-0.1057506650686264,
0.048952583223581314,
-0.051399145275354385,
0.1286279857158661,
0.1608460694551468,
-0.026187414303421974,
-0.008249130100011826,
-0.027967093512415886,
0.01183968037366867,
-0.15121003985404968,
0.08268710970878601,
0.057999469339847565,
-0.04808053746819496,
-0.0768611952662468,
-0.05666457489132881,
0.0464237742125988,
-0.08740589767694473,
0.03818710148334503,
0.0050009991973638535,
0.07618767768144608,
-0.005849871784448624,
-0.03401808813214302,
0.04950490593910217,
-0.12311410158872604,
-0.012774337083101273,
-0.15988820791244507,
-0.13586607575416565,
-0.08784088492393494,
0.0279264897108078,
-0.042243946343660355,
-0.11196179687976837,
-0.10470068454742432,
0.03388712555170059,
-0.0639280378818512,
0.023480238392949104,
-0.03530567139387131,
-0.06473559141159058,
-0.030199987813830376,
-0.11206205934286118,
0.05420200899243355,
0.15336520969867706,
0.000514897343236953,
-0.03219657391309738,
0.025447944179177284,
-0.15330006182193756,
0.16294072568416595,
-0.14366893470287323,
0.15763814747333527,
-0.07093828171491623,
0.036787066608667374,
0.1288612335920334,
-0.01861419528722763,
0.05715280771255493,
0.19707505404949188,
-0.10798878967761993,
-0.07081704586744308,
0.03582610562443733,
-0.090824656188488,
-0.10258297622203827,
0.05602000653743744,
0.01942923665046692,
0.032509684562683105,
0.06194135174155235,
0.20783731341362,
0.1019953191280365,
-0.2318621575832367,
0.03280241787433624,
0.020654328167438507,
-0.13268248736858368,
-0.01210389006882906,
0.12469732016324997,
-0.07512183487415314,
-0.010119328275322914,
-0.03708021342754364,
-0.13509301841259003,
0.09909168630838394,
-0.009726769290864468,
-0.06292295455932617,
0.023158978670835495,
-0.052709467709064484,
-0.04576078802347183,
-0.0010681862477213144,
0.0375831164419651,
-0.03786592558026314,
-0.049651455134153366,
-0.0075907320715487,
0.036539457738399506,
0.00022115689353086054,
0.07409777492284775,
-0.032255299389362335,
0.1313602179288864,
-0.01610906422138214,
0.007960733957588673,
-0.11200816929340363,
-0.1563463658094406,
-0.02084236964583397,
0.036475569009780884,
0.08061971515417099,
-0.0923428162932396,
0.10085518658161163,
0.08336322009563446,
0.048330605030059814,
-0.07369807362556458,
-0.06540843099355698,
0.009986928664147854,
-0.10714752227067947,
-0.10167336463928223,
-0.08078829199075699,
-0.055852729827165604,
0.11451154202222824,
-0.09188080579042435,
0.06784377247095108,
-0.06099151074886322,
0.09268193691968918,
-0.02499045617878437,
-0.08302583545446396,
0.04271720349788666,
-0.015810079872608185,
0.0388956181704998,
-0.09525435417890549,
0.10166218876838684,
0.06860717386007309,
-0.13607420027256012,
0.02362855337560177,
0.058608394116163254,
-0.07825595140457153,
0.12395235151052475,
0.04684408754110336,
-0.004228697624057531,
-0.05242856964468956,
-0.05667952820658684,
0.010743926279246807,
-0.07405214756727219,
0.01966378279030323,
0.21541012823581696,
0.12916797399520874,
0.0758504793047905,
-0.031208228319883347,
-0.0557861402630806,
-0.036133017390966415,
-0.055063631385564804,
-0.0546543262898922,
0.13543976843357086,
0.04574882984161377,
0.006113201379776001,
0.04497625306248665,
-0.011922849342226982,
0.08781535923480988,
0.11270426213741302,
-0.009961645118892193,
-0.12245183438062668,
0.01478891633450985,
0.05571615323424339,
0.055050455033779144,
0.012095711193978786,
0.059442322701215744,
-0.027249019593000412,
-0.015357641503214836,
-0.06805545091629028,
-0.017028411850333214,
-0.11534348875284195,
-0.06224387139081955,
0.06538800895214081,
-0.017095986753702164,
-0.006493515335023403,
-0.08584879338741302,
-0.04269358143210411,
0.03095414489507675,
0.09551719576120377,
-0.005748161114752293,
0.03974342718720436,
-0.03723079338669777,
-0.12977030873298645,
0.0457899272441864,
-0.09118686616420746,
-0.2253391444683075,
-0.12224804610013962,
-0.03478109836578369,
-0.07393918931484222,
0.02584906853735447,
0.06511299312114716,
-0.19036783277988434,
-0.0010331383673474193,
-0.0928429588675499,
-0.00462972791865468,
-0.008522260934114456,
-0.03066422790288925,
0.14247335493564606,
0.0949348732829094,
-0.025476697832345963,
-0.06725994497537613,
0.01141271460801363,
0.015317779965698719,
-0.0663287565112114,
-0.02022489532828331,
0.08850192278623581,
0.09862467646598816,
0.06738349050283432,
0.059573184698820114,
0.05608958378434181,
-0.021687068045139313,
0.1418416053056717,
-0.049058496952056885,
0.016736725345253944,
0.06917709857225418,
-0.011957084760069847,
0.07356003671884537,
0.008563966490328312,
0.033044133335351944,
-0.0008465400314889848,
0.0075211417861282825,
0.007195803336799145,
-0.08372931182384491,
-0.2101641297340393,
-0.07837420701980591,
0.001690364209935069,
0.1752123236656189,
0.15176083147525787,
0.09360306710004807,
-0.1299142837524414,
0.03020024113357067,
0.008332964032888412,
-0.11123427748680115,
0.10393902659416199,
0.12337760627269745,
-0.0640033632516861,
-0.021648088470101357,
0.03600167855620384,
-0.04626674950122833,
0.05501427501440048,
0.060216352343559265,
-0.053752075880765915,
0.08457475900650024,
0.016028940677642822,
-0.004519673530012369,
-0.032850176095962524,
-0.04284932091832161,
-0.05458466336131096,
0.13861402869224548,
0.07161276787519455,
0.024652861058712006,
0.02434123121201992,
-0.06401965767145157,
-0.09284020960330963,
0.1301584392786026,
0.15861161053180695,
-0.0644875019788742,
-0.053602997213602066,
0.11170439422130585,
0.053656332194805145,
0.19431672990322113,
0.00018999043095391244,
-0.12728483974933624,
-0.06656723469495773,
-0.0039069997146725655,
-0.1111859604716301,
0.004572732839733362,
0.03875002637505531,
0.0033679562620818615,
-0.1620202362537384,
0.051029935479164124,
0.007765849586576223,
0.1123562604188919,
0.016653697937726974,
-0.03352133929729462,
0.06316066533327103,
0.004781703930348158,
-0.027305573225021362,
0.04308425262570381,
-0.15044178068637848,
0.01920774206519127,
-0.002892922842875123,
0.08755307644605637,
-0.05922297388315201,
0.030888300389051437,
0.0878518670797348,
-0.04278332367539406,
0.16667161881923676,
0.04999572038650513,
-0.04535062611103058,
-0.13538114726543427,
-0.1785186380147934,
-0.05492927506566048,
-0.025200780481100082,
-0.11336749792098999,
0.07432419806718826,
0.03948485851287842,
-0.016796324402093887,
-0.1044088825583458,
0.010374500416219234,
-0.033439844846725464,
-0.1236405000090599,
-0.04588509351015091,
-0.07901794463396072,
0.05434705317020416,
-0.056146543473005295,
-0.07227982580661774,
-0.07106673717498779,
0.18111129105091095,
0.0778246745467186,
-0.10992623120546341,
-0.12235132604837418,
0.014464648440480232,
-0.05948404595255852,
-0.02536207064986229,
0.07295273244380951,
0.021163523197174072,
0.10921197384595871,
-0.10257016867399216,
-0.05731981247663498,
-0.0343993678689003,
-0.10185011476278305,
-0.09330639243125916,
0.03573302924633026,
0.17131288349628448,
0.040293704718351364,
0.09016585350036621,
-0.010930811055004597,
0.09785950928926468,
-0.005084719974547625,
-0.07563001662492752,
0.12045066058635712,
0.10706545412540436,
-0.019586650654673576,
0.0710168331861496,
0.03221868351101875,
0.06830170750617981,
-0.10657582432031631,
-0.011741213500499725,
0.21856571733951569,
0.2624669671058655,
-0.07091356068849564,
0.19444267451763153,
0.0034423319157212973,
-0.04879440739750862,
-0.17065371572971344,
-0.05047941952943802,
0.014226652681827545,
-0.03788626939058304,
0.10144823789596558,
-0.1957869976758957,
0.09266377985477448,
0.0036157805006951094,
-0.008937081322073936,
0.043412428349256516,
-0.14885011315345764,
-0.08441878110170364,
0.025879204273223877,
0.09536018967628479,
-0.054078325629234314,
-0.09800437837839127,
-0.08209902048110962,
0.005983532406389713,
-0.07518972456455231,
0.02391558326780796,
-0.10408705472946167,
0.05190525949001312,
0.015113240107893944,
0.05027838051319122,
0.06728948652744293,
-0.05774206295609474,
0.13419203460216522,
-0.028491508215665817,
-0.051864977926015854,
-0.07211019098758698,
0.0352817066013813,
-0.026720600202679634,
-0.09056244790554047,
0.0614357516169548,
-0.015665309503674507,
-0.016110384836792946,
-0.20527273416519165,
-0.0515974797308445,
0.029324287548661232,
0.03433994576334953,
-0.034574955701828,
-0.07204435020685196,
-0.03246242552995682,
0.0496789775788784,
0.08237787336111069,
0.025338495150208473,
0.12191972881555557,
-0.0016258276300504804,
-0.006747527047991753,
0.05902356654405594,
0.037264633923769,
0.0417807511985302,
-0.13792891800403595,
-0.060178063809871674,
-0.06741562485694885,
0.0031745927408337593,
-0.05038297921419144,
-0.021862590685486794,
0.057751987129449844,
0.06454936414957047,
-0.018284499645233154,
0.058315202593803406,
-0.07248244434595108,
-0.01720322109758854,
0.02360963076353073,
-0.09871149808168411,
-0.11641400307416916,
-0.08112195134162903,
-0.11323413997888565,
0.019344139844179153,
-0.08207801729440689,
0.08561033010482788,
-0.0511074922978878,
0.000006157732968858909,
0.008271072059869766,
0.03823301941156387,
-0.00837091263383627,
0.04109114408493042,
0.021848531439900398,
0.03555294871330261,
-0.0752454325556755,
0.12624616920948029,
0.0035422444343566895,
-0.04433935135602951,
0.049045540392398834,
0.18542705476284027,
-0.06090686470270157,
-0.06755232810974121,
-0.04653635993599892,
0.08202316612005234,
0.0368821881711483,
-0.02464759349822998,
-0.04531404748558998,
-0.05579663813114166,
0.12068244069814682,
-0.1543353945016861,
0.013988298363983631,
-0.11168906092643738,
0.009688456542789936,
0.05266693979501724,
-0.04947555437684059,
0.05897848680615425,
-0.02565385028719902,
-0.06126860901713371,
-0.14177274703979492,
0.08076289296150208,
0.020898211747407913,
0.09091001749038696,
-0.013857102952897549,
-0.01664838008582592,
-0.14640890061855316,
0.034539710730314255,
-0.009301334619522095,
0.00937731098383665,
-0.15767885744571686,
0.015189849771559238,
0.0005566618638113141,
0.027838382869958878,
0.03394407406449318,
0.06330730766057968,
-0.040923405438661575,
-0.09497308731079102,
-0.052823930978775024,
0.055362384766340256,
-0.08638421446084976,
-0.021761935204267502,
-0.029749643057584763,
-0.0810704380273819,
0.05714099481701851,
0.09330606460571289,
-0.019409652799367905,
-0.041999757289886475,
-0.06289536505937576,
0.017295289784669876,
-0.020952781662344933,
-0.04338812083005905,
0.043587930500507355,
-0.12142062187194824,
0.025315169245004654,
-0.05898028239607811,
-0.11530134826898575,
0.03817586600780487,
0.13681745529174805,
-0.06615148484706879,
0.04659741371870041,
0.05225390940904617,
-0.07888229936361313,
-0.07514043152332306,
-0.010574582032859325,
0.07609771192073822,
0.06265786290168762,
0.10804282873868942,
-0.08231362700462341,
0.1940915733575821,
-0.10705672204494476,
-0.029684627428650856,
0.011921378783881664,
0.07069443166255951,
0.03043452836573124,
-0.09037002176046371,
0.046142395585775375,
-0.013186276890337467,
0.06406433135271072,
0.07640320062637329,
0.006336098536849022,
0.04977619647979736,
0.05622098967432976,
0.15605923533439636,
0.015957653522491455,
0.07662102580070496,
-0.005027689039707184,
0.022150078788399696,
0.1214212104678154,
-0.003588109975680709,
0.06970521062612534,
-0.07578159868717194,
0.0698883906006813,
0.048542484641075134,
0.08432549238204956,
0.0799373984336853,
0.05958611145615578,
-0.09755848348140717,
-0.16737206280231476,
-0.05025918781757355,
0.02989552542567253,
0.032141152769327164,
-0.059225164353847504,
0.18841588497161865,
0.13016726076602936,
-0.19526679813861847,
0.017433786764740944,
0.0023673248942941427,
0.0352543406188488,
-0.08283191174268723,
-0.08520817756652832,
0.014503334648907185,
-0.13635382056236267,
0.09779627621173859,
-0.016965722665190697,
0.0007188064628280699,
-0.016041763126850128,
0.0017707310616970062,
0.030332189053297043,
0.04611389711499214,
-0.05492629110813141,
0.013342383317649364,
0.03955838456749916,
-0.02978416532278061,
0.006096012890338898,
-0.007257315330207348,
-0.09483347833156586,
-0.03224252164363861,
-0.06458204239606857,
-0.014412015676498413,
0.02123970352113247,
0.010901081375777721,
0.0660749226808548,
0.017614107578992844,
-0.05556448921561241,
0.07001074403524399,
0.0013671716442331672,
0.024516751989722252,
0.20561911165714264,
0.09751173108816147,
-0.04735312983393669,
-0.05325661227107048,
0.2074880301952362,
-0.044729698449373245,
-0.05504407733678818,
-0.08020878583192825,
0.10428486764431,
-0.05339634418487549,
-0.044722575694322586,
-0.04279856011271477,
-0.1688818633556366,
-0.05959653854370117,
0.15781940519809723,
0.12358317524194717,
-0.02764078415930271,
0.00850228127092123,
-0.06986373662948608,
0.0055330232717096806,
0.025965897366404533,
0.09719651192426682,
0.06755491346120834,
0.0647735670208931,
-0.09468752890825272,
-0.015430670231580734,
-0.07073669880628586,
-0.10097527503967285,
-0.19617915153503418,
0.04507774859666824,
0.01988121122121811,
-0.02363126166164875,
-0.018175894394516945,
0.119504414498806,
-0.10736411064863205,
-0.09442649036645889,
0.1272178292274475,
-0.03155955672264099,
-0.07037467509508133,
0.00816245749592781,
0.025720445439219475,
0.0007912382134236395,
0.11099740117788315,
0.08430522680282593,
0.04708769917488098,
0.02828478068113327,
-0.018822411075234413,
-0.10786639899015427,
0.02503473125398159,
0.04109792038798332,
-0.12389965355396271,
0.24101018905639648,
-0.019229037687182426,
0.00967972818762064,
0.09229805320501328,
0.07136010378599167,
-0.18840588629245758,
0.01118122972548008,
0.05127615109086037,
-0.18696530163288116,
0.01683090254664421,
0.08011209964752197,
-0.04710543155670166,
0.01567862369120121,
0.050772398710250854,
-0.0353865921497345,
0.007231499068439007,
0.18746773898601532,
0.039992693811655045,
-0.03943387418985367,
0.06889498233795166,
-0.15335078537464142,
0.10110267251729965,
0.09498310089111328,
-0.05553991347551346,
0.007690408732742071,
-0.033886685967445374,
0.007307842373847961,
-0.013647988438606262,
-0.01910986937582493,
-0.012482975609600544,
-0.11861138790845871,
-0.021159639582037926,
-0.056131962686777115,
0.024389130994677544,
-0.2146560251712799,
-0.1284102499485016,
-0.05253817141056061,
-0.075998455286026,
-0.04923783242702484,
0.08161662518978119,
0.07518652826547623,
-0.050319019705057144,
0.014762512408196926,
-0.12604838609695435,
0.032651543617248535,
0.15343046188354492,
-0.07129194587469101,
-0.010389594361186028
] |
null | null | transformers |
Megatron-Mx is a Mixture of Experts (MoE) made with Mistral models.
## 💻 Usage
```python
!pip install -qU transformers bitsandbytes accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Eurdem/Megatron-Mx"
tokenizer = AutoTokenizer.from_pretrained(model)
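# Build a text-generation pipeline, loading the weights in 4-bit via bitsandbytes (requires a CUDA GPU)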
pipeline = transformers.pipeline(
"text-generation",
model=model,
model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)
messages = [{"role": "user", "content": "Tell me about AI."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=512, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "apache-2.0", "tags": ["moe", "merge", "English", "Mixtral", "Mistral"], "base_model": ["Eurdem/Megatron-Slerp-1", "Eurdem/Megatron-Slerp-5"]} | text-generation | Eurdem/Megatron-Mx | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"moe",
"merge",
"English",
"Mixtral",
"Mistral",
"conversational",
"base_model:Eurdem/Megatron-Slerp-1",
"base_model:Eurdem/Megatron-Slerp-5",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:33:27+00:00 | [] | [] | TAGS
#transformers #safetensors #mixtral #text-generation #moe #merge #English #Mixtral #Mistral #conversational #base_model-Eurdem/Megatron-Slerp-1 #base_model-Eurdem/Megatron-Slerp-5 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
Megatron-Mx is a Mixture of Experts (MoE) made with Mistral models.
## Usage
| [
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #moe #merge #English #Mixtral #Mistral #conversational #base_model-Eurdem/Megatron-Slerp-1 #base_model-Eurdem/Megatron-Slerp-5 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"## Usage"
] | [
108,
3
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #moe #merge #English #Mixtral #Mistral #conversational #base_model-Eurdem/Megatron-Slerp-1 #base_model-Eurdem/Megatron-Slerp-5 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## Usage"
] | [
-0.028806794434785843,
-0.028095019981265068,
-0.0047660344280302525,
0.0026916631031781435,
0.09514334797859192,
0.0139055410400033,
0.18831217288970947,
0.050187014043331146,
-0.02192622981965542,
-0.004589904565364122,
0.14343757927417755,
0.08802304416894913,
-0.018714820966124535,
0.10540279746055603,
-0.06418386846780777,
-0.19887994229793549,
0.12458952516317368,
-0.06639270484447479,
-0.12375910580158234,
0.08571319282054901,
0.15084704756736755,
-0.05446130782365799,
0.08288942277431488,
-0.031941015273332596,
-0.07264410704374313,
0.04757671803236008,
0.019452018663287163,
-0.05589212849736214,
0.09696688503026962,
0.1008702889084816,
0.048616182059049606,
0.041116420179605484,
0.004404202103614807,
-0.1774238795042038,
0.04752304404973984,
-0.004348334856331348,
-0.03320496901869774,
0.02614215388894081,
-0.003626548917964101,
-0.06208289787173271,
0.10605868697166443,
-0.004321851301938295,
0.015767250210046768,
0.06556711345911026,
-0.07734108716249466,
-0.10749463737010956,
-0.06977908313274384,
0.07042351365089417,
0.07593394815921783,
0.05444958433508873,
-0.017588788643479347,
0.11034476011991501,
-0.06233106926083565,
0.09056387096643448,
0.100763700902462,
-0.3041025698184967,
-0.023042287677526474,
0.09800267219543457,
0.05928632244467735,
0.09397410601377487,
0.004339480772614479,
0.07620812952518463,
0.06602226197719574,
-0.004908308852463961,
0.01685136929154396,
-0.0523807555437088,
0.03622246906161308,
0.0389513336122036,
-0.12079751491546631,
0.011738777160644531,
0.25777900218963623,
0.0019355540862306952,
0.020110594108700752,
-0.07541455328464508,
-0.1005074679851532,
0.06560579687356949,
-0.02204124629497528,
-0.027950050309300423,
0.007498823571950197,
0.07459304481744766,
0.08750631660223007,
-0.07139161229133606,
-0.08071743696928024,
-0.03738018125295639,
-0.15203016996383667,
0.17906975746154785,
0.00889761932194233,
0.03910937160253525,
-0.043069884181022644,
0.02164398320019245,
-0.03921373561024666,
-0.11964336782693863,
0.03981058672070503,
-0.084986612200737,
0.021823974326252937,
0.034033603966236115,
-0.0558181069791317,
-0.08337654918432236,
0.166447713971138,
0.22617630660533905,
0.018909437581896782,
0.06436033546924591,
0.007740079890936613,
0.0841650739312172,
0.03661702573299408,
-0.005696315784007311,
-0.05883160978555679,
-0.1719733029603958,
0.07251712679862976,
0.0102789755910635,
0.11088115721940994,
-0.029008638113737106,
-0.12485251575708389,
0.033787913620471954,
-0.019518617540597916,
0.07093945890665054,
0.03828474134206772,
0.11405108869075775,
-0.04431142285466194,
0.011562472209334373,
0.10615149885416031,
-0.09195936471223831,
-0.0005118580884300172,
0.021142296493053436,
0.03865242749452591,
0.0815272405743599,
0.032128993421792984,
0.08222905546426773,
-0.00191106169950217,
0.030915750190615654,
-0.05922848358750343,
-0.040802307426929474,
-0.05890890583395958,
-0.08834822475910187,
0.06005213037133217,
0.05950860306620598,
-0.010147200897336006,
-0.11771424114704132,
-0.19881407916545868,
-0.006631431169807911,
0.06205719709396362,
-0.025960100814700127,
-0.05629400536417961,
-0.03822065144777298,
-0.01763765700161457,
0.04470101743936539,
-0.04269680380821228,
0.0066657280549407005,
-0.07065229117870331,
0.021152179688215256,
-0.02875369042158127,
0.0239107608795166,
-0.20668411254882812,
0.03642794489860535,
-0.09606911987066269,
0.03882494568824768,
-0.186587393283844,
0.025735745206475258,
-0.10599438101053238,
0.1385846734046936,
-0.05907481908798218,
0.023564884439110756,
-0.045644860714673996,
0.059200625866651535,
0.007681813556700945,
0.16332830488681793,
-0.12446258962154388,
-0.07999870926141739,
0.10581515729427338,
-0.17888395488262177,
-0.17486394941806793,
0.146247997879982,
0.00970496330410242,
0.015265914611518383,
0.08483818173408508,
0.14877623319625854,
0.02248498611152172,
-0.034935880452394485,
0.05714933201670647,
0.05026084557175636,
-0.05785118415951729,
-0.027765363454818726,
0.09922167658805847,
-0.04595058038830757,
-0.12465032190084457,
0.05596733093261719,
-0.0034691463224589825,
0.08723342418670654,
-0.016339706256985664,
-0.061512093991041183,
-0.06051700562238693,
-0.043258532881736755,
0.06709522753953934,
-0.023730291053652763,
0.007340816780924797,
-0.11160075664520264,
-0.044728219509124756,
0.09884285181760788,
0.06247308850288391,
-0.018173880875110626,
0.011508976109325886,
-0.06364601105451584,
0.09343031048774719,
-0.003068177495151758,
0.062469109892845154,
-0.1251927614212036,
-0.07867967337369919,
-0.046245839446783066,
0.052293241024017334,
0.04561392590403557,
0.04142078757286072,
0.07164178788661957,
0.012422815896570683,
-0.05591044947504997,
0.013755089603364468,
0.14112251996994019,
0.051520541310310364,
-0.04335760697722435,
-0.2094031721353531,
0.04338168725371361,
-0.06180209293961525,
0.15008817613124847,
-0.09327658265829086,
0.06962883472442627,
0.04567103832960129,
0.13418260216712952,
-0.04168810695409775,
0.06062740087509155,
-0.028778472915291786,
0.009298823773860931,
-0.05936714634299278,
0.008828896097838879,
0.09250524640083313,
0.009939832612872124,
-0.1432126760482788,
0.16126854717731476,
-0.16697196662425995,
0.2343592792749405,
0.1870594024658203,
-0.09539121389389038,
0.034579940140247345,
-0.10152850300073624,
0.019034378230571747,
-0.03053518570959568,
0.04588981345295906,
-0.10182129591703415,
0.0414089635014534,
0.013807029463350773,
0.11585530638694763,
-0.06309866160154343,
-0.027289249002933502,
-0.026928650215268135,
-0.05005025863647461,
-0.0522940419614315,
0.08385756611824036,
0.06368930637836456,
-0.19907037913799286,
0.18012525141239166,
0.2789163887500763,
-0.028945231810212135,
0.153818741440773,
-0.05802210420370102,
0.026338020339608192,
0.0018039837013930082,
-0.012764643877744675,
-0.045274145901203156,
0.013734360225498676,
-0.13343343138694763,
-0.0030857541132718325,
0.054229624569416046,
0.045360639691352844,
0.059412501752376556,
-0.07951655983924866,
-0.003953568171709776,
-0.0004968937137164176,
-0.03046536259353161,
0.03748394176363945,
0.04976533353328705,
-0.019072458148002625,
0.09478720277547836,
-0.03331567719578743,
-0.041050463914871216,
0.05586788058280945,
-0.016267716884613037,
-0.0973791852593422,
0.14798398315906525,
-0.19556055963039398,
-0.16219200193881989,
-0.15073099732398987,
-0.03405018150806427,
-0.11413505673408508,
-0.005034841597080231,
0.09579036384820938,
-0.03391759470105171,
-0.05757822096347809,
-0.09031085669994354,
0.07163700461387634,
0.011537278071045876,
-0.01797962747514248,
0.011411376297473907,
-0.0025240161921828985,
-0.021735459566116333,
-0.1293470412492752,
-0.04826084524393082,
0.023445909842848778,
-0.011474674567580223,
0.045592300593853,
-0.0604635626077652,
0.08491502702236176,
0.13355600833892822,
0.028531894087791443,
-0.006658927071839571,
-0.014321768656373024,
0.17030946910381317,
-0.026657288894057274,
0.04449107125401497,
0.18390725553035736,
-0.0386350117623806,
0.06850827485322952,
0.1961308717727661,
0.02703999914228916,
-0.03541351482272148,
0.040978364646434784,
-0.017691172659397125,
-0.04603004828095436,
-0.2094232141971588,
-0.13632343709468842,
-0.07707799971103668,
0.032820165157318115,
-0.04453762248158455,
0.06449243426322937,
0.013849357143044472,
0.06562749296426773,
-0.05658537894487381,
-0.08556473255157471,
0.10625643283128738,
0.047806475311517715,
0.27907004952430725,
-0.04472934082150459,
0.14160436391830444,
-0.08094195276498795,
-0.06245986372232437,
0.1139904037117958,
0.009608716703951359,
0.05567415803670883,
0.11014866828918457,
0.11760541051626205,
0.06040383502840996,
0.04575083404779434,
0.06167208030819893,
0.09431451559066772,
0.0006438711425289512,
-0.03012414462864399,
-0.05810263007879257,
-0.06306436657905579,
-0.038950033485889435,
0.06265523284673691,
-0.008831859566271305,
-0.007611431647092104,
0.0065635503269732,
0.011991764418780804,
0.15071140229701996,
0.1939696967601776,
-0.00797028187662363,
-0.1840479075908661,
-0.016675351187586784,
0.11699063330888748,
-0.003344601020216942,
-0.00511877192184329,
0.06429292261600494,
0.06608470529317856,
-0.056811559945344925,
0.14681625366210938,
-0.005395112559199333,
0.10239243507385254,
0.060843415558338165,
0.046834465116262436,
-0.04737040400505066,
0.017097104340791702,
0.0034257909283041954,
0.05848362669348717,
-0.22073012590408325,
0.16639238595962524,
0.009245445020496845,
-0.03344987332820892,
-0.028614262118935585,
0.046098917722702026,
0.0707838162779808,
0.2379874289035797,
0.03380022943019867,
0.003784717759117484,
-0.17132514715194702,
0.021822048351168633,
-0.04795551672577858,
0.047392338514328,
0.03540598973631859,
0.0001516419870313257,
0.007938445545732975,
-0.09039951115846634,
-0.012520598247647285,
0.030900785699486732,
0.06940680742263794,
-0.153489887714386,
-0.161546528339386,
0.022071437910199165,
0.08858226239681244,
0.05669286474585533,
-0.080130435526371,
-0.03642553091049194,
-0.09621483087539673,
0.23038652539253235,
-0.10769395530223846,
-0.0710357204079628,
-0.1311480551958084,
-0.08121045678853989,
0.057761721312999725,
-0.03168124705553055,
0.03257104009389877,
-0.05505523830652237,
0.06299271434545517,
-0.07391560822725296,
-0.14355824887752533,
0.13229237496852875,
-0.1040005311369896,
-0.08536814898252487,
-0.04168786481022835,
0.12617136538028717,
-0.0780545249581337,
0.02118021994829178,
-0.0020739156752824783,
0.031043052673339844,
-0.06164935231208801,
-0.12761946022510529,
-0.025768615305423737,
0.0917370617389679,
0.01865747757256031,
0.05627359077334404,
-0.09310474991798401,
-0.17549872398376465,
0.04931557551026344,
-0.05264906957745552,
0.21716387569904327,
0.2928609549999237,
-0.029807403683662415,
0.09168380498886108,
0.22956931591033936,
-0.06480942666530609,
-0.3130433261394501,
-0.10468973219394684,
-0.12235824018716812,
-0.03353618457913399,
0.022063344717025757,
-0.06256230920553207,
0.10249793529510498,
0.12361569702625275,
-0.049321260303258896,
0.002321124542504549,
-0.23849457502365112,
-0.14249368011951447,
0.139729842543602,
0.015785688534379005,
0.2820395827293396,
-0.18477606773376465,
-0.08515246212482452,
-0.15445482730865479,
-0.15984682738780975,
0.04716340824961662,
-0.18636971712112427,
0.038628850132226944,
0.038494814187288284,
-0.030717622488737106,
-0.004662260413169861,
-0.03757012262940407,
0.14938054978847504,
-0.064344622194767,
0.05925111845135689,
-0.10213883221149445,
-0.028136279433965683,
0.10558411478996277,
0.003011544467881322,
0.07334735244512558,
-0.2337948977947235,
0.010706018656492233,
-0.02141467109322548,
-0.024267639964818954,
-0.005873768124729395,
0.06180751696228981,
-0.01156326849013567,
-0.045091524720191956,
-0.038688622415065765,
-0.013622044585645199,
-0.00871309544891119,
-0.0039453343488276005,
0.19524623453617096,
-0.08494406938552856,
0.07976353168487549,
0.1591891050338745,
0.13387426733970642,
-0.17105643451213837,
0.05709924176335335,
-0.012058054096996784,
-0.07536061108112335,
0.06620126217603683,
-0.12717163562774658,
0.05098309367895126,
0.08588366955518723,
-0.05609118565917015,
0.09256825596094131,
0.06008659303188324,
0.010396079160273075,
-0.000018967233700095676,
0.11894886195659637,
-0.1608288735151291,
-0.15270335972309113,
-0.00681699812412262,
0.06033705919981003,
-0.006553347688168287,
0.0667477697134018,
0.1579311043024063,
-0.005038795061409473,
-0.005515535827726126,
0.01301480270922184,
0.03401241824030876,
-0.09037957340478897,
0.08746068179607391,
0.005962344352155924,
0.03430541232228279,
-0.12003517150878906,
0.1161876991391182,
0.005940107628703117,
-0.09518615901470184,
-0.006271216552704573,
0.12132368236780167,
-0.1129036620259285,
-0.12804323434829712,
-0.003271889640018344,
0.1606268286705017,
-0.03251788765192032,
-0.09704181551933289,
-0.08064986020326614,
-0.20402295887470245,
0.015463633462786674,
0.1153428927063942,
0.09692644327878952,
0.03909847512841225,
0.020046742632985115,
-0.023349817842245102,
0.04530403017997742,
0.08283811062574387,
-0.00615811999887228,
0.02594800665974617,
-0.08283092081546783,
-0.03352666646242142,
-0.02000611089169979,
-0.0019116245675832033,
-0.06438126415014267,
0.014094042591750622,
-0.129783034324646,
-0.016316939145326614,
-0.1943545639514923,
-0.05725720897316933,
-0.10318052768707275,
-0.023694349452853203,
0.03964453935623169,
-0.07985084503889084,
-0.039642490446567535,
-0.013705156743526459,
-0.0646367222070694,
-0.0252511166036129,
-0.003390850266441703,
0.07691062241792679,
-0.11878636479377747,
-0.0028766626492142677,
0.08838362991809845,
-0.03686046600341797,
0.10052016377449036,
0.07719922065734863,
-0.048374880105257034,
0.05289288982748985,
-0.22047539055347443,
-0.05850822478532791,
0.08631236851215363,
0.03077050857245922,
0.006087605841457844,
-0.04736628755927086,
-0.036872249096632004,
0.09614105522632599,
0.05030837655067444,
0.043408144265413284,
0.0649312287569046,
-0.08400561660528183,
0.04930484667420387,
-0.03365673869848251,
-0.1285116821527481,
-0.010814594104886055,
-0.10021702200174332,
0.1136329397559166,
0.024041686207056046,
0.19829899072647095,
-0.0884334146976471,
-0.011833949945867062,
-0.10374035686254501,
0.05625326558947563,
0.012987800873816013,
-0.16292329132556915,
-0.1273421198129654,
-0.06332781165838242,
-0.019823135808110237,
-0.004471758846193552,
0.23411165177822113,
-0.031980544328689575,
-0.11801537871360779,
0.10646602511405945,
0.0505317747592926,
0.05231039971113205,
0.07954518496990204,
0.20433296263217926,
0.06267033517360687,
0.008642621338367462,
-0.11613535135984421,
0.019446203485131264,
0.0578102208673954,
-0.07464440166950226,
0.08471612632274628,
0.13279440999031067,
-0.0022808059584349394,
0.09882175922393799,
0.061868201941251755,
0.024524951353669167,
0.00007530884613515809,
-0.010001561604440212,
-0.017479145899415016,
0.047990765422582626,
-0.06135370954871178,
0.19225062429904938,
0.18857194483280182,
-0.06919502466917038,
0.00036931803333573043,
-0.06443566083908081,
-0.009183764457702637,
-0.1448778659105301,
-0.0701352059841156,
-0.10267234593629837,
-0.1799153983592987,
-0.04197802022099495,
-0.09930963814258575,
-0.010566758923232555,
0.050165604799985886,
0.048945099115371704,
-0.0213210079818964,
0.055159326642751694,
-0.07865991443395615,
-0.031118081882596016,
-0.005645266268402338,
-0.04297732561826706,
-0.0112835131585598,
-0.026403134688735008,
-0.09194837510585785,
-0.006843533832579851,
-0.016124302521348,
-0.023562200367450714,
0.0750376433134079,
-0.04612872004508972,
0.04185092821717262,
-0.08944831788539886,
-0.08547040075063705,
-0.04338163882493973,
0.0398528091609478,
-0.012064428068697453,
0.06742600351572037,
0.03628581762313843,
-0.020306671038269997,
0.0609271302819252,
0.14508937299251556,
-0.05376330018043518,
-0.18285682797431946,
-0.10085500031709671,
0.07765238732099533,
-0.008913345634937286,
0.1307407021522522,
-0.026289356872439384,
-0.05581139773130417,
-0.049433041363954544,
0.1836099922657013,
0.3402366042137146,
-0.09214939177036285,
0.021759985014796257,
-0.03385788947343826,
0.028325054794549942,
0.015530436299741268,
0.055608004331588745,
0.06541381776332855,
0.1183362826704979,
-0.04772096127271652,
0.060030657798051834,
-0.03912912309169769,
-0.04817689582705498,
-0.1365155428647995,
0.028522420674562454,
-0.005963290575891733,
-0.051678191870450974,
0.0047473968006670475,
0.09750042855739594,
-0.0588117390871048,
0.04213536158204079,
-0.09163661301136017,
-0.10560492426156998,
-0.05867152661085129,
-0.03143756464123726,
0.15769845247268677,
0.028136229142546654,
0.03008299507200718,
-0.022454245015978813,
-0.054021600633859634,
0.06087205186486244,
-0.01921854354441166,
-0.1486910879611969,
-0.08375948667526245,
0.07423804700374603,
-0.029621528461575508,
0.09910684823989868,
0.007516638841480017,
0.04434606805443764,
0.11216796934604645,
0.0038713482208549976,
-0.06790493428707123,
0.08156438171863556,
0.03033696673810482,
-0.06809952855110168,
0.016522420570254326,
-0.08555908501148224,
-0.0035078309010714293,
0.0855090320110321,
0.045125238597393036,
-0.0801517516374588,
0.06028695032000542,
0.026622848585247993,
-0.10104665160179138,
-0.019509535282850266,
0.07393084466457367,
-0.08601125329732895,
0.09545841068029404,
0.053774744272232056,
-0.018886983394622803,
0.006232333369553089,
-0.03649807348847389,
0.033297620713710785,
0.016448792070150375,
0.01884959638118744,
-0.030743137001991272,
-0.18323253095149994,
-0.019591912627220154,
0.07252547144889832,
0.03208620101213455,
-0.1909235566854477,
-0.07286157459020615,
-0.14575797319412231,
0.08040229231119156,
-0.10172580182552338,
0.04514170438051224,
0.19167298078536987,
-0.010003793053328991,
-0.018693087622523308,
-0.22957012057304382,
0.01932467147707939,
0.08405274897813797,
-0.05319822579622269,
-0.07232565432786942
] |
null | null | transformers |
# NMTOB-7B
NMTOB-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [Kukedlc/NeuTrixOmniBe-7B-model-remix](https://huggingface.co/Kukedlc/NeuTrixOmniBe-7B-model-remix)
* [paulml/OmniBeagleSquaredMBX-v3-7B-v2](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B-v2)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: Kukedlc/NeuTrixOmniBe-7B-model-remix
        layer_range: [0, 32]
      - model: paulml/OmniBeagleSquaredMBX-v3-7B-v2
        layer_range: [0, 32]
merge_method: slerp
base_model: Kukedlc/NeuTrixOmniBe-7B-model-remix
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
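With `merge_method: slerp`, each pair of corresponding weight tensors is interpolated along the unit sphere rather than linearly, and the interpolation factor `t` varies across layer groups (with separate schedules for the self-attention and MLP weights, per the `filter` entries above). Below is a minimal, self-contained sketch of the idea in PyTorch — not mergekit's actual implementation, and the tensors are random stand-ins:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors (illustrative only)."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    omega = torch.arccos(torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0))
    if omega.abs() < eps:  # nearly parallel vectors: fall back to plain linear interpolation
        mixed = (1.0 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        mixed = (torch.sin((1.0 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)

# t near 0 keeps the base model's weights, t near 1 keeps the other model's weights.
w_base = torch.randn(4096, 4096)    # stand-in for a layer from Kukedlc/NeuTrixOmniBe-7B-model-remix
w_other = torch.randn(4096, 4096)   # stand-in for the matching layer from paulml/OmniBeagleSquaredMBX-v3-7B-v2
merged = slerp(0.5, w_base, w_other)
```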
## 💻 Usage
```python
# Install dependencies (notebook-style command).
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulml/NMTOB-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the model's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Build a text-generation pipeline; device_map="auto" spreads the model across available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit", "Kukedlc/NeuTrixOmniBe-7B-model-remix", "paulml/OmniBeagleSquaredMBX-v3-7B-v2"], "base_model": ["Kukedlc/NeuTrixOmniBe-7B-model-remix", "paulml/OmniBeagleSquaredMBX-v3-7B-v2"]} | text-generation | paulml/NMTOB-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"Kukedlc/NeuTrixOmniBe-7B-model-remix",
"paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"base_model:Kukedlc/NeuTrixOmniBe-7B-model-remix",
"base_model:paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:34:09+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #Kukedlc/NeuTrixOmniBe-7B-model-remix #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# NMTOB-7B
NMTOB-7B is a merge of the following models using LazyMergekit:
* Kukedlc/NeuTrixOmniBe-7B-model-remix
* paulml/OmniBeagleSquaredMBX-v3-7B-v2
## Configuration
## Usage
| [
"# NMTOB-7B\n\nNMTOB-7B is a merge of the following models using LazyMergekit:\n* Kukedlc/NeuTrixOmniBe-7B-model-remix\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #Kukedlc/NeuTrixOmniBe-7B-model-remix #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# NMTOB-7B\n\nNMTOB-7B is a merge of the following models using LazyMergekit:\n* Kukedlc/NeuTrixOmniBe-7B-model-remix\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"## Configuration",
"## Usage"
] | [
161,
66,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #Kukedlc/NeuTrixOmniBe-7B-model-remix #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# NMTOB-7B\n\nNMTOB-7B is a merge of the following models using LazyMergekit:\n* Kukedlc/NeuTrixOmniBe-7B-model-remix\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2## Configuration## Usage"
] | [
-0.04288113862276077,
0.04572850093245506,
-0.006471363361924887,
-0.012580575421452522,
0.04861200228333473,
0.04672292247414589,
0.14749334752559662,
0.1078682616353035,
0.025585496798157692,
0.06643728166818619,
0.05317264050245285,
0.15339775383472443,
0.042608972638845444,
0.14101654291152954,
-0.06457356363534927,
-0.1386926919221878,
0.06365476548671722,
0.013236005790531635,
-0.031353600323200226,
0.08982948213815689,
0.10250286757946014,
-0.0449763685464859,
0.09963095933198929,
0.008387165144085884,
-0.09319332242012024,
0.016987405717372894,
0.010479817166924477,
-0.031005406752228737,
0.04911063611507416,
0.07857213914394379,
0.030405910685658455,
0.07300765067338943,
-0.042799197137355804,
-0.18089495599269867,
0.030139276757836342,
0.02902589924633503,
-0.04293324053287506,
0.05586334317922592,
0.09163560718297958,
-0.05983109027147293,
0.11612515151500702,
-0.11589933186769485,
-0.009285541251301765,
0.07192695140838623,
-0.10561221092939377,
-0.02309231273829937,
-0.12473327666521072,
0.12000032514333725,
0.07482416182756424,
0.006103862542659044,
-0.011678287759423256,
0.11690855771303177,
-0.012431557290256023,
0.10619637370109558,
0.23370955884456635,
-0.3210948407649994,
-0.028518855571746826,
0.12226136773824692,
0.08842505514621735,
-0.03103632479906082,
0.010036077350378036,
0.06486514210700989,
0.0002719048934523016,
-0.012317555025219917,
0.017951905727386475,
-0.08950287848711014,
0.07954394072294235,
-0.053671304136514664,
-0.11752202361822128,
-0.0151435611769557,
0.13056504726409912,
0.033750925213098526,
-0.02290518768131733,
-0.09627979248762131,
-0.06627796590328217,
0.08081812411546707,
-0.04252520948648453,
-0.03398269787430763,
-0.000349947193171829,
-0.048658587038517,
-0.018646176904439926,
-0.05961432680487633,
-0.016882536932826042,
-0.015034002251923084,
-0.05850910767912865,
0.112567238509655,
0.013796274550259113,
0.006286063697189093,
-0.03386552259325981,
0.05700378492474556,
-0.11110134422779083,
-0.10088749974966049,
-0.023980626836419106,
-0.04335508495569229,
-0.008955986239016056,
-0.03049006313085556,
-0.03593160957098007,
-0.1262427717447281,
0.10418252646923065,
0.21770429611206055,
-0.06639890372753143,
0.04633721336722374,
0.022584639489650726,
0.04421873763203621,
-0.0018762312829494476,
-0.030199678614735603,
-0.08938714116811752,
-0.21467334032058716,
0.07368446886539459,
0.048876676708459854,
0.045275889337062836,
0.002725865226238966,
-0.0653514415025711,
-0.08284176886081696,
0.0498785525560379,
0.056554440408945084,
0.10207574814558029,
0.08816290646791458,
-0.05993884429335594,
-0.06296432763338089,
0.17697232961654663,
-0.07878443598747253,
0.029339689761400223,
-0.014466225169599056,
-0.0036893419455736876,
-0.0236011054366827,
0.07570608705282211,
0.030330421403050423,
-0.0474395677447319,
0.04618221893906593,
-0.057198043912649155,
-0.021099509671330452,
-0.016879383474588394,
-0.10061810165643692,
0.031085874885320663,
-0.07600314170122147,
-0.03937321528792381,
-0.10753708332777023,
-0.1505509465932846,
-0.018591415137052536,
0.01959160529077053,
-0.027194242924451828,
-0.019514616578817368,
-0.03960279002785683,
0.013763254508376122,
0.011595848016440868,
0.008381334133446217,
0.03382221236824989,
0.005851492751389742,
0.0025243202690035105,
0.0032387978862971067,
0.0679372176527977,
-0.07665570825338364,
0.012245009653270245,
-0.05363566055893898,
0.09737216681241989,
-0.1674329787492752,
0.05433769151568413,
-0.06195366010069847,
0.007330470252782106,
-0.1484299600124359,
-0.008722285740077496,
-0.02099348045885563,
-0.00792174693197012,
0.0675402358174324,
0.12557074427604675,
-0.08069627732038498,
-0.07994091510772705,
0.14600801467895508,
-0.08232375979423523,
-0.13199058175086975,
0.07581595331430435,
0.03455384448170662,
0.01764272339642048,
0.048960164189338684,
0.13533423840999603,
0.13723807036876678,
-0.03942784294486046,
-0.0846574530005455,
-0.0073232995346188545,
-0.004733341280370951,
0.021380970254540443,
0.03454728424549103,
-0.028305986896157265,
-0.04886678233742714,
0.0356484055519104,
0.04363041743636131,
0.03904477134346962,
-0.0306034367531538,
-0.04222840070724487,
-0.0634014829993248,
-0.06885410100221634,
0.10329694300889969,
-0.08001848310232162,
0.001300230622291565,
-0.06468284130096436,
-0.03888474404811859,
0.03355167433619499,
0.08599438518285751,
-0.024035586044192314,
0.02956535667181015,
-0.08146605640649796,
0.11479366570711136,
-0.08968908339738846,
0.0511246956884861,
-0.11492279917001724,
-0.04637736827135086,
0.01635102368891239,
-0.054673124104738235,
0.019775139167904854,
-0.044317577034235,
0.0668843537569046,
0.019535712897777557,
-0.06783118844032288,
-0.04386411979794502,
0.09320297837257385,
0.016049455851316452,
-0.04233015328645706,
-0.12420715391635895,
-0.05864628031849861,
-0.03570111468434334,
0.10426357388496399,
-0.0791771337389946,
0.035798218101263046,
0.030507121235132217,
0.19017787277698517,
0.019725581631064415,
-0.022072728723287582,
0.04632030799984932,
0.035801488906145096,
-0.02885224111378193,
-0.0009371282067149878,
0.06932393461465836,
-0.01814117655158043,
-0.10851003974676132,
0.11905108392238617,
-0.12622670829296112,
0.05385466292500496,
0.08749625831842422,
0.035989515483379364,
-0.014882617630064487,
-0.03170330077409744,
-0.006091732997447252,
-0.05637369304895401,
0.08429335802793503,
-0.04873640835285187,
0.02252683974802494,
0.03689747303724289,
0.09890017658472061,
-0.07167908549308777,
-0.038795214146375656,
0.01991560496389866,
-0.03699763864278793,
-0.056213561445474625,
0.07174070179462433,
-0.007183864247053862,
-0.223517045378685,
0.07276748865842819,
0.19912885129451752,
0.01004247646778822,
0.14107677340507507,
0.03402261435985565,
-0.006584935821592808,
-0.08327429741621017,
0.017258618026971817,
0.05349937826395035,
-0.0394156351685524,
-0.06802766770124435,
0.032424185425043106,
0.06393634527921677,
0.01211792416870594,
0.05372924357652664,
-0.04775916039943695,
0.032833170145750046,
-0.00067359977401793,
-0.020241793245077133,
0.09150524437427521,
0.12118800729513168,
0.013984774239361286,
0.06039969623088837,
-0.007231741212308407,
0.018331779167056084,
0.022700948640704155,
-0.013383704237639904,
-0.09085454046726227,
0.1489020138978958,
-0.1462138295173645,
-0.2504824995994568,
-0.14861701428890228,
-0.07647167146205902,
-0.133632630109787,
-0.03740590065717697,
0.04400079324841499,
-0.0035122395493090153,
-0.034490812569856644,
-0.09953973442316055,
0.0519871711730957,
0.009673070162534714,
-0.03140998259186745,
-0.049499910324811935,
0.020360657945275307,
0.03920164331793785,
-0.11183981597423553,
-0.024669846519827843,
0.04844369366765022,
-0.09852227568626404,
0.06792927533388138,
-0.032319650053977966,
0.09822329878807068,
0.07039561122655869,
0.01065679918974638,
-0.012599370442330837,
-0.011565559543669224,
0.23412226140499115,
-0.025034485384821892,
0.06738917529582977,
0.169610396027565,
-0.019734201952815056,
0.0639367625117302,
0.12004910409450531,
0.00031537486938759685,
-0.016330687329173088,
-0.01790299266576767,
-0.010583902709186077,
-0.015814879909157753,
-0.1946597844362259,
-0.09865986555814743,
-0.06428588181734085,
0.04298324137926102,
0.04327854886651039,
0.04050355777144432,
0.06638692319393158,
0.06888504326343536,
-0.08613131940364838,
0.03991001099348068,
0.0637403279542923,
0.07125993818044662,
0.15779991447925568,
0.0018646904500201344,
0.09936228394508362,
-0.0396311953663826,
0.0026907657738775015,
0.05304505676031113,
0.06776701658964157,
0.08762907981872559,
0.05197243019938469,
0.15040962398052216,
0.08038244396448135,
0.0705176293849945,
0.05992741882801056,
0.07100923359394073,
-0.018649067729711533,
-0.005966668948531151,
-0.007813911885023117,
-0.09432994574308395,
0.005717679392546415,
0.021285653114318848,
0.0593600757420063,
0.03209624066948891,
-0.02063155174255371,
0.001159976120106876,
0.08468224853277206,
0.13056349754333496,
0.0922069326043129,
-0.20253156125545502,
-0.02089710161089897,
0.04102642461657524,
0.01805310696363449,
-0.04569361358880997,
-0.03092510625720024,
0.029964646324515343,
-0.10963892936706543,
0.15312880277633667,
-0.03820439800620079,
0.08573447912931442,
-0.04442253336310387,
-0.004503301344811916,
-0.03986983001232147,
0.10781410336494446,
-0.010812103748321533,
0.04852217435836792,
-0.24390633404254913,
0.10292665660381317,
0.05757317319512367,
0.014202091842889786,
0.0005153983947820961,
0.019282719120383263,
0.020460087805986404,
0.11482616513967514,
0.08744597434997559,
0.01733250543475151,
0.014551809057593346,
-0.04991932958364487,
-0.07808344811201096,
-0.012811761349439621,
0.08318615704774857,
-0.02696123905479908,
0.08242963254451752,
-0.021479768678545952,
-0.065236397087574,
-0.0243166983127594,
0.1116936057806015,
-0.23072977364063263,
-0.13030308485031128,
0.0823117345571518,
0.09007693082094193,
0.014630123041570187,
-0.0897095650434494,
-0.007676595356315374,
-0.04484676197171211,
0.24987412989139557,
-0.06693659722805023,
-0.04159342125058174,
-0.07872429490089417,
-0.04957913979887962,
0.14381791651248932,
-0.08876387774944305,
0.07009158283472061,
-0.05591447278857231,
0.05074552074074745,
-0.0986485630273819,
-0.13829167187213898,
0.04816504195332527,
-0.1019073948264122,
-0.10279559344053268,
-0.0321819931268692,
0.12026508897542953,
-0.05204829201102257,
0.009370923042297363,
0.015347125008702278,
0.049026183784008026,
0.013289118185639381,
-0.046170786023139954,
0.023865707218647003,
0.1361023336648941,
0.0031359633430838585,
0.1230434700846672,
-0.05682877078652382,
-0.107170969247818,
-0.044319357722997665,
-0.010621058754622936,
0.16158823668956757,
0.2843509912490845,
-0.012171069160103798,
0.08062475174665451,
0.10152707993984222,
-0.07228779792785645,
-0.2153717577457428,
-0.07520915567874908,
0.019147442653775215,
0.00857484620064497,
0.039551932364702225,
-0.07698699086904526,
0.03851832449436188,
0.07904306799173355,
0.004043234046548605,
0.12093983590602875,
-0.2621123194694519,
-0.136091411113739,
0.048641279339790344,
0.05841779336333275,
0.08768775314092636,
-0.10347124934196472,
-0.09782696515321732,
-0.0920666828751564,
-0.20634324848651886,
0.07766716182231903,
-0.03155281022191048,
0.08577857166528702,
-0.05277486890554428,
0.007895337417721748,
0.015867110341787338,
-0.04639149457216263,
0.1086038127541542,
-0.03190762922167778,
0.02434307150542736,
-0.0800669863820076,
-0.0712738037109375,
0.09810914844274521,
-0.07626035809516907,
0.038592658936977386,
-0.15405495464801788,
0.05100991949439049,
-0.02736794948577881,
-0.012518439441919327,
-0.07703867554664612,
0.08387543261051178,
-0.06313072890043259,
-0.027024270966649055,
-0.025621211156249046,
0.0424657016992569,
0.02197226881980896,
0.028729038313031197,
0.10691080242395401,
-0.04156617447733879,
0.08302228152751923,
0.24810661375522614,
0.08410205692052841,
-0.05750962719321251,
-0.020079558715224266,
-0.010971413925290108,
-0.0591825470328331,
0.045165497809648514,
0.008308112621307373,
0.01708419807255268,
0.05637378990650177,
0.009597583673894405,
0.09831799566745758,
0.02012055180966854,
-0.0899924486875534,
-0.030486490577459335,
0.1015651524066925,
-0.15134082734584808,
-0.10919175297021866,
-0.047073494642972946,
-0.024288106709718704,
-0.052320696413517,
0.03151416406035423,
0.21546818315982819,
0.0013966565020382404,
-0.038240697234869,
0.02024235762655735,
0.019910478964447975,
-0.08814441412687302,
0.14120016992092133,
0.01830746978521347,
0.04937491565942764,
-0.0668068677186966,
0.035131268203258514,
0.03619898855686188,
-0.0355939082801342,
-0.009080689400434494,
0.07360978424549103,
-0.09706120193004608,
-0.08146443963050842,
-0.05859900638461113,
0.12986686825752258,
-0.018041759729385376,
-0.01759994961321354,
-0.057847555726766586,
-0.06541714817285538,
0.028274446725845337,
0.13209998607635498,
0.01834874413907528,
0.019943058490753174,
0.04746639356017113,
-0.047973811626434326,
-0.011722762137651443,
0.08987785875797272,
0.02818170003592968,
0.10158875584602356,
-0.08170882612466812,
0.042947907000780106,
-0.03830190747976303,
0.03859325125813484,
-0.01875368319451809,
0.02940518409013748,
-0.1447576880455017,
-0.05081542953848839,
-0.16939249634742737,
-0.019297925755381584,
-0.1479542851448059,
-0.03849293291568756,
-0.025008853524923325,
0.02078629843890667,
-0.03548792377114296,
0.0049113621935248375,
-0.030210578814148903,
-0.07932553440332413,
-0.04288734495639801,
0.07176659256219864,
-0.07743922621011734,
-0.01168820820748806,
0.029286565259099007,
-0.06513482332229614,
0.0591217502951622,
0.004002213478088379,
-0.014746755361557007,
-0.002262457273900509,
-0.1502278596162796,
-0.08624638617038727,
0.019945327192544937,
0.010571444407105446,
0.013202842324972153,
-0.11176971346139908,
-0.009018316864967346,
0.005757959093898535,
-0.025550326332449913,
-0.015070249326527119,
0.054883189499378204,
-0.08736976236104965,
0.0053804232738912106,
-0.06364796310663223,
-0.047084879130125046,
-0.05402611941099167,
0.017324209213256836,
0.04928511753678322,
0.02081163413822651,
0.11243120580911636,
-0.06816015392541885,
0.04419797658920288,
-0.15704146027565002,
-0.01141397189348936,
-0.005893330555409193,
-0.11785827577114105,
0.008799235336482525,
-0.023443887010216713,
0.04098368436098099,
-0.013912669382989407,
0.11191274970769882,
-0.020382892340421677,
-0.1483461856842041,
0.013737301342189312,
-0.030179454013705254,
-0.060442663729190826,
0.02395857684314251,
0.15849418938159943,
0.060357894748449326,
-0.013189141638576984,
-0.08004213869571686,
0.062173835933208466,
-0.006138794124126434,
0.03821169212460518,
0.06242161616683006,
0.11249958723783493,
0.007630246225744486,
0.0685679167509079,
0.09057328850030899,
-0.03480982035398483,
0.028789037838578224,
-0.0006059907609596848,
-0.02004099264740944,
0.10294579714536667,
-0.01727481372654438,
0.06707444787025452,
0.1587855964899063,
-0.15457497537136078,
0.05298151075839996,
0.031258318573236465,
-0.032067909836769104,
-0.09656914323568344,
-0.18463236093521118,
-0.12613621354103088,
-0.09224940091371536,
0.0012200773926451802,
-0.11521007865667343,
0.010771751403808594,
-0.035604819655418396,
0.008547653444111347,
0.009432349354028702,
0.10002647340297699,
-0.013686484657227993,
-0.03787720203399658,
0.06541432440280914,
-0.025708498433232307,
-0.038507793098688126,
0.029415063560009003,
-0.01995733007788658,
0.027566472068428993,
-0.007380971219390631,
0.027161551639437675,
0.0341959223151207,
-0.030944358557462692,
0.06692013144493103,
-0.03602049872279167,
-0.10785839706659317,
-0.011798115447163582,
0.02836797386407852,
0.029721731320023537,
0.08511850982904434,
0.04514080658555031,
-0.03538329526782036,
-0.01027761772274971,
0.0842306911945343,
-0.01605384051799774,
-0.12487117946147919,
-0.04662935063242912,
0.12893244624137878,
0.0014782064827159047,
0.0522664450109005,
0.010763976722955704,
-0.042246948927640915,
0.012982400134205818,
0.1458256095647812,
0.2765883207321167,
-0.02060369774699211,
0.03150471672415733,
0.02360965870320797,
0.012390783056616783,
0.05469822511076927,
0.05840684473514557,
0.02089773491024971,
0.1944267451763153,
-0.03705925494432449,
0.06287766247987747,
-0.016568506136536598,
-0.06431402266025543,
-0.07160067558288574,
0.02019589953124523,
0.0073906658217310905,
-0.028555039316415787,
0.04058516398072243,
0.06913928687572479,
-0.066676564514637,
0.008825200609862804,
0.002241234527900815,
-0.13561959564685822,
-0.10103761404752731,
-0.07645333558320999,
0.056799035519361496,
0.006899030413478613,
0.07212572544813156,
-0.033856045454740524,
-0.05183543264865875,
0.07128942757844925,
-0.03652348741889,
-0.06836765259504318,
-0.07475178688764572,
0.017868608236312866,
-0.052300408482551575,
0.05490346625447273,
-0.010643239133059978,
0.05425665155053139,
0.10752900689840317,
0.0076692677102983,
-0.09180712699890137,
0.053761545568704605,
0.012340305373072624,
-0.041474368423223495,
0.05513402447104454,
0.11050824820995331,
-0.023256774991750717,
0.07539328187704086,
0.044260185211896896,
-0.11293445527553558,
0.05730981007218361,
0.10724329948425293,
-0.018019281327724457,
-0.0684194415807724,
0.07280045002698898,
-0.05294870585203171,
0.13093571364879608,
0.1572132259607315,
-0.04229997098445892,
0.008470823988318443,
-0.03098374977707863,
0.02392342872917652,
0.07620251178741455,
0.09240328520536423,
-0.03956456109881401,
-0.17649677395820618,
0.02388633042573929,
-0.0031065514776855707,
0.027247775346040726,
-0.23009024560451508,
-0.06611720472574234,
-0.09814288467168808,
-0.025719525292515755,
-0.0872202217578888,
0.11014005541801453,
0.12993213534355164,
-0.00029223610181361437,
-0.007045114878565073,
-0.1494932621717453,
-0.004717595409601927,
0.09691882133483887,
-0.11815094202756882,
-0.09183821827173233
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# LLM agent flow classification
This model identifies common events and patterns within the conversation flow.
Such events include an apology, where the LLM acknowledges a mistake.
The flow labels can serve as foundational elements for sophisticated LLM analytics.
It is a fine-tuned version of [MiniLMv2-L6-H384](https://huggingface.co/nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large).
The quantized version in ONNX format can be found [here](https://huggingface.co/minuva/MiniLMv2-agentflow-v2-onnx).
This model is *only* for the LLM agent texts in the dialog; for the user texts, [use this model](https://huggingface.co/minuva/MiniLMv2-userflow-v2).
# Load the Model
```py
from transformers import pipeline
pipe = pipeline(model='minuva/MiniLMv2-agentflow-v2', task='text-classification')
pipe("thats my mistake")
# [{'label': 'agent_apology_error_mistake', 'score': 0.9965628981590271}]
```
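Since this model only scores the agent side of a conversation, a typical pattern is to filter a dialog down to its assistant turns before classification. A minimal sketch (the dialog below is invented for illustration):

```python
from transformers import pipeline

pipe = pipeline(model="minuva/MiniLMv2-agentflow-v2", task="text-classification")

# Toy conversation; only the assistant turns are passed to this classifier.
dialog = [
    {"role": "user", "content": "Summarize this PDF for me."},
    {"role": "assistant", "content": "I'm sorry, I can't open attachments or images."},
    {"role": "user", "content": "Ok, here is the text instead."},
    {"role": "assistant", "content": "Here is the summary. Let me know if you need anything else!"},
]

agent_turns = [turn["content"] for turn in dialog if turn["role"] == "assistant"]
predictions = pipe(agent_turns)  # one {"label", "score"} dict per agent turn

for text, pred in zip(agent_turns, predictions):
    print(f"{pred['label']:<35} {pred['score']:.3f}  {text}")
```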
# Categories Explanation
<details>
<summary>Click to expand!</summary>
- OTHER: Responses or actions by the agent that do not fit into the predefined categories or are outside the scope of the specific interactions listed.
- agent_apology_error_mistake: When the agent acknowledges an error or mistake in the information provided or in the handling of the request.
- agent_apology_unsatisfactory: The agent expresses an apology for providing an unsatisfactory response or for any dissatisfaction experienced by the user.
- agent_didnt_understand: Indicates that the agent did not understand the user's request or question.
- agent_limited_capabilities: The agent communicates its limitations in addressing certain requests or providing certain types of information.
- agent_refuses_answer: When the agent explicitly refuses to answer a question or fulfill a request, due to policy restrictions or ethical considerations.
- image_limitations: The agent points out limitations related to handling or interpreting images.
- no_information_doesnt_know: The agent indicates that it has no information available or does not know the answer to the user's question.
- success_and_followup_assistance: The agent successfully provides the requested information or service and offers further assistance or follow-up actions if needed.
</details>
<br>
# Metrics in our private test dataset
| Model (params) | Loss | Accuracy | F1 |
|--------------------|-------------|----------|--------|
| minuva/MiniLMv2-agentflow-v2 (33M) | 0.1540 | 0.9616 | 0.9618 |
# Deployment
Check our [llm-flow-classification repository](https://github.com/minuva/llm-flow-classification) for a FastAPI and ONNX based server to deploy this model on CPU devices.
| {"license": "apache-2.0", "tags": ["generated_from_trainer", "text-classification", "multi-class-classification"], "metrics": ["accuracy", "f1"], "base_model": "nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large", "model-index": [{"name": "MiniLMv2-L6-H384-distilled-from-RoBERTa-Large-agentflow-distil", "results": []}]} | text-classification | minuva/MiniLMv2-agentflow-v2 | [
"transformers",
"tensorboard",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"multi-class-classification",
"base_model:nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:38:17+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #multi-class-classification #base_model-nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| LLM agent flow classification
=============================
This model identifies common events and patterns within the conversation flow.
Such events include an apology, where the LLM acknowledges a mistake.
The flow labels can serve as foundational elements for sophisticated LLM analytics.
It is a fine-tuned version of MiniLMv2-L6-H384.
The quantized version in ONNX format can be found here
This model is *only* for the LLM agent texts in the dialog. For the user texts use this model.
Load the Model
==============
Categories Explanation
======================
Click to expand!
```
- OTHER: Responses or actions by the agent that do not fit into the predefined categories or are outside the scope of the specific interactions listed.
- agent_apology_error_mistake: When the agent acknowledges an error or mistake in the information provided or in the handling of the request.
- agent_apology_unsatisfactory: The agent expresses an apology for providing an unsatisfactory response or for any dissatisfaction experienced by the user.
- agent_didnt_understand: Indicates that the agent did not understand the user's request or question.
- agent_limited_capabilities: The agent communicates its limitations in addressing certain requests or providing certain types of information.
- agent_refuses_answer: When the agent explicitly refuses to answer a question or fulfill a request, due to policy restrictions or ethical considerations.
- image_limitations: The agent points out limitations related to handling or interpreting images.
- no_information_doesnt_know: The agent indicates that it has no information available or does not know the answer to the user's question.
- success_and_followup_assistance: The agent successfully provides the requested information or service and offers further assistance or follow-up actions if needed.
```
Metrics in our private test dataset
===================================
Deployment
==========
Check our llm-flow-classification repository for a FastAPI and ONNX based server to deploy this model on CPU devices.
| [] | [
"TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #multi-class-classification #base_model-nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
95
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #roberta #text-classification #generated_from_trainer #multi-class-classification #base_model-nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.02736477740108967,
0.14003890752792358,
-0.004247638396918774,
0.09091632813215256,
0.09064123034477234,
-0.0046659596264362335,
0.21800704300403595,
0.1214926466345787,
-0.02916918322443962,
-0.02164260298013687,
0.13223254680633545,
0.15064853429794312,
-0.02601269818842411,
0.0008560930727981031,
-0.03820846229791641,
-0.18072497844696045,
0.05526171252131462,
0.009218620136380196,
-0.13959942758083344,
0.06335494667291641,
0.12720774114131927,
-0.06716679781675339,
0.07957110553979874,
-0.020317617803812027,
-0.13556471467018127,
0.06337086111307144,
0.053475599735975266,
-0.10783185809850693,
0.08729329705238342,
0.0639672800898552,
0.12859877943992615,
0.06571908295154572,
0.0660267025232315,
-0.11178875714540482,
0.032135967165231705,
0.058087289333343506,
-0.06993893533945084,
0.07330625504255295,
0.04852304980158806,
-0.07496647536754608,
0.037447016686201096,
0.02833007648587227,
0.02923686057329178,
0.07249673455953598,
-0.06094532459974289,
-0.14327961206436157,
-0.04326478764414787,
0.06209886446595192,
0.09213082492351532,
0.03885367512702942,
0.03894523158669472,
0.11942349374294281,
-0.03873785212635994,
0.07587846368551254,
0.1266561597585678,
-0.3329620659351349,
-0.02456863969564438,
0.1322704404592514,
0.030662422999739647,
0.05329937860369682,
-0.009610830806195736,
0.034996166825294495,
0.05444548279047012,
-0.020966662093997,
0.05009697377681732,
-0.06971519440412521,
-0.09023170173168182,
0.028956251218914986,
-0.09529831260442734,
0.05890036001801491,
0.1917262226343155,
-0.0052467212080955505,
0.007175038568675518,
-0.009975483641028404,
-0.05707499384880066,
-0.004913480021059513,
-0.03174865245819092,
0.004159071482717991,
0.021275069564580917,
0.03548135235905647,
0.000773328123614192,
-0.06676045060157776,
-0.10828704386949539,
-0.06674588471651077,
-0.13032418489456177,
0.15462347865104675,
0.016365433111786842,
0.04133177176117897,
-0.11401400715112686,
0.05022216960787773,
0.032078176736831665,
-0.10736826062202454,
0.023432312533259392,
-0.06461804360151291,
0.03420701250433922,
-0.014214770868420601,
-0.025474965572357178,
-0.09148368239402771,
0.135807067155838,
0.1696973592042923,
0.08699910342693329,
0.045361585915088654,
-0.03973818197846413,
0.06686372309923172,
-0.06252029538154602,
0.06896933913230896,
-0.0666099488735199,
-0.03928080573678017,
0.09459646791219711,
0.0051252348348498344,
0.09738309681415558,
-0.02233138494193554,
-0.19537927210330963,
0.028866060078144073,
0.020281272009015083,
0.07936147600412369,
-0.0032489190343767405,
0.060759205371141434,
-0.03533036261796951,
-0.01161778811365366,
0.06974320858716965,
-0.08624742180109024,
-0.0024004345759749413,
0.0015130853280425072,
0.011329409666359425,
0.029192637652158737,
0.04265236482024193,
0.02308035083115101,
0.005329159554094076,
-0.0010771113447844982,
-0.06835152208805084,
-0.03835536167025566,
-0.047121308743953705,
-0.09227059036493301,
0.05613730475306511,
-0.08489080518484116,
0.03771215304732323,
-0.16164621710777283,
-0.16230452060699463,
0.036332979798316956,
0.0804668739438057,
-0.04700018838047981,
-0.030365711078047752,
-0.006602422799915075,
-0.028988609090447426,
0.021654484793543816,
-0.009252415038645267,
-0.07076720893383026,
-0.042956676334142685,
0.007022975478321314,
-0.004439080134034157,
0.06661786884069443,
-0.16904398798942566,
0.03227704018354416,
-0.09950526803731918,
0.024592546746134758,
-0.08999134600162506,
-0.008136685006320477,
-0.11209725588560104,
0.17134293913841248,
-0.07409987598657608,
-0.012297170236706734,
-0.01850847713649273,
0.04060656949877739,
-0.005537017248570919,
0.05513834208250046,
-0.13971568644046783,
-0.06322979927062988,
0.15781262516975403,
-0.12351464480161667,
-0.14894886314868927,
0.09606669843196869,
-0.01586027257144451,
0.04634133353829384,
0.07812582701444626,
0.15052500367164612,
0.14833563566207886,
-0.001969944452866912,
0.06993824988603592,
0.10865551233291626,
-0.03650645911693573,
-0.16323818266391754,
0.06167532131075859,
-0.012730858288705349,
-0.12096404284238815,
0.03369167074561119,
0.03031620755791664,
0.05647590756416321,
0.001653205370530486,
-0.09185486286878586,
-0.06313742697238922,
-0.07837174087762833,
0.04134482517838478,
0.014911338686943054,
0.09836383908987045,
-0.08826708793640137,
-0.060837190598249435,
0.07095079869031906,
0.07366432249546051,
-0.060045789927244186,
0.0015775690553709865,
-0.09738289564847946,
0.09689272940158844,
-0.07804058492183685,
0.00941783282905817,
-0.12385553121566772,
-0.045902494341135025,
0.0034400573931634426,
-0.04153643548488617,
0.056873973459005356,
0.06316333264112473,
0.06041921302676201,
0.00019805118790827692,
-0.028517117723822594,
0.00486401654779911,
0.11207115650177002,
0.04056559503078461,
-0.020907016471028328,
-0.20266787707805634,
0.040265608578920364,
-0.0769946426153183,
0.048225220292806625,
-0.15192756056785583,
0.0454866923391819,
0.06387829035520554,
0.09515132009983063,
0.05126082897186279,
0.09792305529117584,
-0.006302478723227978,
0.033327616751194,
-0.07438597828149796,
-0.053846314549446106,
0.10137917101383209,
-0.001457204925827682,
-0.08529430627822876,
0.07073450088500977,
-0.09851091355085373,
0.21273042261600494,
0.15515489876270294,
-0.08545898646116257,
0.011021260172128677,
0.036103587597608566,
0.010489043779671192,
0.008287771604955196,
0.027028940618038177,
0.027324659749865532,
0.009183699265122414,
-0.020016513764858246,
0.14654643833637238,
-0.07040808349847794,
-0.02614201419055462,
0.006081210449337959,
-0.0783538743853569,
-0.03772488608956337,
0.08585022389888763,
0.08938714861869812,
-0.20905739068984985,
0.1543894112110138,
0.24661439657211304,
-0.11668742448091507,
0.16389058530330658,
-0.03806585073471069,
0.03031218983232975,
0.012903287075459957,
-0.031114190816879272,
0.019868729636073112,
0.05095187947154045,
-0.042219847440719604,
-0.017450504004955292,
0.027694398537278175,
-0.01193155162036419,
0.04387735202908516,
-0.13506481051445007,
-0.007067323196679354,
0.012866809964179993,
-0.03540150448679924,
0.06427770853042603,
0.03647184744477272,
-0.04939209669828415,
0.10166968405246735,
-0.027965480461716652,
-0.09640509635210037,
0.09264790266752243,
-0.006146055180579424,
-0.059993743896484375,
0.1789463311433792,
-0.11886017769575119,
-0.16338616609573364,
-0.1907026469707489,
-0.07911863923072815,
-0.09701989591121674,
0.044888731092214584,
0.04957303777337074,
-0.09943021833896637,
-0.052132438868284225,
-0.09710577875375748,
-0.030710292980074883,
0.021110763773322105,
0.02680756151676178,
-0.026879502460360527,
0.08944077789783478,
0.0379839725792408,
-0.1392248570919037,
-0.008134917356073856,
0.014312518760561943,
-0.07330459356307983,
0.04693298786878586,
-0.03094513528048992,
0.06180187687277794,
0.16248998045921326,
0.006726110354065895,
0.0006719956872984767,
-0.0262138694524765,
0.07120126485824585,
-0.004372368101030588,
0.03525293618440628,
0.19180092215538025,
-0.031234173104166985,
0.03709731996059418,
0.15883329510688782,
0.05004836991429329,
-0.058260150253772736,
0.022390156984329224,
0.0027901490684598684,
-0.0748368501663208,
-0.24074257910251617,
-0.069678395986557,
-0.04443766921758652,
0.05247151106595993,
0.0758507251739502,
0.08372047543525696,
0.05821351706981659,
0.10383149981498718,
0.030099760740995407,
0.07410097122192383,
-0.004356690216809511,
0.05348886176943779,
0.14075452089309692,
0.03820512443780899,
0.15511882305145264,
-0.13079042732715607,
-0.07637892663478851,
0.07136240601539612,
0.08038467168807983,
0.08514265716075897,
0.06520123779773712,
0.13102759420871735,
0.06258218735456467,
0.11144045740365982,
0.1053958386182785,
0.09156933426856995,
-0.010387921705842018,
-0.027626043185591698,
-0.025456788018345833,
-0.05005106329917908,
0.02451363392174244,
0.04772486910223961,
-0.11155782639980316,
-0.02196059748530388,
0.0038849543780088425,
0.02870362065732479,
0.07495059072971344,
0.21848855912685394,
0.0370015986263752,
-0.2336004078388214,
-0.0014535182854160666,
0.11699358373880386,
0.036841440945863724,
-0.019047213718295097,
0.07866403460502625,
0.02121390402317047,
-0.005047157406806946,
0.07528981566429138,
-0.04752834886312485,
0.07252945005893707,
-0.027472596615552902,
0.008295195177197456,
0.0032713718246668577,
0.04835997149348259,
0.016720900312066078,
0.09832325577735901,
-0.23581808805465698,
0.20029033720493317,
0.030354615300893784,
0.01068791002035141,
-0.052759937942028046,
0.03097866289317608,
0.0700056254863739,
0.1603160798549652,
0.12858465313911438,
-0.021999528631567955,
-0.1741640269756317,
-0.05452710762619972,
-0.07541515678167343,
0.0533883199095726,
0.03244561702013016,
-0.025563521310687065,
0.006356895435601473,
-0.08070318400859833,
-0.029469307512044907,
0.05291585624217987,
-0.06028107553720474,
-0.09221691638231277,
-0.1507793515920639,
0.022744236513972282,
0.08195684105157852,
-0.09784615784883499,
-0.06603559851646423,
-0.058225974440574646,
-0.10841374099254608,
0.1882053166627884,
-0.15489648282527924,
-0.04343050718307495,
-0.09765290468931198,
-0.06206037476658821,
0.012499530799686909,
-0.06320147961378098,
0.04875818267464638,
-0.06312660872936249,
0.08364163339138031,
-0.002985658124089241,
-0.2061166614294052,
0.11510027199983597,
-0.09844709932804108,
-0.06295890361070633,
-0.03791728988289833,
0.07540418952703476,
-0.06557807326316833,
0.013195005245506763,
-0.00427862536162138,
-0.015523289330303669,
-0.012402785941958427,
-0.08038953691720963,
-0.014773509465157986,
0.031112242490053177,
-0.014156200923025608,
-0.01729394868016243,
-0.12307125329971313,
-0.09338943660259247,
0.027934445068240166,
0.009277804754674435,
0.1721748560667038,
0.20093274116516113,
-0.08487294614315033,
0.10084669291973114,
0.1930149793624878,
-0.07201419025659561,
-0.33370843529701233,
-0.04218579828739166,
-0.10636857897043228,
-0.030467191711068153,
-0.0133956428617239,
-0.14047889411449432,
0.18448305130004883,
0.06890260428190231,
-0.06831448525190353,
0.054681576788425446,
-0.18552620708942413,
-0.12189190834760666,
0.2086755335330963,
0.08973487466573715,
0.21530382335186005,
-0.14881998300552368,
-0.05859695002436638,
-0.1259855329990387,
-0.06366641074419022,
0.16347655653953552,
-0.25811898708343506,
0.06494756042957306,
-0.005868533626198769,
-0.06176284700632095,
-0.022643370553851128,
-0.04210501164197922,
0.09247497469186783,
-0.05391133949160576,
0.1065940260887146,
-0.09451140463352203,
-0.02082609198987484,
0.15214115381240845,
-0.03273588791489601,
0.03254106268286705,
-0.1765720546245575,
0.03956257551908493,
0.0007466342649422586,
-0.036165472120046616,
-0.0034573045559227467,
0.06850872188806534,
-0.003681386588141322,
-0.06623430550098419,
-0.05122882500290871,
0.026944391429424286,
-0.025466755032539368,
-0.022989846765995026,
0.24249239265918732,
-0.005186027381569147,
0.1132846251130104,
0.18526683747768402,
0.1273278295993805,
-0.09658650308847427,
0.07276332378387451,
-0.027147335931658745,
-0.06485958397388458,
0.10512065142393112,
-0.18441103398799896,
0.08814562112092972,
0.0637650415301323,
-0.037721358239650726,
0.07217243313789368,
0.07362642884254456,
0.010018348693847656,
-0.007724144030362368,
0.1315091997385025,
-0.14648139476776123,
-0.05552412196993828,
-0.0224818866699934,
0.033983100205659866,
-0.00011181229638168588,
0.12884217500686646,
0.1845836639404297,
-0.0022573936730623245,
-0.010976044461131096,
0.029681729152798653,
-0.012788163498044014,
-0.06147555634379387,
0.07756134122610092,
0.11407984048128128,
0.030146893113851547,
-0.08803720027208328,
0.12913160026073456,
0.0379607193171978,
-0.04490521922707558,
0.0006978359306231141,
0.049009885638952255,
-0.09764565527439117,
-0.13302306830883026,
0.06096898764371872,
0.1981707364320755,
-0.0786605179309845,
-0.09523322433233261,
-0.1295355260372162,
-0.13102653622627258,
0.005640748888254166,
0.17978017032146454,
0.11001081019639969,
0.020627902820706367,
-0.019417880102992058,
-0.05843251943588257,
-0.021139631047844887,
0.0682944804430008,
-0.044977232813835144,
0.06137453019618988,
-0.1764802187681198,
-0.01334122009575367,
-0.020879056304693222,
0.041801538318395615,
-0.05943923443555832,
0.02521885745227337,
-0.14213629066944122,
-0.022389641031622887,
-0.17677395045757294,
0.015980923548340797,
-0.049818653613328934,
0.015496421605348587,
0.012830881401896477,
-0.007594811264425516,
-0.07055758684873581,
0.024037960916757584,
-0.09087443351745605,
0.006029291078448296,
-0.002032511867582798,
0.07119156420230865,
-0.07173002511262894,
-0.025285908952355385,
0.023271016776561737,
-0.023397430777549744,
0.07290108501911163,
-0.01591785065829754,
-0.033728718757629395,
0.05896304175257683,
-0.203351691365242,
-0.0209349375218153,
0.05065484344959259,
0.030524583533406258,
0.018687929958105087,
0.0015385845908895135,
0.004128233529627323,
0.05904743820428848,
0.03508803993463516,
0.018944239243865013,
0.033975716680288315,
-0.07921537011861801,
0.01491187047213316,
-0.0832151249051094,
-0.07252711802721024,
-0.03561271354556084,
-0.03300570324063301,
0.11580855399370193,
0.011098439805209637,
0.1839151680469513,
-0.07966736704111099,
-0.010102740488946438,
-0.08749852329492569,
0.017964649945497513,
0.0003317994996905327,
-0.14251980185508728,
-0.1360330432653427,
-0.042554326355457306,
-0.0385746993124485,
-0.03171076625585556,
0.21500559151172638,
0.02584182098507881,
-0.07513392716646194,
0.04475873336195946,
0.0016853481065481901,
0.0753660574555397,
0.06583870947360992,
0.23630550503730774,
0.045200053602457047,
-0.00011794314195867628,
-0.06987795233726501,
0.013012290000915527,
0.0678178071975708,
0.035033855587244034,
0.08420442044734955,
0.14315156638622284,
-0.09003306180238724,
0.0942063257098198,
0.0942763239145279,
0.00006118109013186768,
0.02101045474410057,
0.061138056218624115,
-0.019665732979774475,
0.060543354600667953,
-0.0038670445792376995,
0.07435554265975952,
0.17363031208515167,
-0.04842069372534752,
-0.0194349754601717,
-0.04646275192499161,
-0.051207445561885834,
-0.1700165718793869,
-0.19693492352962494,
-0.12964046001434326,
-0.13953042030334473,
-0.010198975913226604,
-0.05314796790480614,
-0.056429069489240646,
0.023633046075701714,
0.04975888133049011,
-0.021958600729703903,
0.08386576920747757,
-0.060417063534259796,
0.00601736456155777,
0.02747090719640255,
-0.000317874742904678,
-0.06091710180044174,
-0.004199426621198654,
-0.0547238364815712,
-0.049107637256383896,
-0.00840730033814907,
-0.06284051388502121,
0.009167815558612347,
0.01563226245343685,
0.05335060879588127,
-0.07344137877225876,
-0.0803784653544426,
-0.015230809338390827,
0.028515512123703957,
0.0017927283188328147,
0.06767735630273819,
0.014522552490234375,
-0.048574525862932205,
0.06644026190042496,
0.17219679057598114,
-0.05728072300553322,
-0.09991959482431412,
-0.08673594892024994,
0.11773669719696045,
-0.012084229849278927,
0.08782247453927994,
-0.04231996089220047,
-0.03091377019882202,
0.008092273026704788,
0.24295635521411896,
0.29583558440208435,
-0.05378938838839531,
0.04339206591248512,
-0.04339678958058357,
-0.00780731625854969,
-0.03317481651902199,
0.10291365534067154,
0.051865916699171066,
0.051834799349308014,
-0.048147328197956085,
-0.10434312373399734,
0.0006030230433680117,
0.003609179286286235,
-0.12184469401836395,
0.09464691579341888,
0.026260564103722572,
-0.0388607531785965,
-0.009661844931542873,
0.057077325880527496,
-0.06641539186239243,
0.050165075808763504,
0.04238088056445122,
-0.1040819063782692,
-0.10424872487783432,
-0.022268926724791527,
0.14045481383800507,
-0.03246427699923515,
0.01710905134677887,
-0.06768076866865158,
-0.039470184594392776,
-0.009499589912593365,
-0.015751684084534645,
-0.17101077735424042,
-0.046401917934417725,
0.03608131408691406,
-0.017396466806530952,
0.1778293401002884,
-0.0007972687017172575,
0.003721813904121518,
0.13484732806682587,
0.0005766024114564061,
-0.0862656682729721,
0.08674066513776779,
-0.011318412609398365,
-0.063352070748806,
0.04422753304243088,
-0.009272966533899307,
-0.03576207533478737,
0.05824226140975952,
0.04916077107191086,
-0.04496605694293976,
0.02778022550046444,
-0.026845719665288925,
-0.12764380872249603,
-0.06404553353786469,
0.007447604555636644,
-0.0908985510468483,
0.10895498096942902,
0.05920625478029251,
-0.03347200155258179,
-0.022132815793156624,
-0.025018658488988876,
0.052426036447286606,
0.03654419630765915,
-0.0892951488494873,
-0.028307607397437096,
-0.09721153974533081,
0.008351298049092293,
0.019529184326529503,
0.02772381156682968,
-0.2594297230243683,
-0.03364333510398865,
-0.07027047872543335,
0.016724372282624245,
-0.13750727474689484,
0.039494264870882034,
0.18492357432842255,
0.05401698127388954,
-0.038496483117341995,
-0.09846372157335281,
-0.04494775831699371,
0.034045420587062836,
-0.11748985946178436,
-0.12251557409763336
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
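For a quick check of the checkpoint, the standard translation pipeline should work; this is a minimal sketch where the repository id is taken from this card and the example sentence is arbitrary:

```python
from transformers import pipeline

# English -> French translation with the fine-tuned Marian checkpoint.
translator = pipeline("translation", model="FkSg16KN/marian-finetuned-kde4-en-to-fr")
print(translator("Default to expanded threads")[0]["translation_text"])
```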
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
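The hyperparameters above roughly correspond to a `Seq2SeqTrainingArguments` setup like the sketch below; the output directory and anything not listed (evaluation strategy, logging, saving) are assumptions, and the Adam betas/epsilon match the library defaults:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed above; unlisted options are left at defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="marian-finetuned-kde4-en-to-fr",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    fp16=True,  # mixed_precision_training: Native AMP
)
```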
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["translation", "generated_from_trainer"], "datasets": ["kde4"], "base_model": "Helsinki-NLP/opus-mt-en-fr", "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": []}]} | translation | FkSg16KN/marian-finetuned-kde4-en-to-fr | [
"transformers",
"tensorboard",
"safetensors",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"base_model:Helsinki-NLP/opus-mt-en-fr",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:40:07+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
90,
47,
6,
12,
8,
3,
103,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.11025281995534897,
0.13228027522563934,
-0.002497683512046933,
0.061853762716054916,
0.12278620153665543,
-0.0016732231015339494,
0.10084521025419235,
0.12471552938222885,
-0.05806703492999077,
0.09101225435733795,
0.08990395069122314,
0.0342695489525795,
0.08322229236364365,
0.13366325199604034,
-0.024866709485650063,
-0.27132096886634827,
0.03774908930063248,
0.01072507631033659,
-0.10109632462263107,
0.08744445443153381,
0.13495291769504547,
-0.06991395354270935,
0.06739504635334015,
0.03721550479531288,
-0.10737323015928268,
0.018366042524576187,
-0.041393667459487915,
-0.07454794645309448,
0.0767643079161644,
0.02546648308634758,
0.07536610960960388,
0.019309675320982933,
0.0988769382238388,
-0.22849087417125702,
0.001960745081305504,
0.05505223199725151,
0.03928176686167717,
0.07525794208049774,
0.058315154165029526,
0.048237018287181854,
0.10814590752124786,
-0.1580948382616043,
0.09189411252737045,
0.013933563604950905,
-0.055112943053245544,
-0.16375596821308136,
-0.07293853163719177,
0.07157345861196518,
0.12349112331867218,
0.10730661451816559,
0.006467369385063648,
0.1545604020357132,
-0.02037140727043152,
0.07787896692752838,
0.1356193572282791,
-0.23995637893676758,
-0.07327912747859955,
0.03007066436111927,
0.07968269288539886,
0.06351755559444427,
-0.08305691182613373,
0.013575187884271145,
0.048182856291532516,
0.017976263538002968,
0.05964080989360809,
-0.005765973124653101,
-0.016775773838162422,
-0.03823238983750343,
-0.12145690619945526,
-0.03912362828850746,
0.21170002222061157,
0.08163828402757645,
-0.04135378450155258,
-0.11774558573961258,
-0.00544202234596014,
-0.07507895678281784,
-0.025144480168819427,
-0.06114153191447258,
0.0007986847776919603,
-0.06112470477819443,
-0.025017524138092995,
-0.08785413205623627,
-0.10162447392940521,
-0.04378224164247513,
0.05385315418243408,
0.13827195763587952,
0.03308909013867378,
0.010150556452572346,
-0.01321139931678772,
0.07743871957063675,
-0.01715659722685814,
-0.13989275693893433,
-0.029436253011226654,
-0.008495042100548744,
-0.053963590413331985,
-0.06529659777879715,
-0.02727101370692253,
-0.0746145099401474,
0.0030257580801844597,
0.09588798135519028,
-0.03737008199095726,
0.06193612515926361,
0.026739880442619324,
0.004988937638700008,
-0.010766369290649891,
0.13464544713497162,
-0.029011279344558716,
-0.04113291949033737,
0.0004627654852811247,
0.09249570965766907,
0.004584005102515221,
-0.028781766071915627,
-0.08195483684539795,
-0.03938132897019386,
0.08590923249721527,
0.07095848768949509,
-0.018384763970971107,
0.023256462067365646,
-0.032902173697948456,
-0.05869368463754654,
0.06293278187513351,
-0.12899120151996613,
0.0396842435002327,
-0.023689718917012215,
-0.07669086754322052,
-0.053687553852796555,
0.007347010541707277,
0.031166095286607742,
-0.047367602586746216,
0.033390019088983536,
-0.05365898460149765,
-0.021465519443154335,
-0.06337427347898483,
-0.04458260163664818,
0.022491946816444397,
-0.0288541316986084,
0.027591250836849213,
-0.08787475526332855,
-0.14875110983848572,
-0.03439804166555405,
0.056288495659828186,
-0.061134062707424164,
-0.09906890243291855,
-0.03063266910612583,
-0.04462319239974022,
0.023661427199840546,
-0.018998173996806145,
0.1262560933828354,
-0.03623584285378456,
0.032354190945625305,
0.0057951523922383785,
0.0002870011667255312,
0.021634381264448166,
0.0351492315530777,
-0.07353489845991135,
0.046621453016996384,
-0.09827374666929245,
0.06563037633895874,
-0.12131834030151367,
0.02265392802655697,
-0.13843069970607758,
-0.10709956288337708,
-0.015242228284478188,
-0.02788487821817398,
0.07154285162687302,
0.11342854052782059,
-0.11157362908124924,
-0.014513843692839146,
0.10833443701267242,
-0.08236566185951233,
-0.11801477521657944,
0.08233492821455002,
-0.011563459411263466,
0.04681802913546562,
0.045430738478899,
0.1608518660068512,
0.1554884910583496,
-0.13760565221309662,
-0.024367554113268852,
0.024903491139411926,
0.06936389207839966,
-0.004345031920820475,
0.08736279606819153,
0.00853815395385027,
-0.0017038672231137753,
0.025792179629206657,
-0.07704151421785355,
0.001396657433360815,
-0.05019181966781616,
-0.09859573096036911,
-0.04149125888943672,
-0.07916545867919922,
-0.016659049317240715,
0.025332357734441757,
0.036780763417482376,
-0.07465989142656326,
-0.09990911930799484,
0.07626409083604813,
0.13343006372451782,
-0.06173425167798996,
0.020513206720352173,
-0.06699208170175552,
0.04261273145675659,
-0.04337961599230766,
-0.028976861387491226,
-0.1587388515472412,
-0.12304207682609558,
0.05662766471505165,
-0.11318235099315643,
0.01505711767822504,
-0.0025804005563259125,
0.06642841547727585,
0.07786444574594498,
-0.0463782399892807,
-0.0427132211625576,
-0.10699856281280518,
0.013985831290483475,
-0.09185459464788437,
-0.15175209939479828,
-0.048316724598407745,
-0.038187675178050995,
0.18471311032772064,
-0.24210521578788757,
0.00558061758056283,
0.033446215093135834,
0.1590559184551239,
0.016550561413168907,
-0.04770262539386749,
0.0003432557568885386,
0.012158974073827267,
-0.0023469345178455114,
-0.09518501162528992,
0.02030852437019348,
0.013344892300665379,
-0.11612769961357117,
0.01644587330520153,
-0.1274741291999817,
0.039827436208724976,
0.06416992098093033,
0.0904388278722763,
-0.06452967971563339,
-0.047412026673555374,
-0.06884508579969406,
-0.051081009209156036,
-0.04176926985383034,
-0.0044025518000125885,
0.18018954992294312,
0.012919513508677483,
0.10404016077518463,
-0.06265105307102203,
-0.06032375991344452,
0.029595568776130676,
0.000008033690392039716,
-0.08065760135650635,
0.09724018722772598,
0.01775170862674713,
-0.1810828000307083,
0.07350460439920425,
0.0763687938451767,
-0.031251877546310425,
0.1661660522222519,
-0.043380945920944214,
-0.10968884825706482,
-0.03315906971693039,
0.026007404550909996,
0.016976134851574898,
0.14781923592090607,
-0.07823873311281204,
0.031458307057619095,
0.043581102043390274,
0.022060241550207138,
0.048720940947532654,
-0.13455145061016083,
0.0014114886289462447,
0.028370540589094162,
-0.04274500533938408,
0.03986083343625069,
-0.008734508417546749,
-0.0032885971013456583,
0.06433915346860886,
0.027198292315006256,
-0.020926345139741898,
0.023313960060477257,
-0.017142686992883682,
-0.07300687581300735,
0.17306993901729584,
-0.10798090696334839,
-0.20241953432559967,
-0.1555367112159729,
0.07284772396087646,
-0.06178789213299751,
-0.033607520163059235,
0.01248161494731903,
-0.05823759734630585,
-0.07185440510511398,
-0.11907355487346649,
-0.019671296700835228,
-0.07304127514362335,
-0.019480308517813683,
0.04712740331888199,
0.03462144359946251,
0.08838595449924469,
-0.1159868985414505,
0.012411052361130714,
0.0159104336053133,
-0.044214390218257904,
-0.03150247409939766,
0.0259398240596056,
0.0944824367761612,
0.08154232800006866,
-0.017567304894328117,
0.031467255204916,
-0.012297284789383411,
0.22371503710746765,
-0.09423356503248215,
0.014017107896506786,
0.11689390242099762,
0.015574214980006218,
0.048282936215400696,
0.1303127259016037,
0.01703796163201332,
-0.06598561257123947,
0.027879411354660988,
0.04359196126461029,
-0.008670068345963955,
-0.23720546066761017,
-0.05730921030044556,
-0.033853016793727875,
-0.05361296981573105,
0.11767075210809708,
0.0645662248134613,
0.019171234220266342,
0.07872184365987778,
-0.03244297578930855,
0.01859782449901104,
0.010495496913790703,
0.09704557806253433,
0.09034346044063568,
0.045814648270606995,
0.07171395421028137,
-0.023682398721575737,
-0.03292730078101158,
0.06258692592382431,
0.038432683795690536,
0.22967347502708435,
-0.03186177462339401,
0.13296367228031158,
0.015192181803286076,
0.14852456748485565,
-0.011687508784234524,
0.03491184115409851,
0.03506188839673996,
0.003517653327435255,
0.02487974613904953,
-0.07210248708724976,
-0.002590855350717902,
0.0540839359164238,
0.016430353745818138,
0.017412159591913223,
-0.0750928446650505,
0.040905579924583435,
-0.013569065369665623,
0.23429562151432037,
0.06146448850631714,
-0.2812844514846802,
-0.0933714509010315,
0.024365877732634544,
-0.021243995055556297,
-0.1035851389169693,
-0.00030724809039384127,
0.10898293554782867,
-0.1496596485376358,
0.07524729520082474,
-0.08025383204221725,
0.09974245727062225,
-0.034940335899591446,
-0.024827629327774048,
0.06553079187870026,
0.08727923780679703,
0.007979053072631359,
0.12402518093585968,
-0.1664639264345169,
0.22473162412643433,
0.012455415911972523,
0.11410021781921387,
-0.07279646396636963,
0.05009140446782112,
0.0016166470013558865,
0.07914853096008301,
0.12101166695356369,
0.0228593572974205,
-0.1353197693824768,
-0.1752055436372757,
-0.13464416563510895,
0.030787397176027298,
0.09192412346601486,
-0.0602620467543602,
0.06711921095848083,
-0.026168756186962128,
0.004014886915683746,
0.02163287252187729,
-0.0696859136223793,
-0.20367491245269775,
-0.1606472134590149,
0.05270374193787575,
0.03215346485376358,
-0.013749919831752777,
-0.09771383553743362,
-0.11132755875587463,
0.002237447304651141,
0.2072235345840454,
0.04766685143113136,
-0.05041463300585747,
-0.1574200540781021,
0.05723070725798607,
0.15351863205432892,
-0.08047395944595337,
0.013499487191438675,
0.00374376168474555,
0.1831102818250656,
0.02117547020316124,
-0.05085166171193123,
0.048240188509225845,
-0.07212004065513611,
-0.1335040181875229,
-0.018967026844620705,
0.1483813375234604,
0.03503241762518883,
0.03312084078788757,
0.03522077202796936,
0.026592392474412918,
0.009323634207248688,
-0.08206413686275482,
0.005264489445835352,
0.014642644673585892,
0.06299438327550888,
0.030922595411539078,
-0.05551905930042267,
0.04020485654473305,
-0.06976672261953354,
-0.03520050272345543,
0.12729410827159882,
0.22231072187423706,
-0.06563584506511688,
0.05138479545712471,
0.04823330044746399,
-0.0770622044801712,
-0.16430765390396118,
0.04459049925208092,
0.13117070496082306,
0.038811273872852325,
0.06771695613861084,
-0.16333261132240295,
0.0875193327665329,
0.0842713937163353,
-0.038526322692632675,
0.07241912931203842,
-0.23236778378486633,
-0.1336863785982132,
0.085238516330719,
0.13202159106731415,
-0.013192376121878624,
-0.09194184839725494,
-0.06262076646089554,
-0.027384640648961067,
-0.12828607857227325,
0.10995753109455109,
-0.046206649392843246,
0.0910034328699112,
0.002883240347728133,
0.08148781210184097,
0.03847382590174675,
-0.0413036085665226,
0.1858925223350525,
-0.004430518485605717,
0.026535404846072197,
-0.06823208183050156,
0.08235251158475876,
0.07938656210899353,
-0.08733733743429184,
0.09346824884414673,
-0.046702586114406586,
0.06204633787274361,
-0.16455939412117004,
-0.03577122092247009,
-0.05070330947637558,
0.1016334816813469,
-0.06474737077951431,
-0.05627157539129257,
-0.030748866498470306,
0.07840455323457718,
0.07376978546380997,
-0.018819542601704597,
0.11818552762269974,
0.02893516793847084,
0.05498562380671501,
0.12180285155773163,
0.10783291608095169,
0.03943132236599922,
-0.07994823157787323,
-0.0060419184155762196,
-0.020535634830594063,
0.06116599589586258,
-0.05617542192339897,
0.024448642507195473,
0.11880495399236679,
0.003792861709371209,
0.11199913173913956,
-0.009286921471357346,
-0.08447912335395813,
-0.01570236124098301,
0.04274660348892212,
-0.08905450999736786,
-0.15503211319446564,
-0.06303969770669937,
0.03135782852768898,
-0.13276715576648712,
-0.0007194725330919027,
0.14194798469543457,
-0.0792456716299057,
-0.006607489660382271,
-0.02375638484954834,
0.020921045914292336,
-0.025708887726068497,
0.1617288589477539,
0.042961910367012024,
0.07012490928173065,
-0.04954538494348526,
0.10137775540351868,
0.0729166567325592,
-0.10889662802219391,
0.06736521422863007,
0.04559728503227234,
-0.08197730034589767,
-0.03744499012827873,
0.05434461310505867,
0.10108806937932968,
-0.005042644217610359,
-0.07446064800024033,
-0.06888898462057114,
-0.08504270762205124,
0.025724269449710846,
0.009094526059925556,
0.029211577028036118,
-0.018444696441292763,
-0.008653175085783005,
0.006719449069350958,
-0.14856833219528198,
0.11959850788116455,
0.05128518119454384,
0.07141643017530441,
-0.1525830179452896,
0.02371562086045742,
0.00391179695725441,
0.05019492283463478,
-0.011381740681827068,
0.009281485341489315,
-0.055091552436351776,
-0.03460976853966713,
-0.09766174107789993,
-0.004268114920705557,
-0.033660829067230225,
0.005177121609449387,
-0.03128882497549057,
-0.06844862550497055,
-0.059930220246315,
0.06168266758322716,
-0.05871100723743439,
-0.06382260471582413,
-0.021221790462732315,
0.061353541910648346,
-0.09937402606010437,
-0.029570618644356728,
0.04483768716454506,
-0.10540646314620972,
0.06387251615524292,
0.050687097012996674,
0.03189229965209961,
0.025533752515912056,
-0.04538128525018692,
0.04269999638199806,
0.008725532330572605,
0.04957766830921173,
0.05124768614768982,
-0.13097058236598969,
-0.018482698127627373,
0.006270854268223047,
0.04504162073135376,
0.0033959245774894953,
0.04611150175333023,
-0.11966126412153244,
-0.08013615757226944,
-0.05346561223268509,
-0.06219330430030823,
-0.053294453769922256,
0.0697060152888298,
0.057649243623018265,
0.02028150111436844,
0.15919175744056702,
-0.06924430280923843,
0.05198970437049866,
-0.17997024953365326,
-0.012628872878849506,
0.006432778667658567,
-0.03571827709674835,
-0.02298525534570217,
-0.02347704768180847,
0.07255815714597702,
-0.06350480020046234,
0.1097467765212059,
-0.013212562538683414,
0.08752013742923737,
0.04906708374619484,
-0.08276287466287613,
0.04609302058815956,
0.016702231019735336,
0.20024660229682922,
0.0660475417971611,
0.005031734239310026,
0.09879478067159653,
-0.033895526081323624,
0.04200191795825958,
0.06540143489837646,
0.10041096806526184,
0.15519925951957703,
0.0005573531379923224,
0.07986460626125336,
0.06862806528806686,
-0.06535539031028748,
-0.12352722883224487,
0.06131473928689957,
-0.02987021766602993,
0.11826776713132858,
-0.009547101333737373,
0.13024894893169403,
0.10840785503387451,
-0.19328227639198303,
0.03431487828493118,
-0.05678454041481018,
-0.12989981472492218,
-0.09638288617134094,
-0.1598881483078003,
-0.0954960361123085,
-0.09628510475158691,
0.03042786195874214,
-0.12564478814601898,
0.011371753178536892,
0.033116940408945084,
0.015577292069792747,
-0.0022684731520712376,
0.17022067308425903,
-0.01306353323161602,
0.015024980530142784,
0.0827396810054779,
0.0019338394049555063,
0.008900177665054798,
-0.03509701043367386,
-0.05395190790295601,
0.03135617822408676,
0.02390149235725403,
0.06037634238600731,
-0.04513651132583618,
-0.01194804161787033,
0.04158424213528633,
0.02994723431766033,
-0.08655781298875809,
0.009025844745337963,
0.004278125241398811,
0.040858834981918335,
0.03796723857522011,
0.05013785511255264,
-0.0017518899403512478,
-0.04765453562140465,
0.2565976083278656,
-0.040164366364479065,
-0.06504279375076294,
-0.1260729432106018,
0.08591553568840027,
0.02395622432231903,
-0.010862494818866253,
0.06463178992271423,
-0.09806820005178452,
-0.009367631748318672,
0.13442423939704895,
0.1333867460489273,
-0.02649053931236267,
-0.024677399545907974,
-0.013613997958600521,
-0.018165146932005882,
-0.05510196462273598,
0.09416629374027252,
0.07777651399374008,
-0.011803953908383846,
-0.060663823038339615,
-0.0159640870988369,
-0.022317366674542427,
-0.03549283742904663,
-0.10250793397426605,
0.07324137538671494,
0.015836218371987343,
-0.0029496527276933193,
-0.029467742890119553,
0.0986250787973404,
0.00819090660661459,
-0.13274021446704865,
-0.005187200382351875,
-0.13841663300991058,
-0.20367218554019928,
-0.030213508754968643,
0.10560499876737595,
-0.019937291741371155,
0.053388796746730804,
-0.004597230348736048,
0.004181165713816881,
0.09076347947120667,
-0.00008884158887667581,
-0.03816846385598183,
-0.09053397923707962,
0.07994600385427475,
-0.061288949102163315,
0.2568321228027344,
0.008295251987874508,
0.07639902830123901,
0.10045448690652847,
-0.01913628913462162,
-0.14631278812885284,
0.026744326576590538,
0.08721663802862167,
-0.04040316492319107,
0.060585904866456985,
0.1823117434978485,
-0.05188862979412079,
0.07998615503311157,
0.04302503913640976,
-0.13671661913394928,
-0.02844150923192501,
-0.03455718606710434,
0.013903682120144367,
-0.062038604170084,
0.01642535626888275,
-0.08212979137897491,
0.16134461760520935,
0.18891990184783936,
-0.05680166929960251,
-0.033935822546482086,
-0.07130136340856552,
0.046734489500522614,
0.04736148193478584,
0.10317935794591904,
0.00876710657030344,
-0.20335346460342407,
-0.03226965293288231,
0.019159486517310143,
0.035307131707668304,
-0.2281155288219452,
-0.1043785959482193,
0.01940816640853882,
-0.06147541105747223,
-0.033089086413383484,
0.11172143369913101,
0.059263113886117935,
0.0066741094924509525,
-0.038794342428445816,
-0.09456618875265121,
-0.04552295431494713,
0.11995761841535568,
-0.1422937959432602,
-0.058007024228572845
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
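Pending the authors' own snippet, below is a minimal sketch of how a DistilBERT sequence-classification checkpoint like this one is typically loaded with the 🤗 `pipeline` API. The repository id `exala/db_mc_8.0-93` is taken from the surrounding metadata and the label names are undocumented, so treat both as assumptions rather than the authors' recommended usage.

```python
# Minimal usage sketch, assuming standard transformers pipeline behaviour.
# The repository id below comes from the card metadata; the label names it
# returns are not documented on this card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="exala/db_mc_8.0-93",  # assumed repository id
)

# Returns a list of {"label": ..., "score": ...} dicts for each input.
print(classifier("Example input text to classify."))
```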
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-classification | exala/db_mc_8.0-93 | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:43:12+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
48,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.07302619516849518,
0.15942901372909546,
-0.0037264563143253326,
0.025167323648929596,
0.1172078475356102,
0.008749904111027718,
0.07480382919311523,
0.10722988098859787,
-0.02045559324324131,
0.12543776631355286,
0.039410512894392014,
0.10394789278507233,
0.11057476699352264,
0.19190818071365356,
-0.002767085563391447,
-0.20705340802669525,
0.06408926099538803,
-0.1160135567188263,
0.01611061953008175,
0.12252140045166016,
0.14288343489170074,
-0.10968661308288574,
0.07077977806329727,
-0.03890860080718994,
-0.02716642990708351,
-0.03298870474100113,
-0.06216486915946007,
-0.05656079575419426,
0.06601378321647644,
0.0591726079583168,
0.0694754421710968,
0.024590101093053818,
0.08098499476909637,
-0.28943222761154175,
0.019404316321015358,
0.07737266272306442,
0.0035657489206641912,
0.06202864274382591,
0.07916183769702911,
-0.07883433997631073,
0.10480276495218277,
-0.05524575710296631,
0.15781880915164948,
0.07489447295665741,
-0.0982506051659584,
-0.18013636767864227,
-0.08504346013069153,
0.09836910665035248,
0.17099453508853912,
0.05479501187801361,
-0.03633468225598335,
0.14033685624599457,
-0.08061470836400986,
0.01687866821885109,
0.06772492825984955,
-0.0681740939617157,
-0.05254287272691727,
0.05607175827026367,
0.07204149663448334,
0.09469678997993469,
-0.13314193487167358,
-0.00799859780818224,
0.04471778869628906,
0.01626773551106453,
0.10993628203868866,
0.023736488074064255,
0.12237779796123505,
0.02965191937983036,
-0.14525189995765686,
-0.06473848968744278,
0.11550955474376678,
0.035839054733514786,
-0.060212019830942154,
-0.24547284841537476,
-0.0033295010216534138,
-0.034091219305992126,
-0.026645053178071976,
-0.04267369583249092,
0.0422113798558712,
-0.030090559273958206,
0.09037542343139648,
0.006197172217071056,
-0.06834427267313004,
-0.051925547420978546,
0.09237229824066162,
0.06009524688124657,
0.026748334988951683,
-0.027690796181559563,
0.02246314287185669,
0.12014351040124893,
0.1042904481291771,
-0.11278197169303894,
-0.06414325535297394,
-0.06594311445951462,
-0.08891893923282623,
-0.04987602308392525,
0.034507665783166885,
0.07233929634094238,
0.045586712658405304,
0.20240989327430725,
0.004374812822788954,
0.051542725414037704,
0.027702245861291885,
0.0163713525980711,
0.06719023734331131,
0.06825068593025208,
-0.05008118972182274,
-0.12756529450416565,
-0.03760921210050583,
0.11946272850036621,
0.0017542883288115263,
-0.03342404216527939,
-0.0370357446372509,
0.06106860190629959,
0.04892996326088905,
0.12214437127113342,
0.0646679550409317,
0.018874529749155045,
-0.07587674260139465,
-0.046467430889606476,
0.18032068014144897,
-0.15827614068984985,
0.02335406094789505,
0.01725550927221775,
-0.0531628392636776,
-0.033948007971048355,
0.018618647009134293,
0.009236144833266735,
-0.029387421905994415,
0.1005609855055809,
-0.06605342775583267,
-0.04097803682088852,
-0.10903192311525345,
-0.055316012352705,
0.03402786701917648,
-0.025065315887331963,
-0.02789347805082798,
-0.040971264243125916,
-0.1248450055718422,
-0.07442043721675873,
0.06903292238712311,
-0.06419821083545685,
-0.06718063354492188,
-0.040221910923719406,
-0.06102044880390167,
0.014937658794224262,
0.0008266915683634579,
0.1269739717245102,
-0.02968521974980831,
0.04848431050777435,
-0.0539008304476738,
0.06914420425891876,
0.13718481361865997,
0.03272920101881027,
-0.06809919327497482,
0.06580659747123718,
-0.21232743561267853,
0.10933514684438705,
-0.09396476298570633,
0.026792176067829132,
-0.16038449108600616,
-0.02306288480758667,
0.03069896623492241,
0.039205435663461685,
-0.01574552245438099,
0.1448679268360138,
-0.1747746467590332,
-0.03626937046647072,
0.18672682344913483,
-0.12991686165332794,
-0.09265255182981491,
0.06183605268597603,
-0.0648084431886673,
0.13347013294696808,
0.05529501289129257,
-0.01992315798997879,
0.05587787926197052,
-0.13651202619075775,
-0.023517979308962822,
-0.058770496398210526,
-0.011057188734412193,
0.15450166165828705,
0.06303975731134415,
-0.04996807500720024,
0.024645399302244186,
0.017310835421085358,
-0.024148117750883102,
-0.04886231571435928,
-0.03430904448032379,
-0.09810014069080353,
0.005970593076199293,
-0.07982048392295837,
0.025509681552648544,
-0.02279755286872387,
-0.08887400478124619,
-0.040562164038419724,
-0.15593992173671722,
0.009587006643414497,
0.0986250564455986,
0.0006499737501144409,
-0.029481856152415276,
-0.09914560616016388,
0.0014640848385170102,
0.016265012323856354,
-0.010709897615015507,
-0.1529860496520996,
-0.05147454887628555,
0.025713054463267326,
-0.16740785539150238,
0.02983911894261837,
-0.04416975751519203,
0.03472619876265526,
0.04469497501850128,
-0.047529187053442,
-0.02975785918533802,
0.015605244785547256,
0.02078833244740963,
-0.024411868304014206,
-0.25051596760749817,
-0.013653411529958248,
-0.051656268537044525,
0.17981497943401337,
-0.25592783093452454,
0.04935307428240776,
0.0690855160355568,
0.12038503587245941,
0.005616906564682722,
-0.04484110698103905,
0.038755834102630615,
-0.05312656611204147,
-0.04079194739460945,
-0.06756321340799332,
-0.004968787543475628,
-0.03330003470182419,
-0.04708937928080559,
0.040533605962991714,
-0.18370530009269714,
-0.026839453727006912,
0.11585007607936859,
0.06803574413061142,
-0.17149686813354492,
-0.07743752747774124,
-0.034665726125240326,
-0.05996506288647652,
-0.08542647957801819,
-0.056485775858163834,
0.09173574298620224,
0.04302561655640602,
0.055119626224040985,
-0.07221351563930511,
-0.0563325397670269,
0.015307560563087463,
-0.011831860989332199,
-0.032375045120716095,
0.08966241031885147,
0.07603370398283005,
-0.12257120013237,
0.10713227838277817,
0.06915293633937836,
0.06829847395420074,
0.10371299833059311,
0.006018918938934803,
-0.0951351672410965,
-0.012076831422746181,
0.028954172506928444,
0.013578351587057114,
0.14422492682933807,
-0.07140666991472244,
0.03330845758318901,
0.04359918460249901,
-0.027328653261065483,
0.009608421474695206,
-0.10246647149324417,
0.018117014318704605,
0.03343784064054489,
-0.008881162852048874,
0.017250988632440567,
-0.05481864511966705,
0.014968239702284336,
0.10633815079927444,
0.03211374580860138,
0.027500580996274948,
0.01981731504201889,
-0.040416620671749115,
-0.12751449644565582,
0.1772654801607132,
-0.09383377432823181,
-0.2552470862865448,
-0.13026653230190277,
-0.009479007683694363,
0.045126691460609436,
-0.010854403488337994,
0.019198866561055183,
-0.05917074531316757,
-0.1081017553806305,
-0.10490734130144119,
0.026286281645298004,
0.054074980318546295,
-0.08816048502922058,
-0.064018115401268,
0.05169869586825371,
0.0385097898542881,
-0.12403316795825958,
0.021811455488204956,
0.046125855296850204,
-0.07025353610515594,
0.00821257010102272,
0.052987806499004364,
0.08472178876399994,
0.1826072335243225,
0.007897963747382164,
-0.016298603266477585,
0.008750800043344498,
0.2144501805305481,
-0.1484457403421402,
0.092045359313488,
0.14109621942043304,
-0.06516804546117783,
0.08377774804830551,
0.20131921768188477,
0.030504774302244186,
-0.09844772517681122,
0.03905881568789482,
0.03513709455728531,
-0.0375148244202137,
-0.24395905435085297,
-0.0748228207230568,
0.0031239830423146486,
-0.06623414903879166,
0.10724245756864548,
0.08736731112003326,
0.1171678826212883,
0.05268942564725876,
-0.11185546219348907,
-0.06449731439352036,
0.05344700068235397,
0.12066427618265152,
-0.028124094009399414,
0.0008641352178528905,
0.09650425612926483,
-0.02977217361330986,
0.02383269928395748,
0.09186029434204102,
0.018334977328777313,
0.1854310929775238,
0.04487955570220947,
0.1315774768590927,
0.08984522521495819,
0.06165572628378868,
0.01767764426767826,
0.01994951255619526,
0.022676948457956314,
0.028990833088755608,
-0.022242991253733635,
-0.0817873626947403,
-0.00921230111271143,
0.14159180223941803,
0.026489878073334694,
0.03602421656250954,
0.001440341817215085,
-0.04777481406927109,
0.07105493545532227,
0.16661210358142853,
0.012482628226280212,
-0.22979335486888885,
-0.06520283222198486,
0.07564391940832138,
-0.07074891030788422,
-0.11627703160047531,
-0.013096708804368973,
0.024812309071421623,
-0.18332423269748688,
0.04349841922521591,
-0.024669349193572998,
0.1018587276339531,
-0.11199972778558731,
-0.02344847284257412,
0.035318560898303986,
0.06107853353023529,
-0.035138774663209915,
0.07848566025495529,
-0.20783106982707977,
0.1402515470981598,
0.007242240011692047,
0.06469187885522842,
-0.10684854537248611,
0.08134520798921585,
0.020340995863080025,
0.006346969865262508,
0.1665121465921402,
-0.005634299945086241,
-0.072713203728199,
-0.09345488250255585,
-0.07864519953727722,
-0.017188850790262222,
0.0979963019490242,
-0.11784757673740387,
0.09015297889709473,
-0.007544329855591059,
-0.03196582943201065,
-0.0007019630284048617,
-0.12950846552848816,
-0.13376227021217346,
-0.18478168547153473,
0.04834262654185295,
-0.12510578334331512,
0.041554566472768784,
-0.10858581960201263,
-0.060765668749809265,
-0.041379012167453766,
0.19413886964321136,
-0.20414148271083832,
-0.08119912445545197,
-0.14911502599716187,
-0.0672706589102745,
0.11254695802927017,
-0.03948867693543434,
0.08191721886396408,
0.008871423080563545,
0.2073923498392105,
-0.004810879472643137,
0.0006135239964351058,
0.09140623360872269,
-0.09588538110256195,
-0.2094263732433319,
-0.0959051325917244,
0.13635295629501343,
0.13115985691547394,
0.04470321163535118,
0.00023247375793289393,
0.02411508932709694,
-0.0018883526790887117,
-0.11162916570901871,
0.03426937386393547,
0.15202432870864868,
0.10249507427215576,
0.044034719467163086,
-0.0260105412453413,
-0.13932733237743378,
-0.1056612879037857,
-0.054744839668273926,
0.013206261210143566,
0.1903214454650879,
-0.0706305131316185,
0.1657869964838028,
0.1536196768283844,
-0.06531279534101486,
-0.21233291923999786,
0.03679078444838524,
0.030905993655323982,
-0.00751135777682066,
0.04347773641347885,
-0.2047269195318222,
0.07352772355079651,
0.01412410382181406,
-0.05716951563954353,
0.1305869072675705,
-0.17576472461223602,
-0.14771407842636108,
0.09065452963113785,
0.07857703417539597,
-0.2075619101524353,
-0.12917637825012207,
-0.0950717106461525,
-0.05231890827417374,
-0.10034287720918655,
0.09251669049263,
-0.0036216825246810913,
0.005252200644463301,
0.036232154816389084,
0.01758572831749916,
0.01728934422135353,
-0.05098523199558258,
0.19524237513542175,
-0.00017524124996270984,
0.05021730437874794,
-0.07728931307792664,
-0.07839185744524002,
0.03842216357588768,
-0.06752927601337433,
0.08417709171772003,
-0.02161126770079136,
0.0039355861954391,
-0.11725787818431854,
-0.06764968484640121,
-0.04570414870977402,
0.03315238282084465,
-0.08949651569128036,
-0.09646400064229965,
-0.0555412657558918,
0.10287721455097198,
0.09537502378225327,
-0.03549838066101074,
-0.06785823404788971,
-0.09521738439798355,
0.05743926018476486,
0.2211635708808899,
0.18752726912498474,
0.07758046686649323,
-0.07665256410837173,
-0.008446265943348408,
-0.02362825535237789,
0.05575858801603317,
-0.2147134691476822,
0.04626009985804558,
0.03838435187935829,
0.030744675546884537,
0.1351434588432312,
-0.022784622386097908,
-0.16072605550289154,
-0.04722895845770836,
0.05541609972715378,
-0.07028964161872864,
-0.15762348473072052,
0.003693870734423399,
0.08388359844684601,
-0.15567344427108765,
-0.05364304408431053,
0.030349692329764366,
-0.03299986198544502,
-0.02724997140467167,
0.002993965055793524,
0.08165504038333893,
0.02525121532380581,
0.10604418069124222,
0.06794179975986481,
0.11212385445833206,
-0.10361232608556747,
0.07820820808410645,
0.08721207082271576,
-0.11143109202384949,
0.03750693425536156,
0.059706296771764755,
-0.06430401653051376,
-0.03306615725159645,
0.028105957433581352,
0.08702781051397324,
0.02858729287981987,
-0.07410863786935806,
0.0023060773964971304,
-0.11285153776407242,
0.06773319095373154,
0.13773435354232788,
0.037572041153907776,
0.009064391255378723,
0.04253077879548073,
0.030666319653391838,
-0.1025259718298912,
0.11677869409322739,
0.04715273529291153,
0.03828616067767143,
-0.053534768521785736,
-0.002754961373284459,
0.04357896372675896,
-0.015574077144265175,
-0.017309002578258514,
-0.03927738964557648,
-0.06638500094413757,
-0.009345067664980888,
-0.16059128940105438,
0.027963994070887566,
-0.06438141316175461,
0.011313637718558311,
0.015024027787148952,
-0.02930280566215515,
0.006326301023364067,
0.010901868343353271,
-0.07644513994455338,
-0.04005778953433037,
-0.0025265931617468596,
0.11033432930707932,
-0.16255317628383636,
0.006753581576049328,
0.08725008368492126,
-0.12882095575332642,
0.07888396829366684,
-0.003228981513530016,
-0.008663777261972427,
0.019871357828378677,
-0.1389452964067459,
0.06426677107810974,
-0.007317067123949528,
0.006886337883770466,
0.024405626580119133,
-0.20780570805072784,
0.002691886154934764,
-0.049495045095682144,
-0.06124653294682503,
-0.003442719578742981,
-0.03931323438882828,
-0.11277955025434494,
0.10321920365095139,
0.017737101763486862,
-0.08050814270973206,
-0.018862100318074226,
0.05358913913369179,
0.11278057098388672,
-0.053978823125362396,
0.14271147549152374,
-0.018007846549153328,
0.05715036392211914,
-0.1816556304693222,
-0.017987793311476707,
-0.017368610948324203,
0.016075139865279198,
-0.03470727428793907,
-0.008873502723872662,
0.05237460881471634,
-0.01958826184272766,
0.22800102829933167,
-0.023029034957289696,
0.01981639862060547,
0.06532696634531021,
0.0016252564964815974,
-0.010984939523041248,
0.09684767574071884,
0.048498742282390594,
0.015143456868827343,
0.0203377865254879,
0.013252451084554195,
-0.04566340893507004,
-0.008616970852017403,
-0.12847718596458435,
0.08234056085348129,
0.1677752137184143,
0.08175479620695114,
-0.006052352488040924,
0.047567352652549744,
-0.11316590011119843,
-0.09173060953617096,
0.10132203251123428,
-0.03303218260407448,
-0.013127516023814678,
-0.05242474004626274,
0.1442553550004959,
0.15683847665786743,
-0.1846613585948944,
0.0673123374581337,
-0.06864999234676361,
-0.058019280433654785,
-0.10558338463306427,
-0.17708730697631836,
-0.0631738007068634,
-0.033932529389858246,
-0.009048123843967915,
-0.060769032686948776,
0.06745719909667969,
0.10813924670219421,
0.01437336578965187,
0.004817943554371595,
0.08580505102872849,
-0.03281113877892494,
0.006333827041089535,
0.04443316161632538,
0.052908364683389664,
0.015542974695563316,
-0.06320759654045105,
0.004275370854884386,
0.006610610987991095,
0.0376921184360981,
0.055147934705019,
0.030873596668243408,
-0.0092905443161726,
0.007207514252513647,
-0.020693093538284302,
-0.10057692229747772,
0.04111333191394806,
-0.025823315605521202,
-0.047910936176776886,
0.1509503871202469,
0.020467912778258324,
-0.003414076054468751,
-0.022258523851633072,
0.2298767864704132,
-0.06479788571596146,
-0.07484833151102066,
-0.13822507858276367,
0.14135941863059998,
-0.03916965425014496,
0.05368134006857872,
0.049936410039663315,
-0.10397564619779587,
0.03804606944322586,
0.14477981626987457,
0.14261196553707123,
-0.034453462809324265,
0.008940902538597584,
0.009526451118290424,
0.004399977158755064,
-0.02350606769323349,
0.05356355383992195,
0.04485337436199188,
0.11325705051422119,
-0.06528755277395248,
0.09648586809635162,
-0.005538135301321745,
-0.09084830433130264,
-0.019364742562174797,
0.1391776204109192,
0.002899263286963105,
0.024846963584423065,
-0.08323919028043747,
0.12169293314218521,
-0.06053123623132706,
-0.2529715597629547,
0.06497339904308319,
-0.06441039592027664,
-0.1503337174654007,
-0.019829563796520233,
0.015622834675014019,
-0.0025740223936736584,
0.022466620430350304,
0.06178610026836395,
-0.06470615416765213,
0.15161879360675812,
0.03660573810338974,
-0.07138057053089142,
-0.07539889216423035,
0.07816334068775177,
-0.08136013150215149,
0.30430659651756287,
0.007375960238277912,
0.05443240702152252,
0.09480572491884232,
-0.03690790757536888,
-0.13316340744495392,
0.0335354208946228,
0.09097745269536972,
-0.047231536358594894,
0.06487338244915009,
0.20800761878490448,
-0.011225960217416286,
0.11401397734880447,
0.07447969168424606,
-0.08660271763801575,
0.05096733942627907,
-0.0917983278632164,
-0.09906064718961716,
-0.0893944799900055,
0.0902828648686409,
-0.059031881392002106,
0.1506001204252243,
0.12994202971458435,
-0.04605574533343315,
0.005047217011451721,
-0.022221196442842484,
0.05354851856827736,
-0.0026379574555903673,
0.11034536361694336,
0.03008626215159893,
-0.19489215314388275,
0.03033076599240303,
-0.00037526662345044315,
0.10122878104448318,
-0.25035029649734497,
-0.08561131358146667,
0.03936697542667389,
-0.007475157734006643,
-0.057129982858896255,
0.12413015216588974,
0.054405856877565384,
0.047805771231651306,
-0.05493326112627983,
-0.05230220779776573,
-0.007250586990267038,
0.1655176728963852,
-0.10096944123506546,
-0.0014428504509851336
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ccai-sft-qlora
This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) on the HuggingFaceH4/cai-conversation-harmless dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6977
## Model description
More information needed
## Intended uses & limitations
More information needed
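More details are needed here, but as a hedged starting point the adapter can presumably be attached to the base model for inference in the usual PEFT way. The sketch below assumes standard `peft`/`transformers` APIs, the adapter id `saffr0n/ccai-sft-qlora` from this card's metadata, and the Mistral-instruct prompt format; it is not an official usage recipe.

```python
# Minimal inference sketch: load the base Mistral-7B-Instruct-v0.2 model and
# attach this QLoRA adapter on top of it. Assumes peft and transformers are
# installed; for true QLoRA-style memory savings the base model could also be
# loaded in 4-bit, which is omitted here for brevity.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "saffr0n/ccai-sft-qlora"  # assumed from the card metadata

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA weights

# Mistral-instruct expects the [INST] ... [/INST] prompt format.
prompt = "[INST] How can I stay safe online? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```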
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch reproducing them follows the list):
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
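
The values above can be written out as a `transformers.TrainingArguments` configuration. This is a hedged sketch only: it mirrors the listed hyperparameters, but the actual trainer wiring (QLoRA quantization, LoRA adapter config, chat templating, and the multi-GPU launcher) is not documented on this card and is omitted here.

```python
# Hedged sketch of a TrainingArguments object mirroring the hyperparameters
# listed above; the output directory is an assumption, not taken from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ccai-sft-qlora",      # assumed output directory
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # together with the batch size this matches the reported total of 8
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```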
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.6894 | 0.99 | 38 | 0.6977 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.15.2 | {"license": "apache-2.0", "library_name": "peft", "tags": ["alignment-handbook", "generated_from_trainer", "trl", "sft", "generated_from_trainer"], "datasets": ["HuggingFaceH4/cai-conversation-harmless"], "base_model": "mistralai/Mistral-7B-Instruct-v0.2", "model-index": [{"name": "ccai-sft-qlora", "results": []}]} | null | saffr0n/ccai-sft-qlora | [
"peft",
"tensorboard",
"safetensors",
"mistral",
"alignment-handbook",
"generated_from_trainer",
"trl",
"sft",
"dataset:HuggingFaceH4/cai-conversation-harmless",
"base_model:mistralai/Mistral-7B-Instruct-v0.2",
"license:apache-2.0",
"4-bit",
"region:us"
] | 2024-02-12T11:45:10+00:00 | [] | [] | TAGS
#peft #tensorboard #safetensors #mistral #alignment-handbook #generated_from_trainer #trl #sft #dataset-HuggingFaceH4/cai-conversation-harmless #base_model-mistralai/Mistral-7B-Instruct-v0.2 #license-apache-2.0 #4-bit #region-us
| ccai-sft-qlora
==============
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.2 on the HuggingFaceH4/cai-conversation-harmless dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6977
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0002
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* distributed\_type: multi-GPU
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 8
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 1
### Training results
### Framework versions
* PEFT 0.7.1
* Transformers 4.36.2
* Pytorch 2.1.2+cu121
* Datasets 2.14.6
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.15.2"
] | [
"TAGS\n#peft #tensorboard #safetensors #mistral #alignment-handbook #generated_from_trainer #trl #sft #dataset-HuggingFaceH4/cai-conversation-harmless #base_model-mistralai/Mistral-7B-Instruct-v0.2 #license-apache-2.0 #4-bit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.15.2"
] | [
90,
155,
4,
39
] | [
"passage: TAGS\n#peft #tensorboard #safetensors #mistral #alignment-handbook #generated_from_trainer #trl #sft #dataset-HuggingFaceH4/cai-conversation-harmless #base_model-mistralai/Mistral-7B-Instruct-v0.2 #license-apache-2.0 #4-bit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.15.2"
] | [
-0.1173187643289566,
0.0973186045885086,
-0.0045649390667676926,
0.09824132174253464,
0.08255615830421448,
0.04596135765314102,
0.13856005668640137,
0.12425487488508224,
-0.018879126757383347,
0.14383584260940552,
0.11181817203760147,
0.04166674613952637,
0.08430144190788269,
0.17571264505386353,
-0.020031318068504333,
-0.24849911034107208,
0.014271877706050873,
-0.03811126947402954,
-0.09809470921754837,
0.10422591120004654,
0.07680810987949371,
-0.10484348237514496,
0.073389433324337,
-0.019418716430664062,
-0.12439442425966263,
-0.052547674626111984,
-0.04997926577925682,
-0.00329189351759851,
0.10260219126939774,
0.029690789058804512,
0.10350712388753891,
0.04233711585402489,
0.10775884240865707,
-0.2198697030544281,
0.00927126407623291,
0.042799320071935654,
0.010846350342035294,
0.07845308631658554,
0.10201473534107208,
0.016152245923876762,
0.07563327997922897,
-0.10806810110807419,
0.0651298388838768,
0.016960006207227707,
-0.12629102170467377,
-0.17422111332416534,
-0.09521762281656265,
0.07372230291366577,
0.1202208399772644,
0.04900749772787094,
-0.016161270439624786,
0.08512156456708908,
-0.05251765251159668,
0.07073428481817245,
0.23894552886486053,
-0.24305948615074158,
-0.08296408504247665,
0.0336056724190712,
0.021402982994914055,
0.07477342337369919,
-0.12342531979084015,
-0.030239105224609375,
0.017212707549333572,
0.019195418804883957,
0.09927906095981598,
0.00041021453216671944,
0.03580186888575554,
-0.005629531107842922,
-0.13469372689723969,
-0.05452784523367882,
0.10277481377124786,
0.07338963449001312,
-0.014639765955507755,
-0.11163873225450516,
-0.04457235708832741,
-0.1744590401649475,
-0.053432896733284,
0.00037032508407719433,
0.028599059209227562,
-0.05053577572107315,
-0.04459819942712784,
0.03240727633237839,
-0.08259652554988861,
-0.06958942860364914,
0.022280298173427582,
0.0741853192448616,
0.04475133493542671,
-0.017440592870116234,
0.04483387991786003,
0.11786950379610062,
0.017193570733070374,
-0.16269981861114502,
-0.010692530311644077,
0.013435439206659794,
-0.10539810359477997,
-0.030059779062867165,
0.005109626334160566,
0.06993241608142853,
0.050153978168964386,
0.17418980598449707,
-0.04997475445270538,
0.07880860567092896,
0.06680682301521301,
0.01945079118013382,
-0.07827089726924896,
0.1378118246793747,
-0.057247307151556015,
-0.07267246395349503,
-0.037586066871881485,
0.145114928483963,
0.031212501227855682,
-0.017257029190659523,
-0.06722985208034515,
0.030426695942878723,
0.08471257239580154,
0.04277494549751282,
-0.004860205575823784,
0.014308939687907696,
-0.0879485160112381,
-0.026402387768030167,
0.12968508899211884,
-0.09221044182777405,
0.05685259774327278,
0.04793347790837288,
-0.036705199629068375,
-0.04143531993031502,
0.016344008967280388,
0.008725752122700214,
0.005472660530358553,
0.08843507617712021,
-0.08938521146774292,
-0.001436823047697544,
-0.046602874994277954,
-0.07555992901325226,
0.02708260715007782,
-0.0766826868057251,
-0.010353093035519123,
-0.06810780614614487,
-0.07629936188459396,
-0.0653156042098999,
0.03328259661793709,
-0.07728260010480881,
-0.06491066515445709,
-0.061594508588314056,
-0.07922136038541794,
0.041322823613882065,
0.00893749762326479,
0.09854894131422043,
-0.061148885637521744,
0.07111919671297073,
-0.01669609360396862,
0.062418609857559204,
0.05915366858243942,
0.03599002957344055,
-0.03659198060631752,
0.07208995521068573,
-0.1716984510421753,
0.051436301320791245,
-0.09027770906686783,
0.06259404867887497,
-0.12321862578392029,
-0.08474896848201752,
0.014470761641860008,
-0.03467454016208649,
0.08476567268371582,
0.13518691062927246,
-0.1545642912387848,
-0.06340988725423813,
0.1845768839120865,
-0.08409232646226883,
-0.10491739213466644,
0.13986411690711975,
-0.009195888414978981,
-0.057779889553785324,
0.0007523834938183427,
0.16805055737495422,
0.12379568815231323,
-0.13822080194950104,
-0.008779287338256836,
-0.016523001715540886,
0.11158943176269531,
0.05300338193774223,
0.0891152024269104,
-0.029465366154909134,
0.05622823163866997,
0.003133864840492606,
-0.056325122714042664,
0.04756384342908859,
-0.08616410940885544,
-0.08836165815591812,
-0.026756400242447853,
-0.07147036492824554,
0.01257920078933239,
0.05491601303219795,
0.007627787999808788,
-0.07109279185533524,
-0.10385140776634216,
-0.05603712797164917,
0.10960157960653305,
-0.08613134920597076,
0.009106227196753025,
-0.035590190440416336,
0.09168757498264313,
-0.007397431414574385,
-0.016773611307144165,
-0.14070303738117218,
-0.10960469394922256,
0.05654577165842056,
-0.02767528034746647,
0.0067808786407113075,
-0.01342687290161848,
0.07250836491584778,
0.10003586113452911,
-0.04510039463639259,
-0.059915587306022644,
-0.03039785847067833,
-0.006881611421704292,
-0.08404736965894699,
-0.24296531081199646,
-0.05001700669527054,
-0.03449256345629692,
0.21702322363853455,
-0.2214706391096115,
0.012231536209583282,
0.011578209698200226,
0.1343192309141159,
0.027865031734108925,
-0.05571577697992325,
0.00177656183950603,
0.048404939472675323,
-0.021286074072122574,
-0.08860216289758682,
0.04419448599219322,
-0.005651857703924179,
-0.0955277532339096,
-0.00752516882494092,
-0.16776864230632782,
0.08138348162174225,
0.07053425163030624,
0.07217881083488464,
-0.12369831651449203,
-0.09413149952888489,
-0.06719174236059189,
-0.05845548212528229,
-0.021879596635699272,
0.014808171428740025,
0.1423972100019455,
0.0154406214132905,
0.09408066421747208,
-0.0683676153421402,
-0.05165766924619675,
0.035378698259592056,
-0.005726389121264219,
-0.010755414143204689,
0.1561642587184906,
0.05091581866145134,
-0.10463111847639084,
0.11641084402799606,
0.1122705489397049,
-0.04839782789349556,
0.10618631541728973,
-0.07702786475419998,
-0.081779845058918,
-0.05679526552557945,
0.059136152267456055,
0.03570075333118439,
0.12663131952285767,
-0.04929971322417259,
0.012441248632967472,
0.03485612943768501,
0.0029732664115726948,
-0.015060175210237503,
-0.1854507029056549,
-0.017556652426719666,
0.02868047170341015,
-0.06745285540819168,
0.024456290528178215,
-0.0152580002322793,
-0.006327132694423199,
0.09825903177261353,
-0.004496815614402294,
-0.08615461736917496,
-0.03858072683215141,
-0.01630176417529583,
-0.07881785184144974,
0.19752687215805054,
-0.09222651273012161,
-0.11199506372213364,
-0.10335075110197067,
0.031165409833192825,
-0.01857849396765232,
-0.007690502796322107,
0.022382697090506554,
-0.05255015939474106,
-0.04395380616188049,
-0.09652858972549438,
-0.030533090233802795,
0.03330577164888382,
0.03080029971897602,
0.016034971922636032,
-0.01983608864247799,
0.05274057015776634,
-0.08159781992435455,
0.00481262244284153,
-0.009727531112730503,
-0.024594495072960854,
0.04951358959078789,
0.013084808364510536,
0.113947294652462,
0.15219560265541077,
0.057236406952142715,
0.0036742626689374447,
-0.026818983256816864,
0.1925521045923233,
-0.10494638234376907,
0.02401582896709442,
0.054371993988752365,
-0.01360591035336256,
0.06941492855548859,
0.1652262657880783,
0.04636389762163162,
-0.07031507045030594,
0.00026104727294296026,
0.02474462427198887,
-0.022922039031982422,
-0.18141190707683563,
-0.043786633759737015,
-0.05040600150823593,
0.024486204609274864,
0.1193239688873291,
0.03352069482207298,
-0.011960403062403202,
0.03720015659928322,
-0.017565689980983734,
-0.017483098432421684,
0.037570104002952576,
0.06003911420702934,
0.011957052163779736,
0.04022591561079025,
0.11123304814100266,
-0.023545442149043083,
-0.034682873636484146,
0.04808378219604492,
0.006304020527750254,
0.2488371729850769,
-0.02587904967367649,
0.19829003512859344,
0.04249200597405434,
0.16583853960037231,
-0.013235838152468204,
0.04133109748363495,
0.02198156714439392,
-0.02661164663732052,
0.0015958257718011737,
-0.05019635334610939,
0.010134285315871239,
0.04371865093708038,
0.07569698244333267,
0.016585730016231537,
-0.08423643559217453,
0.02513827197253704,
0.04535067081451416,
0.27583548426628113,
0.08816457539796829,
-0.30496540665626526,
-0.07191383093595505,
0.021959424018859863,
-0.021669359877705574,
-0.021476324647665024,
-0.011017359793186188,
0.15242066979408264,
-0.07946981489658356,
0.08095216751098633,
-0.060653332620859146,
0.07040825486183167,
-0.054894350469112396,
-0.009165926836431026,
0.07014341652393341,
0.08896829187870026,
0.0007779420702718198,
0.03608335554599762,
-0.18059378862380981,
0.27845633029937744,
-0.007285105995833874,
0.025574488565325737,
-0.035896215587854385,
0.025493135675787926,
0.026105288416147232,
0.04706493765115738,
0.10281229019165039,
-0.0010669745970517397,
-0.08088046312332153,
-0.19361735880374908,
-0.14062289893627167,
0.012727165594696999,
0.11475691944360733,
-0.08533811569213867,
0.0981379896402359,
-0.018181530758738518,
-0.04297613725066185,
0.030312636867165565,
-0.05179327726364136,
-0.06718645244836807,
-0.11646562814712524,
0.03874678909778595,
-0.03231251612305641,
-0.02557632513344288,
-0.07270833849906921,
-0.10164029151201248,
-0.1090594083070755,
0.14395540952682495,
-0.08816307783126831,
-0.026923244819045067,
-0.1352062076330185,
0.05103441700339317,
0.16246397793293,
-0.09014366567134857,
0.03585309535264969,
-0.007005356717854738,
0.09796704351902008,
0.011837953701615334,
-0.009924020618200302,
0.09353738278150558,
-0.07752785831689835,
-0.22613607347011566,
-0.0567304790019989,
0.14788657426834106,
0.053180206567049026,
0.06603783369064331,
-0.024052292108535767,
0.03980524092912674,
-0.01026616059243679,
-0.103953056037426,
0.060852229595184326,
0.056428212672472,
0.06173640117049217,
0.0356435663998127,
-0.014460338279604912,
-0.004167167469859123,
-0.05716191604733467,
-0.031315870583057404,
0.08585179597139359,
0.3328632712364197,
-0.10365500301122665,
0.05393054708838463,
0.02356337197124958,
-0.04989286884665489,
-0.1659584939479828,
-0.05136478692293167,
0.11912735551595688,
0.041427019983530045,
0.02339157648384571,
-0.13003475964069366,
0.038134314119815826,
0.08576285094022751,
-0.023268843069672585,
0.08600448817014694,
-0.3019276559352875,
-0.14039336144924164,
0.08485523611307144,
0.08432121574878693,
-0.04948492348194122,
-0.16992777585983276,
-0.05957186594605446,
0.030203698202967644,
-0.10629814863204956,
0.06802038103342056,
-0.02217116765677929,
0.11461439728736877,
-0.026011597365140915,
0.0004981919191777706,
0.01936611719429493,
-0.04792525991797447,
0.19371318817138672,
0.008519290946424007,
0.06331120431423187,
-0.032499879598617554,
-0.01196887344121933,
-0.008072438649833202,
-0.07464192807674408,
0.018622327595949173,
-0.0826481431722641,
0.030244123190641403,
-0.08821319788694382,
-0.003628317965194583,
-0.07641750574111938,
0.012093814089894295,
-0.05544346198439598,
-0.012638899497687817,
-0.05272984877228737,
0.05160374939441681,
0.057602763175964355,
-0.0018446182366460562,
0.11740715056657791,
-0.005766662769019604,
0.1296951174736023,
0.12382788956165314,
0.06280024349689484,
0.010795267298817635,
-0.10345850139856339,
-0.023758096620440483,
-0.013902870938181877,
0.01296195201575756,
-0.1124921590089798,
0.020624829456210136,
0.15069299936294556,
0.03029470518231392,
0.13286808133125305,
0.043244268745183945,
-0.06965073198080063,
-0.008807593025267124,
0.07198485732078552,
-0.11786340922117233,
-0.1330164223909378,
0.010165133513510227,
-0.017579250037670135,
-0.15230196714401245,
-0.017260795459151268,
0.11900453269481659,
-0.02257853001356125,
-0.014261312782764435,
0.008207940496504307,
0.05676066130399704,
-0.01424427330493927,
0.21377885341644287,
0.03557971492409706,
0.0748133435845375,
-0.09000932425260544,
0.06151300668716431,
0.05113512650132179,
-0.11208885163068771,
0.006020262371748686,
0.11022711545228958,
-0.08452136069536209,
-0.031456854194402695,
0.09266416728496552,
0.12575888633728027,
0.011958378367125988,
-0.01607636548578739,
-0.11094213277101517,
-0.1330983191728592,
0.07738518714904785,
0.08714327216148376,
0.04958813637495041,
0.030268747359514236,
0.005706924945116043,
0.016475992277264595,
-0.0701417326927185,
0.1181357279419899,
0.09194259345531464,
0.08670014142990112,
-0.11991077661514282,
0.09021548926830292,
-0.01575237512588501,
0.014507781714200974,
-0.007426316384226084,
0.033125635236501694,
-0.13432389497756958,
-0.032871074974536896,
-0.06649819761514664,
0.02472786419093609,
-0.06133059784770012,
0.003492403542622924,
0.00946680549532175,
-0.06692706048488617,
-0.019524557515978813,
0.002989415777847171,
-0.0845550075173378,
-0.04908081516623497,
-0.029701370745897293,
0.08025597780942917,
-0.10630931705236435,
-0.050521593540906906,
0.048355042934417725,
-0.11408443748950958,
0.10883383452892303,
0.023074114695191383,
0.045895133167505264,
-0.002724533202126622,
-0.11014481633901596,
0.037095218896865845,
0.038158118724823,
0.003885669633746147,
0.03605777397751808,
-0.17028166353702545,
-0.025629624724388123,
-0.06285809725522995,
-0.006001640576869249,
0.015402593649923801,
0.05752760171890259,
-0.1024584248661995,
0.020024705678224564,
-0.04988369345664978,
-0.09046389162540436,
-0.06205550953745842,
0.03721141815185547,
0.06935553252696991,
0.010882310569286346,
0.11570937186479568,
-0.080741748213768,
0.0708296000957489,
-0.2349458932876587,
-0.01826801709830761,
0.0032477204222232103,
-0.06298304349184036,
-0.03812548890709877,
-0.03911259025335312,
0.09593984484672546,
-0.04064759984612465,
0.052558451890945435,
-0.05627736821770668,
0.025974256917834282,
0.00916723720729351,
-0.009506157599389553,
0.04382315278053284,
0.06801289319992065,
0.09189257025718689,
0.024914760142564774,
-0.04149879515171051,
0.04763200134038925,
-0.0011129691265523434,
0.05722056329250336,
0.05080365389585495,
0.17751780152320862,
0.10500326752662659,
0.006375383120030165,
0.057862576097249985,
0.05912468954920769,
-0.16243700683116913,
-0.11860422790050507,
0.10109439492225647,
-0.08539018034934998,
0.10811153799295425,
-0.029027367010712624,
0.15580880641937256,
0.08217693120241165,
-0.2269124984741211,
0.02851487509906292,
-0.043958015739917755,
-0.10999230295419693,
-0.08732102811336517,
-0.058762628585100174,
-0.0852971151471138,
-0.1334923505783081,
-0.005511757452040911,
-0.09957017749547958,
0.035854440182447433,
0.09082373231649399,
0.026664607226848602,
0.0510023832321167,
0.12485863268375397,
0.06150990352034569,
0.03062530979514122,
0.06059667095541954,
0.05738714337348938,
-0.006323789246380329,
-0.04013075679540634,
-0.08397438377141953,
0.016935888677835464,
-0.02731345407664776,
0.0451827272772789,
-0.053517211228609085,
-0.014584202319383621,
0.07173768430948257,
0.013120211660861969,
-0.1014447882771492,
0.023139750584959984,
-0.015357764437794685,
0.02437087707221508,
0.05741695687174797,
0.021228456869721413,
0.022510146722197533,
-0.018763689324259758,
0.175362691283226,
-0.06658711284399033,
-0.04844091460108757,
-0.13307616114616394,
0.20444513857364655,
-0.04560800641775131,
-0.014239358715713024,
0.038428083062171936,
-0.057123348116874695,
0.004824422299861908,
0.13413217663764954,
0.16339831054210663,
-0.06381230056285858,
0.0016996967606246471,
0.019908923655748367,
-0.00783722847700119,
-0.009180688299238682,
0.08054465055465698,
0.10301665216684341,
0.06933198869228363,
-0.08497447520494461,
-0.02371152490377426,
-0.02345181256532669,
-0.03929808735847473,
-0.05545732006430626,
0.0351468101143837,
0.02756035514175892,
0.024668080732226372,
-0.03346981853246689,
0.07219239324331284,
-0.0348314493894577,
-0.08225599676370621,
0.10363472253084183,
-0.2157265841960907,
-0.18311192095279694,
-0.02417127415537834,
0.06070806086063385,
-0.0013943902449682355,
0.05543931946158409,
-0.013394556939601898,
-0.032269254326820374,
0.13688687980175018,
-0.02313537895679474,
-0.06173423305153847,
-0.06173069402575493,
0.04772167652845383,
-0.12089492380619049,
0.18338049948215485,
-0.03483124449849129,
0.05947568267583847,
0.1204335018992424,
0.0317743644118309,
-0.1084584891796112,
0.0071739088743925095,
0.08407735824584961,
-0.08017420768737793,
0.02874833159148693,
0.14608369767665863,
-0.039130158722400665,
0.06945459544658661,
0.06376788765192032,
-0.08383399993181229,
0.0033907678443938494,
-0.04202963784337044,
-0.02315252274274826,
-0.03724948689341545,
-0.004145943559706211,
-0.04918660596013069,
0.1609071046113968,
0.1833069920539856,
-0.04475957900285721,
-0.013920767232775688,
-0.005021090153604746,
0.017471402883529663,
0.04745466262102127,
0.14080601930618286,
-0.021776584908366203,
-0.241010382771492,
0.04402532801032066,
-0.011448582634329796,
0.03887622430920601,
-0.17970041930675507,
-0.12003359198570251,
0.00993933342397213,
-0.04166198894381523,
-0.08871320635080338,
0.13132235407829285,
0.02972639724612236,
0.035113248974084854,
-0.047170013189315796,
-0.10028091073036194,
-0.03597506135702133,
0.1432323008775711,
-0.14388161897659302,
-0.06975885480642319
] |
null | null | sentence-transformers |
# trtd56/practical_nlp_course_7
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('trtd56/practical_nlp_course_7')
embeddings = model.encode(sentences)
print(embeddings)
```
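The card lists clustering and semantic search as typical uses. As an illustrative sketch only (not part of the original card), a small cosine-similarity search with this model could look like the following; the corpus and query sentences are made-up placeholders:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('trtd56/practical_nlp_course_7')

# A toy corpus and a query; both are illustrative placeholders
corpus = [
    "A man is eating food.",
    "A man is riding a horse.",
    "The new movie is awesome.",
]
query = "Someone is eating a meal."

# Encode the corpus and the query into the 384-dimensional embedding space
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity between the query and every corpus sentence
scores = util.cos_sim(query_embedding, corpus_embeddings)[0]

# Rank corpus sentences by similarity to the query
for sentence, score in sorted(zip(corpus, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {sentence}")
```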
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean Pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('trtd56/practical_nlp_course_7')
model = AutoModel.from_pretrained('trtd56/practical_nlp_course_7')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=trtd56/practical_nlp_course_7)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 329 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 100,
"weight_decay": 0.01
}
```
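Taken together, the DataLoader, loss, and fit() parameters above describe a standard sentence-transformers fine-tuning run. The sketch below is a hedged reconstruction under those settings; the starting checkpoint and the training pairs are assumptions, since the card does not name the dataset or the base model:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Assumed 384-dim starting checkpoint; the card does not say which model was fine-tuned
model = SentenceTransformer('sentence-transformers/paraphrase-MiniLM-L6-v2')

# Placeholder sentence pairs with similarity labels in [0, 1]
train_examples = [
    InputExample(texts=['A sentence', 'A very similar sentence'], label=0.9),
    InputExample(texts=['A sentence', 'An unrelated sentence'], label=0.1),
]

# batch_size 16 with a random sampler, matching the DataLoader parameters above
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)
train_loss = losses.CosineSimilarityLoss(model)

# Mirrors the fit() parameters listed in the card
model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    scheduler='WarmupLinear',
    warmup_steps=100,
    optimizer_params={'lr': 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```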
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | {"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"], "pipeline_tag": "sentence-similarity"} | sentence-similarity | trtd56/practical_nlp_course_7 | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:46:25+00:00 | [] | [] | TAGS
#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us
|
# trtd56/practical_nlp_course_7
This is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have sentence-transformers installed:
Then you can use the model like this:
## Usage (HuggingFace Transformers)
Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL
## Training
The model was trained with the parameters:
DataLoader:
'URL.dataloader.DataLoader' of length 329 with parameters:
Loss:
'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss'
Parameters of the fit()-Method:
## Full Model Architecture
## Citing & Authors
| [
"# trtd56/practical_nlp_course_7\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 329 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
"TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n",
"# trtd56/practical_nlp_course_7\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.",
"## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:",
"## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.",
"## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL",
"## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 329 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:",
"## Full Model Architecture",
"## Citing & Authors"
] | [
43,
59,
38,
64,
29,
78,
5,
6
] | [
"passage: TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n# trtd56/practical_nlp_course_7\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 329 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors"
] | [
-0.029745081439614296,
0.1035170927643776,
-0.006478891242295504,
0.04829687252640724,
0.10160480439662933,
0.025649413466453552,
0.1398513913154602,
0.08236680179834366,
-0.029154924675822258,
0.08523841202259064,
0.010278905741870403,
0.09800415486097336,
0.0024589430540800095,
0.025706853717565536,
0.021774912253022194,
-0.28010988235473633,
0.03198075294494629,
-0.05629697069525719,
0.002057786798104644,
0.07363142818212509,
0.12179258465766907,
-0.062424223870038986,
0.05481238663196564,
0.004659469239413738,
-0.04215047508478165,
0.01713941991329193,
-0.01498123724013567,
-0.029488040134310722,
0.09834898263216019,
0.06497624516487122,
0.048853397369384766,
0.011986925266683102,
0.017988374456763268,
-0.20765943825244904,
0.016487423330545425,
0.07587043941020966,
-0.024039892479777336,
0.05883752182126045,
0.028753355145454407,
-0.04208437353372574,
0.16531386971473694,
-0.08284570276737213,
0.0695311650633812,
0.053131189197301865,
-0.1262541115283966,
-0.06389839947223663,
-0.05090736970305443,
0.002409291220828891,
0.1347520500421524,
0.09286702424287796,
-0.061817508190870285,
0.1366463601589203,
-0.04924468696117401,
0.08558901399374008,
0.08720160275697708,
-0.29236528277397156,
-0.04023520275950432,
0.028534458950161934,
0.07369274646043777,
0.03512491658329964,
-0.10386179387569427,
0.015503051690757275,
-0.009982721880078316,
0.0380609966814518,
0.07215981930494308,
-0.040462758392095566,
0.052255090326070786,
-0.004207897000014782,
-0.12001647800207138,
0.010721814818680286,
0.19478526711463928,
0.03160078823566437,
-0.018515469506382942,
-0.19824060797691345,
-0.086224265396595,
0.08573541790246964,
-0.04514545574784279,
-0.04354014992713928,
0.024262862280011177,
0.05748875066637993,
-0.007627985440194607,
-0.08900473266839981,
-0.09966698288917542,
-0.01192252803593874,
-0.07240097969770432,
0.013879590667784214,
-0.0078848572447896,
-0.051729436963796616,
0.004757686518132687,
0.06248747557401657,
-0.06633897870779037,
-0.11005683243274689,
-0.013208428397774696,
-0.04142572358250618,
-0.11566640436649323,
-0.04724564775824547,
-0.059794194996356964,
-0.09274250268936157,
0.04036884009838104,
0.15135088562965393,
0.07603774219751358,
0.014101899228990078,
-0.029401367530226707,
0.05128074437379837,
0.01577482372522354,
0.15973277390003204,
-0.05573039501905441,
-0.08764802664518356,
-0.02916448377072811,
0.04537304490804672,
0.009518847800791264,
-0.01563224196434021,
-0.030920574441552162,
-0.003563446458429098,
0.027760107070207596,
0.06594938039779663,
0.06448383629322052,
0.06286453455686569,
-0.05076198652386665,
-0.03168601542711258,
0.04566050320863724,
-0.12532564997673035,
0.024885568767786026,
0.010881338268518448,
-0.044805847108364105,
0.01902967318892479,
0.08263722062110901,
-0.011457000859081745,
-0.07246704399585724,
-0.0027312729507684708,
-0.10101475566625595,
-0.025968413800001144,
-0.05249577760696411,
-0.13564297556877136,
-0.0025277931708842516,
0.015245042741298676,
-0.04080679640173912,
-0.10250868648290634,
-0.13353385031223297,
-0.07535677403211594,
0.031460728496313095,
-0.03743351623415947,
-0.008389509283006191,
-0.11863210052251816,
-0.0232083760201931,
0.004487359896302223,
0.0016969440039247274,
-0.08005426824092865,
-0.007794703356921673,
0.019200554117560387,
-0.05269373953342438,
0.059014275670051575,
0.05601014196872711,
0.03104495443403721,
-0.11948516219854355,
0.026180176064372063,
-0.13741324841976166,
0.15670806169509888,
-0.03843049705028534,
0.07000000774860382,
-0.1362788826227188,
0.025121646001935005,
0.0033650901168584824,
0.06837895512580872,
0.002887170063331723,
0.15709000825881958,
-0.2135496884584427,
-0.0665012001991272,
0.1477864533662796,
-0.06602742522954941,
-0.10007413476705551,
0.11322135478258133,
-0.03079035133123398,
0.1357315182685852,
0.1136438399553299,
0.11915411800146103,
0.11789143830537796,
-0.05031775310635567,
-0.006196693982928991,
0.019852302968502045,
-0.06355646997690201,
0.1503657102584839,
0.03571495786309242,
-0.05735780671238899,
0.0723651647567749,
-0.00788883212953806,
-0.046595972031354904,
0.007553087547421455,
-0.00294921244494617,
-0.05428031459450722,
0.012338810600340366,
-0.03133075311779976,
0.06509774923324585,
-0.034469958394765854,
-0.00577704468742013,
0.0020659405272454023,
-0.10848448425531387,
0.10428161919116974,
0.07298962771892548,
-0.07122302800416946,
0.014580797404050827,
-0.09160582721233368,
0.008715976029634476,
-0.001106461975723505,
0.01394679769873619,
-0.20183932781219482,
-0.1411273181438446,
0.01700662076473236,
0.0002335645694984123,
0.11685799062252045,
0.04705500230193138,
0.06529900431632996,
0.05011846870183945,
-0.024679459631443024,
-0.021368764340877533,
0.04425233602523804,
0.0022735819220542908,
-0.08854059875011444,
-0.13466966152191162,
0.005609448533505201,
-0.037578776478767395,
0.10424862056970596,
-0.13135121762752533,
0.02738886885344982,
0.005026181694120169,
0.08754251897335052,
0.045612405985593796,
-0.014096681959927082,
-0.006232359912246466,
-0.02827419713139534,
-0.01936284452676773,
-0.03530982509255409,
0.04394608736038208,
0.014581895433366299,
-0.1403329223394394,
0.09957076609134674,
-0.21624836325645447,
-0.12120380997657776,
0.08098728954792023,
0.00019803966279141605,
-0.07024652510881424,
-0.05018763989210129,
-0.02490922063589096,
0.0007127475109882653,
-0.03702375665307045,
-0.06997153908014297,
0.18236812949180603,
0.07301221042871475,
0.1041165292263031,
-0.03735246881842613,
-0.02448136918246746,
-0.05304688215255737,
-0.04424519091844559,
-0.05291672796010971,
0.1061425507068634,
-0.040792979300022125,
-0.17265193164348602,
0.05225497484207153,
0.07398132979869843,
-0.0475381501019001,
0.10527627915143967,
-0.017864864319562912,
-0.06722739338874817,
-0.06284699589014053,
0.037969090044498444,
0.044912319630384445,
0.004582964815199375,
-0.05213242769241333,
0.008353196084499359,
0.05193635821342468,
0.011021165177226067,
0.020000247284770012,
-0.070318304002285,
0.04334893822669983,
0.05637575685977936,
0.002612277865409851,
0.096108578145504,
0.026381218805909157,
-0.0030640815384685993,
0.07609838247299194,
0.004402332007884979,
0.0008151347865350544,
-0.047129902988672256,
-0.04850616678595543,
-0.11701558530330658,
0.16355830430984497,
-0.13686132431030273,
-0.20242571830749512,
-0.15612119436264038,
0.026138193905353546,
-0.047981876879930496,
0.024093708023428917,
0.07908372581005096,
-0.05933545157313347,
-0.07529275119304657,
-0.08116406947374344,
0.06700489670038223,
0.09551993757486343,
-0.04724179208278656,
-0.002463491866365075,
0.040958404541015625,
0.022405024617910385,
-0.12341245263814926,
-0.0070977103896439075,
0.0020309926476329565,
-0.07807257026433945,
-0.0063135805539786816,
-0.03745071962475777,
0.0636553168296814,
0.12035107612609863,
0.05914130434393883,
-0.01468619704246521,
-0.011794346384704113,
0.21301740407943726,
-0.08974026143550873,
0.06777152419090271,
0.16796302795410156,
0.0008500058320350945,
0.0649314820766449,
0.11188244819641113,
0.017037831246852875,
-0.05909763649106026,
0.057781025767326355,
0.06497187912464142,
-0.011534509249031544,
-0.15642531216144562,
-0.11472965031862259,
-0.06543606519699097,
0.00009700332884676754,
0.11736233532428741,
0.041568875312805176,
0.03893129527568817,
0.03639592230319977,
-0.0325973741710186,
0.004159290809184313,
0.12239479273557663,
0.11351507157087326,
0.12543843686580658,
-0.024657513946294785,
0.10295946151018143,
-0.05317104980349541,
-0.07868611067533493,
0.059873562306165695,
-0.01967715658247471,
0.14405930042266846,
0.025944046676158905,
0.1419810950756073,
0.07362763583660126,
-0.020445112138986588,
-0.023127591237425804,
0.0794859379529953,
-0.020962096750736237,
0.014752012677490711,
-0.03261721134185791,
-0.09938465058803558,
-0.007847435772418976,
0.0905555710196495,
0.08571673184633255,
-0.027635499835014343,
-0.03839603811502457,
0.0714702233672142,
0.13581296801567078,
0.1358094960451126,
0.0891493633389473,
-0.24359790980815887,
-0.04381119832396507,
0.045653022825717926,
-0.07346576452255249,
-0.062288928776979446,
0.005870221648365259,
0.053356945514678955,
-0.09953482449054718,
0.04595112428069115,
-0.010814093984663486,
0.09372197836637497,
-0.05854078754782677,
0.027291039004921913,
-0.04749169200658798,
0.04258153960108757,
-0.006583216600120068,
0.06448432058095932,
-0.21347634494304657,
0.1050683930516243,
0.039658430963754654,
0.05287837237119675,
-0.04764128476381302,
0.023838184773921967,
0.07691384106874466,
0.039405759423971176,
0.17696355283260345,
-0.03140273690223694,
-0.025853680446743965,
0.00263809016905725,
-0.07435330003499985,
-0.005069015081971884,
0.05883486941456795,
-0.1208084300160408,
0.09049620479345322,
-0.06360652297735214,
-0.0352623276412487,
-0.003868316300213337,
0.038869887590408325,
-0.055408868938684464,
-0.1637595146894455,
0.002125262515619397,
0.01031029038131237,
-0.004681784193962812,
-0.038307853043079376,
-0.006924774497747421,
0.01756501942873001,
0.191395103931427,
-0.09589798003435135,
-0.049517929553985596,
-0.1288330852985382,
-0.026330269873142242,
0.11238877475261688,
-0.08144982904195786,
0.003632747335359454,
0.006363554857671261,
0.15100964903831482,
-0.0542648509144783,
-0.08332476764917374,
0.07845745980739594,
-0.05545589327812195,
-0.06520743668079376,
-0.05266217142343521,
0.11960459500551224,
0.0508776418864727,
0.055893074721097946,
0.048188455402851105,
0.07258835434913635,
-0.01423177495598793,
-0.08686841279268265,
-0.07327980548143387,
0.10952519625425339,
-0.0061746621504426,
0.07681286334991455,
-0.1386645883321762,
-0.06004466116428375,
-0.10551020503044128,
0.05552227422595024,
0.18239982426166534,
0.22550931572914124,
-0.07078588753938675,
0.06089794635772705,
0.148037388920784,
-0.09271671622991562,
-0.23660418391227722,
-0.06448528170585632,
0.009397726505994797,
0.04441101476550102,
0.07415717095136642,
-0.10302296280860901,
0.06800907105207443,
0.04394088312983513,
-0.003843359649181366,
-0.07768295705318451,
-0.25490009784698486,
-0.1471743881702423,
0.12950167059898376,
0.01634962670505047,
-0.04564435034990311,
-0.10590758919715881,
-0.05160331726074219,
-0.06591688841581345,
-0.015429210849106312,
0.07844238728284836,
-0.09786722809076309,
0.09702349454164505,
0.06068083643913269,
0.008489735424518585,
0.057991873472929,
0.004054256249219179,
0.13778217136859894,
0.05605949088931084,
0.04187387600541115,
-0.04285387322306633,
-0.020028073340654373,
0.08645568042993546,
-0.08310952037572861,
0.14035114645957947,
-0.046463847160339355,
0.03739742934703827,
-0.10909423232078552,
-0.02876722440123558,
-0.03536223620176315,
0.025319214910268784,
-0.04914122819900513,
-0.05182567983865738,
-0.019802415743470192,
0.05598613992333412,
0.10701306909322739,
-0.011631803587079048,
0.013354672119021416,
-0.08275513350963593,
0.06985118240118027,
0.15615881979465485,
0.13188861310482025,
0.07644501328468323,
-0.14266546070575714,
0.03183284029364586,
0.0025590481236577034,
0.05075135454535484,
-0.12395826727151871,
0.07562030106782913,
0.07620148360729218,
-0.007831554859876633,
0.15647025406360626,
0.02151595614850521,
-0.08111408352851868,
-0.020546428859233856,
0.05375204235315323,
-0.08702227473258972,
-0.16593754291534424,
-0.04681983217597008,
-0.013658009469509125,
-0.10930874943733215,
-0.05089458078145981,
0.1573687344789505,
-0.023342382162809372,
0.008524361066520214,
0.03775496408343315,
0.042744506150484085,
-0.034880731254816055,
0.10534310340881348,
-0.016944296658039093,
0.049849119037389755,
-0.04469440504908562,
0.09505197405815125,
0.0840831771492958,
-0.0952649638056755,
0.027738692238926888,
0.1267322450876236,
-0.07407639920711517,
-0.0913306400179863,
-0.0372307114303112,
0.11292357742786407,
-0.06285121291875839,
0.02855997160077095,
-0.05583096295595169,
-0.0828276053071022,
0.013749789446592331,
0.0761830136179924,
0.03731600195169449,
0.06442717462778091,
-0.08799967914819717,
-0.008657128550112247,
-0.09497390687465668,
0.08211352676153183,
0.09140153974294662,
0.016316408291459084,
-0.035507846623659134,
0.09916286915540695,
-0.0266825370490551,
-0.012905796989798546,
-0.03584686666727066,
-0.03527078777551651,
-0.07943589985370636,
-0.01082644984126091,
-0.04003196582198143,
-0.011986586265265942,
-0.09230682998895645,
-0.006154751870781183,
0.03154730051755905,
0.0329773873090744,
-0.010170944035053253,
-0.01312355138361454,
-0.04457862675189972,
-0.06738092750310898,
-0.04121601581573486,
0.10115928947925568,
-0.14373132586479187,
-0.01778244599699974,
0.041173163801431656,
-0.10609489679336548,
0.08015836030244827,
-0.008490451611578465,
-0.028740491718053818,
0.0406600721180439,
-0.032071780413389206,
-0.04434344172477722,
0.030972659587860107,
0.035955190658569336,
0.0545600987970829,
-0.1067667230963707,
0.016834735870361328,
-0.05409726873040199,
0.015528928488492966,
0.012037456035614014,
0.021268820390105247,
-0.08315832912921906,
0.019907601177692413,
-0.02972934953868389,
-0.027320344001054764,
-0.09818895161151886,
0.017821453511714935,
0.039180394262075424,
0.03172500431537628,
0.15629138052463531,
-0.06618576496839523,
0.061928778886795044,
-0.14435943961143494,
0.00856958981603384,
0.005684705916792154,
-0.053827930241823196,
0.08923831582069397,
-0.10965843498706818,
0.058340009301900864,
-0.04494757950305939,
0.05665641278028488,
-0.023818397894501686,
0.05741206556558609,
0.07332541793584824,
0.03559251129627228,
-0.005349569022655487,
0.030976729467511177,
0.05099240317940712,
0.04570921137928963,
-0.005104337353259325,
-0.06709802150726318,
0.01575397327542305,
0.01696198247373104,
-0.056863933801651,
0.05489314720034599,
0.0828951969742775,
0.0268558356910944,
0.08845210075378418,
0.08439335227012634,
-0.011518734507262707,
-0.08033540099859238,
0.018864765763282776,
-0.026749638840556145,
0.0540829636156559,
-0.039772506803274155,
0.043657634407281876,
0.16687963902950287,
-0.15825097262859344,
0.10885846614837646,
-0.003966029733419418,
-0.06160314381122589,
-0.08930661529302597,
-0.11907179653644562,
-0.07390495389699936,
-0.028989726677536964,
-0.01854987069964409,
-0.12751443684101105,
-0.010780111886560917,
-0.02046024613082409,
0.011990761384367943,
-0.0020441333763301373,
0.131559818983078,
-0.12503394484519958,
-0.093376524746418,
0.08273472636938095,
-0.01904457062482834,
0.05444880574941635,
0.017840219661593437,
0.016376037150621414,
0.012935789301991463,
0.06458700448274612,
0.0350317507982254,
0.04615107178688049,
0.048184800893068314,
0.03729676082730293,
-0.09123317152261734,
-0.07797972112894058,
0.004475834779441357,
-0.009711271151900291,
-0.054740823805332184,
0.06677733361721039,
0.047575708478689194,
-0.08557570725679398,
-0.0033187984954565763,
0.24199023842811584,
-0.09959582984447479,
-0.11334960162639618,
-0.18795953691005707,
0.19564437866210938,
0.033631764352321625,
0.043697379529476166,
-0.02328728511929512,
-0.09246295690536499,
-0.015938710421323776,
0.13276872038841248,
0.1740609109401703,
-0.08781656622886658,
0.017923690378665924,
0.034182995557785034,
0.01835460774600506,
-0.0034386832267045975,
0.02333133853971958,
0.057303573936223984,
0.19452202320098877,
-0.044861096888780594,
0.09221634268760681,
-0.013579270802438259,
-0.04979146271944046,
-0.06516878306865692,
0.09440719336271286,
0.024465253576636314,
0.029179081320762634,
-0.01016614492982626,
0.12026365101337433,
-0.03446601331233978,
-0.0912558063864708,
-0.04103395715355873,
-0.08871123939752579,
-0.1187572106719017,
-0.03600756451487541,
0.03476500138640404,
0.020855864509940147,
0.09881311655044556,
0.03315285965800285,
-0.033708635717630386,
0.11470349133014679,
-0.01032562367618084,
-0.05538012087345123,
-0.0157189778983593,
0.025216765701770782,
-0.03248918429017067,
0.16440989077091217,
-0.0009896645788103342,
-0.025310102850198746,
0.12401502579450607,
-0.005170061718672514,
-0.056237414479255676,
0.08284918963909149,
0.04792475327849388,
-0.07920178025960922,
0.11619944870471954,
0.07330974191427231,
-0.017887761816382408,
0.09573409706354141,
0.08495849370956421,
-0.19623985886573792,
0.06628475338220596,
-0.042147401720285416,
-0.045919764786958694,
-0.06240082532167435,
0.06505049020051956,
-0.07973908632993698,
0.12739630043506622,
0.1753937304019928,
-0.024340135976672173,
0.007937926799058914,
-0.007603991311043501,
-0.005183283239603043,
0.03716641664505005,
0.07431482523679733,
-0.033356089144945145,
-0.09325932711362839,
-0.001266028848476708,
-0.0287096556276083,
0.01966506615281105,
-0.2816868722438812,
-0.12744250893592834,
0.016936367377638817,
-0.015169553458690643,
-0.0399395227432251,
0.1382201761007309,
0.09964381158351898,
-0.004854604601860046,
-0.03189346194267273,
-0.20500341057777405,
0.034746840596199036,
0.11198236048221588,
-0.11505858600139618,
-0.08008871972560883
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2159
- Accuracy: 0.9265
- F1: 0.9264
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the illustrative sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
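As a rough illustration only (the training script is not part of this card), the hyperparameters above map onto a Hugging Face `Trainer` setup along these lines; the `compute_metrics` helper and the weighted F1 average are assumptions, while the dataset, model, and listed hyperparameters come from the card:

```python
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

emotions = load_dataset("emotion")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6)  # the emotion dataset has 6 labels

def tokenize(batch):
    return tokenizer(batch["text"], padding=True, truncation=True)

emotions_encoded = emotions.map(tokenize, batched=True)

def compute_metrics(pred):
    # Weighted F1 is a common choice here; the card does not specify the averaging
    labels = pred.label_ids
    preds = pred.predictions.argmax(-1)
    return {"accuracy": accuracy_score(labels, preds),
            "f1": f1_score(labels, preds, average="weighted")}

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-emotion",
    learning_rate=2e-5,               # matches the card
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=2,
    evaluation_strategy="epoch",
    seed=42,
)

trainer = Trainer(model=model, args=args, compute_metrics=compute_metrics,
                  train_dataset=emotions_encoded["train"],
                  eval_dataset=emotions_encoded["validation"],
                  tokenizer=tokenizer)
trainer.train()
```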
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8539 | 1.0 | 250 | 0.3270 | 0.909 | 0.9070 |
| 0.2595 | 2.0 | 500 | 0.2159 | 0.9265 | 0.9264 |
### Framework versions
- Transformers 4.32.1
- Pytorch 2.2.0
- Datasets 2.12.0
- Tokenizers 0.13.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.9265, "name": "Accuracy"}, {"type": "f1", "value": 0.9264499819187068, "name": "F1"}]}]}]} | text-classification | Lionel79/distilbert-base-uncased-finetuned-emotion | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:51:13+00:00 | [] | [] | TAGS
#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-emotion
=========================================
This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2159
* Accuracy: 0.9265
* F1: 0.9264
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.32.1
* Pytorch 2.2.0
* Datasets 2.12.0
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0\n* Datasets 2.12.0\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0\n* Datasets 2.12.0\n* Tokenizers 0.13.3"
] | [
77,
98,
4,
30
] | [
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0\n* Datasets 2.12.0\n* Tokenizers 0.13.3"
] | [
-0.10786963999271393,
0.15385624766349792,
-0.0021172550041228533,
0.14879444241523743,
0.15497882664203644,
0.03871756047010422,
0.10486365109682083,
0.13311579823493958,
-0.05349451303482056,
0.02133587747812271,
0.11341053992509842,
0.13712860643863678,
0.02587086521089077,
0.12171731144189835,
-0.06745874136686325,
-0.26776888966560364,
-0.010380463674664497,
0.05123697593808174,
0.01660560630261898,
0.1376214325428009,
0.10506100952625275,
-0.10366459935903549,
0.10870903730392456,
-0.008016177453100681,
-0.15645956993103027,
-0.0031520396005362272,
0.004319741390645504,
-0.027633462101221085,
0.12966777384281158,
0.01503471378237009,
0.0767735093832016,
0.010040266439318657,
0.08390124142169952,
-0.21069616079330444,
0.014158341102302074,
0.03086206503212452,
0.007001637946814299,
0.08634278178215027,
0.028238655999302864,
-0.021515263244509697,
0.11588914692401886,
-0.0743112713098526,
0.04386403039097786,
0.023437300696969032,
-0.12630993127822876,
-0.24906271696090698,
-0.07869331538677216,
0.0585213266313076,
0.07085756212472916,
0.1120038703083992,
-0.02735213004052639,
0.12825465202331543,
-0.0694635808467865,
0.09030162543058395,
0.19005261361598969,
-0.23530013859272003,
-0.0694461464881897,
0.013451436534523964,
0.009136711247265339,
0.06483641266822815,
-0.11628562211990356,
-0.048399053514003754,
0.04231684282422066,
0.051127754151821136,
0.12657363712787628,
-0.03743881359696388,
-0.046095870435237885,
0.00030723391682840884,
-0.11711524426937103,
-0.05147048458456993,
0.19314998388290405,
0.08369556814432144,
-0.03601032868027687,
-0.05802260711789131,
-0.05657770112156868,
-0.13323140144348145,
-0.027623312547802925,
0.0007005709339864552,
0.053916532546281815,
-0.003039877861738205,
-0.07172446697950363,
-0.008315780200064182,
-0.10894648730754852,
-0.02658843994140625,
-0.03472846746444702,
0.10697606205940247,
-0.0029974994249641895,
0.004049208946526051,
0.006906111724674702,
0.10167716443538666,
-0.01626761257648468,
-0.14908285439014435,
0.02324669435620308,
0.002214213367551565,
0.0340772420167923,
-0.029645200818777084,
-0.07065534591674805,
-0.029934585094451904,
-0.021509533748030663,
0.10532593727111816,
-0.04705158248543739,
0.04410700872540474,
0.02295246161520481,
0.027009477838873863,
-0.056076619774103165,
0.20868274569511414,
-0.03170190379023552,
-0.0677049309015274,
-0.009141860529780388,
0.10192139446735382,
0.041313089430332184,
-0.006180733907967806,
-0.13302679359912872,
0.029904618859291077,
0.11179554462432861,
0.001985499868169427,
-0.07733079046010971,
0.07578252255916595,
-0.09046613425016403,
-0.04665973037481308,
0.01143798790872097,
-0.06938638538122177,
0.01685219444334507,
0.008251652121543884,
-0.07670505344867706,
-0.012752982787787914,
0.01140595506876707,
0.023469852283596992,
-0.004268206190317869,
0.06145339086651802,
-0.08515611290931702,
0.015881935134530067,
-0.07276295125484467,
-0.09544269740581512,
0.015877844765782356,
-0.06803330779075623,
0.03855748474597931,
-0.09931475669145584,
-0.22732017934322357,
-0.021706264466047287,
0.07454996556043625,
-0.03272264078259468,
-0.07533112168312073,
-0.07967877388000488,
-0.04481687769293785,
0.014545499347150326,
-0.004798811860382557,
0.06550635397434235,
-0.07066573947668076,
0.09317109733819962,
0.038843341171741486,
0.07197543978691101,
-0.048377860337495804,
0.05357969179749489,
-0.13953092694282532,
0.020418070256710052,
-0.13557958602905273,
0.07282153517007828,
-0.03646650165319443,
0.09583524614572525,
-0.08206088840961456,
-0.10972724109888077,
0.03465336188673973,
-0.028262197971343994,
0.05818522348999977,
0.12279657274484634,
-0.17432630062103271,
-0.08782128244638443,
0.13823825120925903,
-0.052340079098939896,
-0.12093514204025269,
0.13676734268665314,
-0.060150470584630966,
0.0886857658624649,
0.07968563586473465,
0.21959231793880463,
0.05568988248705864,
-0.03196915239095688,
0.0018568056402727962,
0.0159639660269022,
0.08128275722265244,
-0.013510175049304962,
0.08487745374441147,
0.018560873344540596,
0.0004087289562448859,
0.03383069112896919,
-0.044584840536117554,
0.07708292454481125,
-0.08079864084720612,
-0.1045478880405426,
-0.024796001613140106,
-0.11254564672708511,
0.07369744032621384,
0.08355197310447693,
0.05909111723303795,
-0.10556068271398544,
-0.07473999261856079,
0.021028844639658928,
0.10309426486492157,
-0.06365621089935303,
0.024345271289348602,
-0.060679078102111816,
0.06783180683851242,
-0.01740766316652298,
-0.015799732878804207,
-0.16618779301643372,
0.024300534278154373,
0.017816221341490746,
0.042625222355127335,
0.010407701134681702,
0.00011563625594135374,
0.06381502747535706,
0.04261792451143265,
-0.07529566437005997,
-0.043570149689912796,
-0.04073748365044594,
0.004743334837257862,
-0.10646118968725204,
-0.1999274045228958,
-0.03185970336198807,
-0.019267039373517036,
0.20398740470409393,
-0.2075328230857849,
0.042987287044525146,
-0.021289508789777756,
0.06045426428318024,
0.019514940679073334,
-0.03104247897863388,
-0.021985523402690887,
0.051196806132793427,
-0.04166296124458313,
-0.057085368782281876,
0.08877991884946823,
0.019347364082932472,
-0.11053329706192017,
-0.037648383527994156,
-0.11573468148708344,
0.13828586041927338,
0.11453717947006226,
-0.07724911719560623,
-0.05096052587032318,
-0.011666364967823029,
-0.05409456044435501,
-0.01284603402018547,
-0.00891244225203991,
0.019728267565369606,
0.17708267271518707,
0.0016890851547941566,
0.15376748144626617,
-0.07042228430509567,
-0.01956043392419815,
0.014058439992368221,
-0.04205281287431717,
-0.003639749949797988,
0.13096700608730316,
0.05806568264961243,
-0.12827558815479279,
0.15043175220489502,
0.1771499067544937,
-0.06474464386701584,
0.15126283466815948,
-0.02910444140434265,
-0.04259314388036728,
-0.04657718539237976,
-0.054196350276470184,
-0.023380381986498833,
0.09307828545570374,
-0.15243540704250336,
-0.0000954354036366567,
0.02284187637269497,
0.007990931160748005,
-0.008669360540807247,
-0.19372764229774475,
-0.0517781525850296,
0.052121929824352264,
-0.04010524973273277,
-0.021131515502929688,
-0.0048640454187989235,
-0.009294191375374794,
0.09511858224868774,
0.020457295700907707,
-0.07483871281147003,
0.04313094541430473,
-0.0030817664228379726,
-0.07910101115703583,
0.1883542239665985,
-0.10129839926958084,
-0.18302664160728455,
-0.1181315928697586,
-0.07933665812015533,
-0.07764486223459244,
0.025836873799562454,
0.06696654111146927,
-0.09378093481063843,
-0.012591363862156868,
-0.09525296092033386,
0.016126280650496483,
-0.0023432495072484016,
-0.0025161891244351864,
0.04718885198235512,
-0.026726072654128075,
0.07217293977737427,
-0.09398370236158371,
-0.025100160390138626,
-0.025253277271986008,
-0.029557855799794197,
0.046041641384363174,
-0.00563657796010375,
0.10954222083091736,
0.15280334651470184,
0.009650804102420807,
0.004976233001798391,
-0.036011483520269394,
0.2708383798599243,
-0.06987486034631729,
-0.01364441029727459,
0.1519523561000824,
-0.01672094501554966,
0.06634510308504105,
0.1350424438714981,
0.05598163604736328,
-0.10171249508857727,
0.01836135797202587,
0.036192506551742554,
-0.02606002613902092,
-0.1854025274515152,
-0.0403282456099987,
-0.04853285104036331,
0.021921204403042793,
0.08338285237550735,
0.013719373382627964,
0.035707131028175354,
0.0712011381983757,
0.02835322543978691,
0.04994082450866699,
-0.047142576426267624,
0.0721484124660492,
0.13835684955120087,
0.02264263480901718,
0.1011199802160263,
-0.01633407548069954,
-0.03773894160985947,
0.0665694996714592,
-0.0100485784932971,
0.1644955724477768,
-0.0033370445016771555,
0.1536683887243271,
0.03480406105518341,
0.1709873527288437,
-0.04877491295337677,
0.04916338995099068,
-0.011701179668307304,
-0.021876150742173195,
-0.05096090957522392,
-0.023732896894216537,
-0.08058983087539673,
0.03817359358072281,
-0.039158035069704056,
0.0937698632478714,
-0.14288777112960815,
-0.026727553457021713,
0.066497802734375,
0.2920115292072296,
0.04978232458233833,
-0.32787638902664185,
-0.129535511136055,
0.029916271567344666,
-0.034566882997751236,
-0.017653178423643112,
0.008557885885238647,
0.07309021800756454,
-0.10060042887926102,
0.04204085096716881,
-0.0583312101662159,
0.08823873847723007,
-0.0666181892156601,
0.07567090541124344,
0.03307931870222092,
0.05895235762000084,
0.008988069370388985,
0.0879354178905487,
-0.25621652603149414,
0.25364217162132263,
-0.0023151040077209473,
0.04304833710193634,
-0.07093948870897293,
-0.013211854733526707,
0.06963794678449631,
0.08781445026397705,
0.06087224557995796,
0.008407032117247581,
0.02440677024424076,
-0.20552709698677063,
-0.03405110910534859,
0.02877647429704666,
0.05180564150214195,
-0.07505659013986588,
0.08742415904998779,
-0.03503333404660225,
0.014607190154492855,
0.06582649052143097,
0.05498434603214264,
-0.054277949035167694,
-0.0917537659406662,
0.003441758453845978,
0.03943857178092003,
-0.0020399014465510845,
-0.05066829174757004,
-0.11161503940820694,
-0.0701228529214859,
0.14675728976726532,
0.03441286459565163,
-0.04499237611889839,
-0.10211523622274399,
0.06856408715248108,
0.043490853160619736,
-0.08893720805644989,
0.0048354738391935825,
0.001775942975655198,
0.09222470223903656,
0.029126212000846863,
-0.06612056493759155,
0.10448334366083145,
-0.061954252421855927,
-0.17157432436943054,
-0.051985692232847214,
0.11155852675437927,
0.04568316042423248,
0.07429338991641998,
-0.00023545653675682843,
-0.007771698758006096,
-0.057529617100954056,
-0.07093857228755951,
0.04301084578037262,
0.029090894386172295,
0.05384063348174095,
0.022311879321932793,
-0.024379659444093704,
0.033373136073350906,
-0.08843714743852615,
-0.030589943751692772,
0.18241235613822937,
0.27556827664375305,
-0.07385875284671783,
0.028555508702993393,
0.04760949686169624,
-0.06276819854974747,
-0.1705576628446579,
0.03542859107255936,
0.05585474893450737,
0.008488197810947895,
0.06377178430557251,
-0.18472297489643097,
0.08477172255516052,
0.06754208356142044,
-0.017284046858549118,
0.07477312535047531,
-0.27057981491088867,
-0.1067764163017273,
0.12274671345949173,
0.12082284688949585,
0.12997005879878998,
-0.1411570906639099,
-0.01181687880307436,
-0.05322002246975899,
-0.12868112325668335,
0.11742711067199707,
-0.06398417800664902,
0.12193787097930908,
-0.012242830358445644,
0.1363881379365921,
0.015997249633073807,
-0.02439885586500168,
0.14685390889644623,
0.016700686886906624,
0.08788302540779114,
-0.07674893736839294,
-0.02201853133738041,
0.02474316768348217,
-0.055021192878484726,
0.023542707785964012,
-0.10705158114433289,
0.03249339386820793,
-0.15318086743354797,
-0.0303424634039402,
-0.10143056511878967,
0.014021016657352448,
-0.03180517628788948,
-0.0817965641617775,
-0.037057843059301376,
0.058123476803302765,
0.09478578716516495,
-0.00010560039663687348,
0.08395745605230331,
0.00866024848073721,
0.10492681711912155,
0.11898066848516464,
0.0969957709312439,
-0.054358113557100296,
-0.05119810998439789,
-0.033592693507671356,
-0.021984046325087547,
0.04354695975780487,
-0.16003088653087616,
0.028302541002631187,
0.1269611269235611,
0.006329052150249481,
0.17715811729431152,
0.06922376155853271,
-0.031028125435113907,
0.01623346656560898,
0.05172966793179512,
-0.1542404145002365,
-0.08952361345291138,
-0.04583325609564781,
-0.053811438381671906,
-0.1575137972831726,
-0.002131700050085783,
0.10412297397851944,
-0.06386931985616684,
-0.009223876520991325,
-0.01175718754529953,
0.023311028257012367,
-0.04269681125879288,
0.142645001411438,
0.04049009084701538,
0.02570192515850067,
-0.1001073494553566,
0.08803397417068481,
0.03450384736061096,
-0.08673682063817978,
0.02643882855772972,
0.05259567126631737,
-0.0807848647236824,
-0.05755642056465149,
0.032712843269109726,
0.19467392563819885,
-0.07157408446073532,
-0.02765222080051899,
-0.142852321267128,
-0.10719112306833267,
0.07149970531463623,
0.09723677486181259,
0.11434870213270187,
0.003471504431217909,
-0.08256813883781433,
0.009618504904210567,
-0.09625250101089478,
0.08714092522859573,
0.07442023605108261,
0.03834052011370659,
-0.13954944908618927,
0.09502808749675751,
-0.016581369563937187,
0.047127898782491684,
-0.016249150037765503,
0.0029920798260718584,
-0.08929934352636337,
-0.0015883194282650948,
-0.1425708830356598,
-0.03177447244524956,
-0.03406258299946785,
0.0241534560918808,
-0.004936247132718563,
-0.057493358850479126,
-0.037479959428310394,
-0.001998688792809844,
-0.1167297214269638,
-0.031069768592715263,
0.04168209061026573,
0.07661912590265274,
-0.10005468130111694,
-0.05335061997175217,
0.03306620195508003,
-0.06847581267356873,
0.09753039479255676,
0.045348845422267914,
0.01122125331312418,
0.04831235855817795,
-0.1451757252216339,
0.02026209980249405,
0.056299589574337006,
0.007551161572337151,
0.04496411606669426,
-0.08189159631729126,
-0.024827033281326294,
-0.014706923626363277,
0.03177018091082573,
0.020965229719877243,
0.1179559975862503,
-0.11729274690151215,
0.010628129355609417,
0.023817678913474083,
-0.06599261611700058,
-0.07297675311565399,
0.03534078970551491,
0.06473889946937561,
0.0369071364402771,
0.20862217247486115,
-0.06968249380588531,
0.04032745957374573,
-0.2128250002861023,
0.0000053046014727442525,
-0.0016034153522923589,
-0.11328635364770889,
-0.13129249215126038,
-0.08807288110256195,
0.04873386770486832,
-0.04586807265877724,
0.10197573155164719,
0.031951382756233215,
0.04899057373404503,
0.013159841299057007,
0.009640460833907127,
0.0521463006734848,
-0.008259502239525318,
0.19278447329998016,
0.02916562929749489,
-0.05457959324121475,
0.07926923781633377,
0.05041637271642685,
0.10819898545742035,
0.15195481479167938,
0.1654563993215561,
0.14374446868896484,
0.031194260343909264,
0.0866158977150917,
0.008737652562558651,
-0.006209502927958965,
-0.1727370023727417,
0.0050940741784870625,
-0.014032453298568726,
0.11915533989667892,
-0.017813390120863914,
0.2548348605632782,
0.04076025262475014,
-0.17259834706783295,
0.06304483860731125,
-0.07618582248687744,
-0.08040673285722733,
-0.07887792587280273,
-0.093267522752285,
-0.07721094787120819,
-0.1466258019208908,
-0.0068125431425869465,
-0.13252782821655273,
0.012929178774356842,
0.08810996264219284,
-0.010138685815036297,
-0.05222697928547859,
0.10930579155683517,
-0.01951885223388672,
-0.001364486524835229,
0.10366425663232803,
-0.009682134725153446,
-0.07086294144392014,
-0.10235605388879776,
-0.06878915429115295,
-0.0024297558702528477,
0.0034828234929591417,
0.046687282621860504,
-0.050282251089811325,
-0.06439366191625595,
0.005809057038277388,
-0.029671015217900276,
-0.11802200227975845,
0.005215846933424473,
0.017510011792182922,
0.06415820121765137,
0.054682668298482895,
-0.00024214820587076247,
0.01751777157187462,
0.007761599496006966,
0.2306690812110901,
-0.07819679379463196,
-0.01545387040823698,
-0.10931369662284851,
0.2306901216506958,
-0.002836816944181919,
-0.03202522546052933,
0.02835112251341343,
-0.08619765192270279,
0.006211056374013424,
0.23591099679470062,
0.18840987980365753,
-0.11888208985328674,
-0.007020673248916864,
-0.0189321581274271,
0.006210660561919212,
-0.04096046835184097,
0.09307718276977539,
0.13052819669246674,
-0.05554452911019325,
-0.105021171271801,
0.009870709851384163,
-0.06466060876846313,
-0.025669677183032036,
-0.017150579020380974,
0.058389514684677124,
0.049155667424201965,
0.011040478944778442,
-0.04984201490879059,
0.06277304887771606,
-0.08318071812391281,
-0.08547347784042358,
0.04535643011331558,
-0.21071724593639374,
-0.154916450381279,
-0.03207647055387497,
0.07124355435371399,
0.04278791323304176,
0.07237530499696732,
-0.015994876623153687,
0.023697273805737495,
0.11486302316188812,
-0.027601080015301704,
-0.08715210855007172,
-0.08913209289312363,
0.11688287556171417,
-0.1185763031244278,
0.1976739764213562,
-0.06173856183886528,
0.02690866030752659,
0.11833703517913818,
0.05785610154271126,
-0.06209227815270424,
0.07626096904277802,
0.05384266749024391,
-0.02331363596022129,
0.0178146343678236,
0.12157309055328369,
-0.030445005744695663,
0.11203291267156601,
0.04850682616233826,
-0.15407238900661469,
0.00034062599297612906,
-0.018847834318876266,
-0.0615813322365284,
-0.04122871533036232,
-0.021832125261425972,
-0.06147240102291107,
0.12933792173862457,
0.2127343863248825,
-0.0481329970061779,
-0.020160071551799774,
-0.06551926583051682,
0.0029647729825228453,
0.058545783162117004,
0.004429569002240896,
-0.06366611272096634,
-0.19644784927368164,
-0.0049848672933876514,
0.07994397729635239,
-0.0033324810210615396,
-0.26326531171798706,
-0.09645535796880722,
-0.007121717091649771,
-0.05200618505477905,
-0.06899163872003555,
0.09506385773420334,
0.03484056517481804,
0.03661433234810829,
-0.04664323478937149,
-0.059479907155036926,
-0.06750879436731339,
0.16873109340667725,
-0.13106505572795868,
-0.08160562813282013
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
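The section above is left as a placeholder. Purely as a hedged sketch (not the authors' documented usage), an adapter published in this repository would normally be attached to the base model named in the card metadata like this; the prompt is a made-up example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "abhishek/llama-2-7b-hf-small-shards"   # base model from the card metadata
adapter_id = "SudiptoPramanik/Llama_afterRLHF"    # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto")

# Attach the PEFT adapter weights to the frozen base model
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Question: What is reinforcement learning from human feedback?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```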
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "abhishek/llama-2-7b-hf-small-shards"} | null | SudiptoPramanik/Llama_afterRLHF | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:abhishek/llama-2-7b-hf-small-shards",
"region:us"
] | 2024-02-12T11:51:13+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-abhishek/llama-2-7b-hf-small-shards #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-abhishek/llama-2-7b-hf-small-shards #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
47,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-abhishek/llama-2-7b-hf-small-shards #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.10699347406625748,
0.1971227377653122,
-0.003583307145163417,
0.024544263258576393,
0.07635419070720673,
0.020235417410731316,
0.07336179167032242,
0.1257624328136444,
0.02395547926425934,
0.12760676443576813,
0.05164535716176033,
0.10284461826086044,
0.1184501051902771,
0.20987476408481598,
-0.011586859822273254,
-0.1837567836046219,
0.023216333240270615,
-0.06825324892997742,
0.012838215567171574,
0.1236560046672821,
0.13674558699131012,
-0.09517788141965866,
0.0742860808968544,
-0.020621996372938156,
-0.006409227382391691,
-0.02964291162788868,
-0.06618174910545349,
-0.014642400667071342,
0.04959871992468834,
0.04011470451951027,
0.05427331104874611,
-0.0007050324347801507,
0.09231961518526077,
-0.2698877155780792,
0.012657910585403442,
0.0488964319229126,
-0.0077476417645812035,
0.08430449664592743,
0.09796309471130371,
-0.044352639466524124,
0.11223210394382477,
-0.04441770166158676,
0.1343943178653717,
0.07890889048576355,
-0.09981678426265717,
-0.22156867384910583,
-0.06683343648910522,
0.07883855700492859,
0.1861802488565445,
0.062083832919597626,
-0.03839198872447014,
0.12056545168161392,
-0.06991444528102875,
0.01541983149945736,
0.08678808808326721,
-0.10728004574775696,
-0.06306198239326477,
0.08447086811065674,
0.12202158570289612,
0.0869792252779007,
-0.11815701425075531,
-0.03623269870877266,
0.03171732649207115,
0.04267009347677231,
0.08279767632484436,
0.012394574470818043,
0.17877650260925293,
0.03932635858654976,
-0.13975664973258972,
-0.05130520462989807,
0.1109873354434967,
0.010414454154670238,
-0.03627357259392738,
-0.22283034026622772,
-0.01827269420027733,
-0.08598846197128296,
-0.03783434256911278,
-0.053248390555381775,
0.03684910014271736,
0.007758451160043478,
0.10966116189956665,
-0.041317541152238846,
-0.07912120223045349,
-0.016949554905295372,
0.11304212361574173,
0.06928108632564545,
0.013114630244672298,
-0.01894671842455864,
0.0023737659212201834,
0.12741796672344208,
0.05850732699036598,
-0.1293746680021286,
-0.05273188278079033,
-0.06183089688420296,
-0.032115574926137924,
-0.019647523760795593,
0.05221475660800934,
0.02657434716820717,
0.04376090690493584,
0.25717324018478394,
-0.018542109057307243,
0.061512865126132965,
0.050300922244787216,
0.012497742660343647,
0.02870885282754898,
0.10214105993509293,
-0.04010435938835144,
-0.195006862282753,
-0.010903414338827133,
0.10465381294488907,
0.008602630347013474,
-0.028525955975055695,
-0.04961594194173813,
0.0243409164249897,
0.03495613858103752,
0.1190592497587204,
0.10147359222173691,
-0.023673931136727333,
-0.06292092800140381,
-0.060509711503982544,
0.21188734471797943,
-0.15716682374477386,
0.05172860994935036,
0.026346126571297646,
-0.008332066237926483,
-0.06606714427471161,
0.015129856765270233,
0.01274073775857687,
-0.03832026571035385,
0.11036225408315659,
-0.0609382800757885,
-0.0472223274409771,
-0.11608365178108215,
-0.045180078595876694,
0.03236669301986694,
-0.010381942614912987,
-0.04850010201334953,
-0.028497161343693733,
-0.08790459483861923,
-0.09626614302396774,
0.0981471911072731,
-0.05558031424880028,
-0.05750330165028572,
-0.023148728534579277,
-0.06426914036273956,
0.02663780003786087,
0.02426796220242977,
0.06023980304598808,
-0.028073471039533615,
0.04176730290055275,
-0.029233820736408234,
0.07016145437955856,
0.08655054867267609,
0.03624721243977547,
-0.07439612597227097,
0.07048047333955765,
-0.18390044569969177,
0.08384464681148529,
-0.06330139189958572,
0.028704959899187088,
-0.1604757308959961,
0.0008641813765279949,
0.000858556421007961,
0.02307170256972313,
0.04658922180533409,
0.15425190329551697,
-0.19318126142024994,
-0.033398572355508804,
0.17197148501873016,
-0.10080040246248245,
-0.1122516542673111,
0.0376737117767334,
-0.04523179307579994,
0.1636926382780075,
0.03823866695165634,
0.009596826508641243,
0.09352289885282516,
-0.15142850577831268,
-0.012126720510423183,
-0.032214242964982986,
0.018481090664863586,
0.06540505588054657,
0.0724090114235878,
-0.0801382064819336,
0.004011440556496382,
0.010885223746299744,
-0.05428250506520271,
-0.022097747772932053,
-0.03675760328769684,
-0.09783302992582321,
0.004062614403665066,
-0.08073797076940536,
0.010093307122588158,
0.004141668323427439,
-0.08483864367008209,
-0.014521435834467411,
-0.1403089165687561,
-0.016513125970959663,
0.07492953538894653,
0.006066426634788513,
-0.011766308918595314,
-0.077457495033741,
0.03123028576374054,
-0.052109312266111374,
-0.013744068332016468,
-0.1444820612668991,
-0.009579253382980824,
0.025940893217921257,
-0.15662381052970886,
0.008185234852135181,
-0.1315445899963379,
0.07076448947191238,
0.014909021556377411,
-0.06265942752361298,
-0.03554891422390938,
0.022541819140315056,
-0.00812786165624857,
-0.06669461727142334,
-0.2229660153388977,
-0.034530654549598694,
-0.046577490866184235,
0.1284744143486023,
-0.21768152713775635,
0.049146588891744614,
0.005905713886022568,
0.11993034929037094,
0.00970812700688839,
-0.06187696382403374,
0.024081360548734665,
-0.06256313621997833,
-0.022238943725824356,
-0.06936368346214294,
-0.010271318256855011,
-0.0010852667037397623,
-0.03008008375763893,
0.026824045926332474,
-0.1517122983932495,
-0.05488169193267822,
0.09042689949274063,
0.09196798503398895,
-0.13659480214118958,
0.004888962954282761,
-0.03786727786064148,
-0.06262458115816116,
-0.08200738579034805,
-0.07621298730373383,
0.06317068636417389,
0.047225069254636765,
0.051110975444316864,
-0.08825308084487915,
-0.07311037927865982,
-0.0005547718610614538,
-0.01647217571735382,
-0.024021800607442856,
0.12094622850418091,
0.07286685705184937,
-0.09482403099536896,
0.09619268029928207,
0.07993049174547195,
0.040188584476709366,
0.09983401745557785,
-0.006383558269590139,
-0.09686554223299026,
-0.03377996385097504,
0.05269542708992958,
0.013857529498636723,
0.15131650865077972,
-0.057154133915901184,
0.04630991071462631,
0.048604581505060196,
-0.03889191150665283,
0.042232248932123184,
-0.09867005050182343,
0.01393104251474142,
0.01091119647026062,
-0.016083067283034325,
0.024716703221201897,
-0.025643637403845787,
0.012800296768546104,
0.09092353284358978,
0.06621462106704712,
0.03387434780597687,
0.016909204423427582,
-0.0417947992682457,
-0.13967853784561157,
0.17289355397224426,
-0.09058249741792679,
-0.22044524550437927,
-0.15456469357013702,
0.02931894361972809,
0.051984500139951706,
-0.014690536074340343,
0.033910397440195084,
-0.0437987819314003,
-0.09740009903907776,
-0.08537320047616959,
0.03353768214583397,
0.051486674696207047,
-0.06674826890230179,
-0.06382626295089722,
0.035320259630680084,
0.02747560478746891,
-0.13649800419807434,
0.024960268288850784,
0.05282491073012352,
0.003955569583922625,
-0.004507279023528099,
0.03299186751246452,
0.08000575006008148,
0.20870190858840942,
-0.0012048265198245645,
0.0024030576460063457,
0.059266701340675354,
0.27994251251220703,
-0.14962585270404816,
0.12574094533920288,
0.11970692127943039,
-0.05579524487257004,
0.08755192160606384,
0.20636200904846191,
0.03928378224372864,
-0.08095680922269821,
0.020490724593400955,
0.038306090980768204,
-0.04005756601691246,
-0.26552459597587585,
-0.05348338931798935,
-0.023825332522392273,
-0.07506044954061508,
0.0825115218758583,
0.08191896229982376,
0.09928261488676071,
0.034811388701200485,
-0.07660829275846481,
-0.07147528231143951,
0.05693404749035835,
0.11530021578073502,
-0.045699335634708405,
0.021867569535970688,
0.08388914912939072,
-0.04707364737987518,
0.0020529800094664097,
0.0888100415468216,
-0.012312786653637886,
0.13886423408985138,
0.04918956011533737,
0.1159273162484169,
0.06998395919799805,
0.07160802185535431,
0.005560335237532854,
0.056816842406988144,
-0.003474916098639369,
0.0305032916367054,
0.014795099385082722,
-0.09407155960798264,
0.026729417964816093,
0.11973105370998383,
0.006128047592937946,
0.0315077006816864,
0.02389327436685562,
-0.07172010093927383,
0.03817487880587578,
0.21053661406040192,
0.028563229367136955,
-0.2022886723279953,
-0.07291050255298615,
0.06759870052337646,
-0.07351738959550858,
-0.14863112568855286,
-0.010235132649540901,
0.01938733644783497,
-0.15944671630859375,
0.019709307700395584,
-0.041707731783390045,
0.11046206206083298,
-0.06526513397693634,
-0.04224035143852234,
0.09332343190908432,
0.0593833290040493,
-0.045153722167015076,
0.04003725200891495,
-0.17527417838573456,
0.1076190173625946,
0.033965740352869034,
0.07103317230939865,
-0.08938317000865936,
0.08650250732898712,
0.004977005068212748,
-0.015346567146480083,
0.15393781661987305,
0.004726273473352194,
-0.06522157043218613,
-0.08024265617132187,
-0.06988941133022308,
-0.017593787983059883,
0.08946067839860916,
-0.14124135673046112,
0.0725068524479866,
-0.018988752737641335,
-0.037213075906038284,
0.001412353478372097,
-0.10247126966714859,
-0.10711991041898727,
-0.16406604647636414,
0.06134098395705223,
-0.0775548592209816,
0.009087552316486835,
-0.08360070735216141,
-0.05033625289797783,
0.017176294699311256,
0.16568967700004578,
-0.18367375433444977,
-0.11813259869813919,
-0.14735034108161926,
-0.11470728367567062,
0.16306579113006592,
-0.050595786422491074,
0.0860830619931221,
-0.008587904274463654,
0.1608627438545227,
-0.013207466341555119,
-0.02747003547847271,
0.08988741040229797,
-0.0884033814072609,
-0.19483958184719086,
-0.050292644649744034,
0.18809963762760162,
0.13542763888835907,
0.029606154188513756,
-0.017969269305467606,
0.027754077687859535,
-0.04867918789386749,
-0.1078491359949112,
0.01773413084447384,
0.13608407974243164,
0.059113383293151855,
-0.0021800033282488585,
-0.03181421756744385,
-0.11991541087627411,
-0.052374012768268585,
-0.038359206169843674,
-0.014029915444552898,
0.20557501912117004,
-0.07351532578468323,
0.16126474738121033,
0.13773149251937866,
-0.05824268236756325,
-0.207164004445076,
0.035298075526952744,
0.028918376192450523,
0.019543716683983803,
0.02825239859521389,
-0.18215224146842957,
0.08249195665121078,
-0.015853941440582275,
-0.07721028476953506,
0.16621133685112,
-0.1943388432264328,
-0.1367349624633789,
0.09281676262617111,
0.018619608134031296,
-0.20687447488307953,
-0.13956449925899506,
-0.11179383099079132,
-0.02479638159275055,
-0.14937305450439453,
0.060378313064575195,
0.018668927252292633,
0.007699999492615461,
0.012134905904531479,
0.015525379218161106,
0.04227226972579956,
-0.05075035244226456,
0.1977524757385254,
-0.0218501053750515,
0.01052519679069519,
-0.05541570857167244,
-0.09953241050243378,
0.010562731884419918,
-0.06785273551940918,
0.11703959107398987,
-0.025299720466136932,
0.024598106741905212,
-0.15247198939323425,
-0.04788878187537193,
-0.06704726070165634,
0.01610306091606617,
-0.09523893892765045,
-0.09148925542831421,
-0.05135367438197136,
0.0771908387541771,
0.10642462223768234,
-0.02709820307791233,
0.02728138118982315,
-0.0833612009882927,
0.08857712149620056,
0.20854216814041138,
0.17252217233181,
0.0428154431283474,
-0.04481231048703194,
0.022708119824528694,
-0.03291280195116997,
0.0417916439473629,
-0.22793839871883392,
0.04840424656867981,
0.062022220343351364,
0.038499753922224045,
0.08592144399881363,
-0.006578431464731693,
-0.1627155840396881,
-0.08227770775556564,
0.08061929047107697,
-0.05432986095547676,
-0.164516419172287,
-0.029017983004450798,
0.0328204520046711,
-0.19620777666568756,
-0.038294583559036255,
0.03501031547784805,
-0.016923129558563232,
-0.038813259452581406,
0.02078322321176529,
0.08643589168787003,
-0.015556852333247662,
0.10236597806215286,
0.08246387541294098,
0.09547659009695053,
-0.10202079266309738,
0.06659121066331863,
0.08575590699911118,
-0.022748500108718872,
0.00828038901090622,
0.14232218265533447,
-0.04721663519740105,
-0.025748105719685555,
0.07952147722244263,
0.10450857132673264,
0.0038585320580750704,
-0.04637661948800087,
0.014981595799326897,
-0.06875118613243103,
0.06565114855766296,
0.12420330196619034,
0.019887294620275497,
-0.012110329233109951,
0.06930259615182877,
0.027000833302736282,
-0.09257467836141586,
0.12686027586460114,
0.07092197239398956,
0.025484124198555946,
-0.022495120763778687,
-0.032836899161338806,
-0.014672204852104187,
-0.007120269816368818,
-0.01606658101081848,
-0.0029035231564193964,
-0.0875977948307991,
-0.002999938791617751,
-0.12824030220508575,
0.01881222426891327,
-0.0860484316945076,
0.003732419805601239,
0.010212101973593235,
-0.045056432485580444,
-0.005548934917896986,
-0.005526408087462187,
-0.07897616922855377,
-0.05740382522344589,
-0.032781168818473816,
0.0744740292429924,
-0.13598650693893433,
0.023424271494150162,
0.07497330754995346,
-0.11312133073806763,
0.06420570611953735,
-0.009431622922420502,
0.011679664254188538,
0.004349694121629,
-0.14308170974254608,
0.05545574054121971,
-0.0226447694003582,
-0.011635751463472843,
0.017502691596746445,
-0.17212380468845367,
-0.006804498843848705,
-0.05037299543619156,
-0.07313927263021469,
0.006437409203499556,
-0.02083471789956093,
-0.12881781160831451,
0.125935897231102,
-0.013599801808595657,
-0.06700551509857178,
-0.017621202394366264,
0.05137757584452629,
0.0749928429722786,
-0.021006669849157333,
0.09092144668102264,
-0.02227432280778885,
0.08135953545570374,
-0.18160021305084229,
-0.01245962642133236,
-0.012562201358377934,
0.033254027366638184,
-0.022921988740563393,
-0.02522880770266056,
0.050981201231479645,
-0.013588383793830872,
0.16283321380615234,
-0.00988541729748249,
0.05819924175739288,
0.04845843464136124,
0.012515101581811905,
0.03407284617424011,
0.06761890649795532,
0.05763082578778267,
-0.025644637644290924,
-0.011524547822773457,
0.03243128955364227,
-0.0071792141534388065,
-0.04577532038092613,
-0.14254331588745117,
0.05909429490566254,
0.18087618052959442,
0.08097733557224274,
0.02932438626885414,
0.012315950356423855,
-0.12867899239063263,
-0.09323500841856003,
0.09093678742647171,
-0.014520907774567604,
-0.027486171573400497,
-0.06759712845087051,
0.20742575824260712,
0.13779567182064056,
-0.19944335520267487,
0.07878630608320236,
-0.04706763103604317,
-0.03584764897823334,
-0.13350442051887512,
-0.1593652367591858,
-0.05861877277493477,
-0.03632631152868271,
-0.03253157436847687,
-0.06311355531215668,
0.057844940572977066,
0.05163907632231712,
0.002178684575483203,
-0.0016993326134979725,
0.10467545688152313,
0.005679375492036343,
-0.02864239178597927,
0.04888579249382019,
0.07234039157629013,
0.04647107794880867,
-0.07961642742156982,
0.011512331664562225,
-0.00018376098887529224,
0.009386717341840267,
0.06049186363816261,
0.023965366184711456,
-0.057554714381694794,
0.02256162092089653,
-0.00836299080401659,
-0.12026534229516983,
0.041432395577430725,
-0.010670728050172329,
-0.020493080839514732,
0.14475546777248383,
0.028639420866966248,
0.004857164807617664,
-0.017802000045776367,
0.22684593498706818,
-0.07779701799154282,
-0.08634472638368607,
-0.12921692430973053,
0.07303082197904587,
-0.05119756609201431,
0.03276696801185608,
0.029197784140706062,
-0.12964490056037903,
0.010032201185822487,
0.15956175327301025,
0.13611631095409393,
0.0008855846244841814,
0.009519358165562153,
0.04324052855372429,
0.008392111398279667,
-0.021933143958449364,
0.017339738085865974,
0.04192319139838219,
0.2069290429353714,
-0.072747603058815,
0.0817645937204361,
-0.01218883041292429,
-0.07534444332122803,
-0.020809708163142204,
0.13590647280216217,
-0.011990283615887165,
-0.009379115886986256,
-0.05743042379617691,
0.13458845019340515,
-0.059213269501924515,
-0.21838124096393585,
0.05993166193366051,
-0.08758814632892609,
-0.13376697897911072,
-0.03769029304385185,
0.00638866750523448,
-0.02785409800708294,
0.013334917835891247,
0.06855727732181549,
-0.05614680424332619,
0.16600708663463593,
0.03080684132874012,
-0.05920194834470749,
-0.0951462984085083,
0.056191202253103256,
-0.1347576528787613,
0.2860422432422638,
0.02786613255739212,
0.0335145965218544,
0.10492386668920517,
-0.022750895470380783,
-0.14059822261333466,
0.01842447929084301,
0.10965978354215622,
-0.06516814976930618,
0.04669521749019623,
0.1617397516965866,
-0.007608379237353802,
0.12335260212421417,
0.05789272114634514,
-0.06586656719446182,
0.031267981976270676,
-0.056328143924474716,
-0.0599135160446167,
-0.12529927492141724,
0.06682249903678894,
-0.07435989379882812,
0.14613938331604004,
0.12364298850297928,
-0.06717362254858017,
-0.004558040294796228,
-0.015941154211759567,
0.08029764145612717,
0.014468277804553509,
0.1276220679283142,
0.02077033556997776,
-0.18411029875278473,
0.048255812376737595,
0.006786162033677101,
0.11166655272245407,
-0.21364440023899078,
-0.0620138943195343,
0.0410505086183548,
-0.025246944278478622,
-0.0855063796043396,
0.12109514325857162,
0.04862869158387184,
0.020647911354899406,
-0.02955223061144352,
-0.0953143760561943,
0.008462884463369846,
0.1554509550333023,
-0.10332811623811722,
-0.014355878345668316
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unspecified dataset (no dataset name was recorded by the trainer).
It achieves the following results on the evaluation set:
- Loss: 0.0807
- Em accuracy: 86.336
## Model description
More information needed
## Intended uses & limitations
More information needed
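No usage example is given in the card. As a hedged sketch only, the checkpoint can presumably be loaded like any `t5-small`-based text-to-text model; the repository id comes from the card metadata, while the T5-style task-prefix input below is an assumption rather than a documented format.

```python
# Hedged sketch: generic text-to-text inference with this checkpoint.
# The task-prefix formatting is assumed (T5/GLUE convention), not documented in the card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "macabdul9/t5-small-glue-10K"  # repository id from the card metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "sst2 sentence: the movie was surprisingly touching and well acted."  # assumed input format
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```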
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
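As a hedged illustration of how the values above map onto the `transformers` API, the sketch below fills a `Seq2SeqTrainingArguments` object with the same hyperparameters; whether the original run used `Seq2SeqTrainer` is an assumption, and the output directory is a placeholder.

```python
# Hedged sketch: the hyperparameters listed above expressed as Seq2SeqTrainingArguments.
# The trainer class and output_dir are assumptions; all numeric values come from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-glue-10K",    # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,     # 128 x 4 = 512 effective train batch size
    num_train_epochs=3.0,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```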
### Training results
### Framework versions
- Transformers 4.35.1
- Pytorch 2.1.2+cu121
- Datasets 2.14.7
- Tokenizers 0.14.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "google-t5/t5-small", "model-index": [{"name": "t5", "results": []}]} | text2text-generation | macabdul9/t5-small-glue-10K | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:52:54+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# t5
This model is a fine-tuned version of google-t5/t5-small on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0807
- Em accuracy: 86.336
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.35.1
- Pytorch 2.1.2+cu121
- Datasets 2.14.7
- Tokenizers 0.14.1
| [
"# t5\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0807\n- Em accuracy: 86.336",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# t5\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0807\n- Em accuracy: 86.336",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
82,
56,
6,
12,
8,
3,
113,
4,
35
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# t5\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0807\n- Em accuracy: 86.336## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
-0.10779975354671478,
0.18635457754135132,
-0.0027445435989648104,
0.08743532747030258,
0.1476438045501709,
0.01454109139740467,
0.10008418560028076,
0.12740759551525116,
-0.06603668630123138,
0.11436251550912857,
0.08091061562299728,
0.04163190349936485,
0.07073431462049484,
0.1802772432565689,
0.018154079094529152,
-0.24993345141410828,
-0.00506226159632206,
-0.025970378890633583,
-0.07320178300142288,
0.10768219083547592,
0.1121673732995987,
-0.07609239965677261,
0.0752883180975914,
0.006017620209604502,
-0.12627725303173065,
0.011793057434260845,
-0.019277481362223625,
-0.0691068172454834,
0.08477088063955307,
0.030762815847992897,
0.030117809772491455,
0.02813248336315155,
0.09627697616815567,
-0.19250233471393585,
-0.002975985174998641,
0.07401641458272934,
0.02267131395637989,
0.096010223031044,
0.07726436853408813,
0.004007445182651281,
0.04816391319036484,
-0.14620636403560638,
0.08377434313297272,
0.05630847066640854,
-0.06222235783934593,
-0.15355439484119415,
-0.06875592470169067,
0.1076088547706604,
0.10180094093084335,
0.08645457774400711,
-0.00039968229248188436,
0.12270724773406982,
-0.023336704820394516,
0.0617552325129509,
0.2073785811662674,
-0.278439998626709,
-0.057467784732580185,
0.0701821818947792,
0.04924079403281212,
0.0694899708032608,
-0.09574238210916519,
0.024521611630916595,
0.04830442741513252,
-0.004499087575823069,
0.1026245653629303,
0.012361875735223293,
0.03801342844963074,
-0.0123478714376688,
-0.11413685232400894,
-0.03648551180958748,
0.2050887495279312,
0.09129378199577332,
-0.04460981860756874,
-0.12996786832809448,
-0.05316764488816261,
-0.1274813860654831,
-0.009886704385280609,
-0.04957202449440956,
0.038612000644207,
-0.02964339405298233,
-0.05709657073020935,
-0.06199263408780098,
-0.05706509202718735,
-0.05495917797088623,
0.03788654878735542,
0.07884906232357025,
0.03174413740634918,
-0.010780541226267815,
0.008464229293167591,
0.1045796200633049,
-0.007414979860186577,
-0.12638475000858307,
-0.033998213708400726,
-0.003313623135909438,
-0.11464202404022217,
-0.04494057223200798,
-0.011945996433496475,
-0.0007004390936344862,
0.02050386182963848,
0.15280061960220337,
0.013528289273381233,
0.08133631944656372,
0.04960763081908226,
0.010274185799062252,
-0.010458632372319698,
0.15710487961769104,
-0.04953991249203682,
-0.09704781323671341,
0.018502211198210716,
0.09181534498929977,
0.025884730741381645,
-0.02470150776207447,
-0.08998006582260132,
-0.02075163647532463,
0.128120556473732,
0.07558990269899368,
0.010585876181721687,
0.04088097810745239,
-0.05918223038315773,
-0.0368628203868866,
0.027518952265381813,
-0.13465248048305511,
0.024348249658942223,
-0.03309324011206627,
-0.05957204848527908,
-0.028081238269805908,
0.03724023327231407,
-0.015778252854943275,
-0.05581614375114441,
0.013407062739133835,
-0.08905191719532013,
-0.04043395817279816,
-0.04323505610227585,
-0.012409891933202744,
0.0019134131725877523,
-0.0553438737988472,
-0.00978251826018095,
-0.09290177375078201,
-0.18467845022678375,
-0.05962740257382393,
0.02820037119090557,
-0.07713653892278671,
-0.10292466729879379,
-0.020921289920806885,
-0.03993687778711319,
0.026861779391765594,
-0.019623853266239166,
0.06994474679231644,
-0.022788863629102707,
0.05977800861001015,
0.013036355376243591,
0.029658783227205276,
0.0933004766702652,
0.032556306570768356,
-0.08095437288284302,
0.05438085272908211,
-0.16243180632591248,
0.12272421270608902,
-0.08623961359262466,
0.055493321269750595,
-0.16561606526374817,
-0.07780873030424118,
-0.014601035974919796,
-0.03597988188266754,
0.06239146366715431,
0.14077794551849365,
-0.12612298130989075,
-0.02843005210161209,
0.13101820647716522,
-0.06693394482135773,
-0.11227861046791077,
0.10491243749856949,
0.001467116759158671,
0.027104778215289116,
0.057129211723804474,
0.14708897471427917,
0.13459192216396332,
-0.08688701689243317,
-0.005580590572208166,
0.0255990419536829,
0.06805657595396042,
0.036249130964279175,
0.08109167218208313,
-0.057088688015937805,
-0.023332646116614342,
0.02446572296321392,
-0.0824500098824501,
0.008702210150659084,
-0.06657756119966507,
-0.06762034446001053,
-0.057333558797836304,
-0.07846955955028534,
0.05791022628545761,
0.015259133651852608,
0.023712504655122757,
-0.04603356868028641,
-0.14474903047084808,
0.03879953920841217,
0.11364607512950897,
-0.06590934842824936,
0.0031900100875645876,
-0.06432144343852997,
0.08317174017429352,
-0.0435035265982151,
-0.005414620041847229,
-0.1739671230316162,
-0.14002364873886108,
0.06363031268119812,
-0.10973494499921799,
0.028527867048978806,
-0.03414054214954376,
0.04151463881134987,
0.055660929530858994,
-0.04651985689997673,
-0.03939448297023773,
-0.06424470245838165,
-0.007940676063299179,
-0.07759390026330948,
-0.1597609519958496,
-0.048419248312711716,
-0.004077424295246601,
0.15975427627563477,
-0.23300187289714813,
0.01666133664548397,
-0.00669622328132391,
0.13236483931541443,
0.0017718580784276128,
-0.07752583175897598,
0.02407086454331875,
-0.020374765619635582,
-0.025495707988739014,
-0.1326313018798828,
0.030199790373444557,
0.020991498604416847,
-0.11837562173604965,
-0.02677450142800808,
-0.14332032203674316,
0.07466199994087219,
0.06331253051757812,
0.08989084511995316,
-0.11130909621715546,
-0.06564360111951828,
-0.04870828241109848,
-0.05089370533823967,
-0.06013284996151924,
-0.031140748411417007,
0.18481393158435822,
0.00239126686938107,
0.11550723016262054,
-0.07631555944681168,
-0.07198607176542282,
0.020302370190620422,
0.01315320748835802,
-0.026167280972003937,
0.06914614140987396,
0.03293972089886665,
-0.13806061446666718,
0.07799704372882843,
0.09012817591428757,
-0.00615418516099453,
0.11411704868078232,
-0.05674075335264206,
-0.07219910621643066,
-0.04273436963558197,
0.02511242963373661,
0.007760256994515657,
0.09323110431432724,
-0.10935967415571213,
0.0029874518513679504,
0.03447984531521797,
0.009311199188232422,
0.018917573615908623,
-0.10478760302066803,
0.009326104074716568,
0.042702723294496536,
-0.04240776598453522,
0.038708046078681946,
-0.02756652422249317,
-0.022105352953076363,
0.07920718193054199,
0.03997604176402092,
0.012316899374127388,
0.035419199615716934,
-0.0014001581585034728,
-0.099310003221035,
0.18574650585651398,
-0.10060150176286697,
-0.17985640466213226,
-0.13249395787715912,
0.08280250430107117,
-0.055910587310791016,
-0.01260050106793642,
-0.0021309552248567343,
-0.06511159986257553,
-0.06432849168777466,
-0.08751632273197174,
0.015554457902908325,
-0.0693860799074173,
0.015181508846580982,
0.06601323932409286,
0.021892115473747253,
0.1062445342540741,
-0.10591516643762589,
0.016063785180449486,
0.024790078401565552,
-0.08761011809110641,
-0.029129058122634888,
0.014642994850873947,
0.10121411085128784,
0.1284756362438202,
-0.007295586634427309,
0.022571174427866936,
-0.04182460159063339,
0.18834717571735382,
-0.08592317998409271,
0.04264692962169647,
0.13147848844528198,
0.0433521494269371,
0.06333582103252411,
0.10102182626724243,
0.010070881806313992,
-0.08487226068973541,
0.0525277741253376,
0.03436897322535515,
-0.03259725496172905,
-0.2510876953601837,
-0.026198720559477806,
-0.022558467462658882,
-0.04613019898533821,
0.12564830482006073,
0.07415860891342163,
0.050347864627838135,
0.08098703622817993,
-0.02498937025666237,
0.06126977130770683,
0.004464657045900822,
0.09808297455310822,
0.0987543985247612,
0.05148119851946831,
0.09272550791501999,
-0.0208741445094347,
0.018219850957393646,
0.06677065789699554,
0.006150031462311745,
0.24026896059513092,
-0.02894698455929756,
0.16933098435401917,
0.014474056661128998,
0.13017050921916962,
-0.010182871483266354,
0.03733460232615471,
0.043835073709487915,
0.015980930998921394,
0.008739743381738663,
-0.0733436793088913,
-0.042096592485904694,
0.02953455224633217,
0.011129317805171013,
0.03576904535293579,
-0.09040635079145432,
0.07061497867107391,
0.01601351425051689,
0.23385299742221832,
0.05068830028176308,
-0.31983956694602966,
-0.0952577143907547,
0.00920899212360382,
-0.014852828346192837,
-0.08760131895542145,
0.024229034781455994,
0.08587287366390228,
-0.142414852976799,
0.08616272360086441,
-0.0525839701294899,
0.08054261654615402,
-0.07894980162382126,
-0.01886054314672947,
0.07042687386274338,
0.11069037765264511,
-0.005255799740552902,
0.10052991658449173,
-0.18076840043067932,
0.17805731296539307,
0.00384698947891593,
0.059125881642103195,
-0.04165999963879585,
0.06193364039063454,
0.0017455099150538445,
0.07833972573280334,
0.14695170521736145,
0.00752033106982708,
-0.042702410370111465,
-0.13573050498962402,
-0.13061191141605377,
0.011808337643742561,
0.10411396622657776,
-0.12746116518974304,
0.0622408464550972,
-0.06630808860063553,
-0.018820594996213913,
0.02252494916319847,
-0.08343884348869324,
-0.19479694962501526,
-0.15140193700790405,
0.025904517620801926,
-0.030184831470251083,
0.03509943187236786,
-0.08804275095462799,
-0.10260793566703796,
-0.045652542263269424,
0.23282557725906372,
-0.03426933288574219,
-0.07662779092788696,
-0.16807019710540771,
0.0925927385687828,
0.12373707443475723,
-0.07370166480541229,
0.0517612025141716,
-0.004546452313661575,
0.15439656376838684,
0.053566981106996536,
-0.08296126127243042,
0.08317913115024567,
-0.07118802517652512,
-0.18325436115264893,
-0.06420411169528961,
0.1483478993177414,
0.024341022595763206,
0.03516378626227379,
0.009581434540450573,
0.004896104335784912,
0.03263711929321289,
-0.09743457287549973,
0.017379742115736008,
0.08690805733203888,
0.06045849621295929,
0.0458994023501873,
-0.03193116560578346,
0.040965136140584946,
-0.04406847059726715,
-0.026868879795074463,
0.10634513944387436,
0.1986619234085083,
-0.09534949064254761,
0.06390941888093948,
0.05980999022722244,
-0.07729164510965347,
-0.16365216672420502,
0.0354473777115345,
0.12382375448942184,
0.015760749578475952,
0.049953024834394455,
-0.16823653876781464,
0.10450002551078796,
0.09676463156938553,
-0.03294618800282478,
0.0112740658223629,
-0.2857966423034668,
-0.1453900784254074,
0.05489727854728699,
0.0896565318107605,
-0.03476610407233238,
-0.1510186344385147,
-0.08333504945039749,
-0.028483036905527115,
-0.07949896901845932,
0.06004525348544121,
0.0032169788610190153,
0.08021263033151627,
-0.003168395720422268,
0.012164073064923286,
0.036728277802467346,
-0.027652516961097717,
0.14579913020133972,
0.03914579004049301,
0.032117124646902084,
-0.06407660245895386,
0.0636853501200676,
0.10457588732242584,
-0.09358488768339157,
0.06904809176921844,
-0.023752646520733833,
0.10420391708612442,
-0.13380007445812225,
-0.031112773343920708,
-0.03887680917978287,
0.07779082655906677,
-0.061937686055898666,
-0.053912874311208725,
-0.048887766897678375,
0.04911363497376442,
0.058224666863679886,
-0.030856585130095482,
0.07076075673103333,
0.022751616314053535,
0.06301963329315186,
0.11430276185274124,
0.09513309597969055,
0.02760300785303116,
-0.1840512454509735,
-0.014862225390970707,
-0.024590937420725822,
0.033869750797748566,
-0.15521466732025146,
0.024939827620983124,
0.10148854553699493,
0.04721536859869957,
0.10279443860054016,
0.025002622976899147,
-0.06635496020317078,
-0.0052501130849123,
0.039764802902936935,
-0.08301941305398941,
-0.19241799414157867,
-0.06380587071180344,
-0.03818615525960922,
-0.14959795773029327,
0.02710266225039959,
0.09819632768630981,
-0.05407669395208359,
-0.006326400209218264,
-0.011817258782684803,
0.020651280879974365,
0.01835724152624607,
0.1548624038696289,
0.038397181779146194,
0.08469854295253754,
-0.06940390169620514,
0.1425810605287552,
0.10150635987520218,
-0.06749074161052704,
0.049694739282131195,
0.046933479607105255,
-0.102723628282547,
-0.018985837697982788,
0.05745216831564903,
0.0749911516904831,
-0.020452478900551796,
-0.03166558966040611,
-0.05067644640803337,
-0.07923899590969086,
0.04142327234148979,
0.0239252969622612,
0.036162640899419785,
-0.0032743671908974648,
-0.004884920548647642,
0.003637971356511116,
-0.11105705797672272,
0.10088057816028595,
0.060015007853507996,
0.09074059873819351,
-0.18938703835010529,
0.03313926234841347,
0.023277444764971733,
0.05225541442632675,
-0.023990469053387642,
-0.002829326782375574,
-0.08436840027570724,
-0.043411772698163986,
-0.05570339784026146,
0.014028361067175865,
-0.03401743248105049,
-0.0008929140749387443,
-0.02134840004146099,
-0.04659292846918106,
-0.028256071731448174,
0.0577169805765152,
-0.04061131551861763,
-0.09222865104675293,
0.02104930952191353,
0.07740692049264908,
-0.10565076023340225,
0.008321491070091724,
0.03436578810214996,
-0.11783064156770706,
0.12737980484962463,
0.02421245165169239,
0.042121678590774536,
0.0012821786804124713,
-0.06111426278948784,
0.047869857400655746,
0.025479687377810478,
0.018064815551042557,
0.02579062059521675,
-0.11300110816955566,
0.005229638423770666,
-0.04725264757871628,
0.007649057544767857,
-0.0041076247580349445,
0.024419518187642097,
-0.13411180675029755,
-0.044451985508203506,
-0.06422889977693558,
-0.012267706915736198,
-0.06124279648065567,
0.04882718250155449,
0.06696268171072006,
-0.00021558043954428285,
0.12287982553243637,
-0.06859362870454788,
0.024196892976760864,
-0.22541479766368866,
-0.026246700435876846,
-0.00048186711501330137,
-0.013362032361328602,
-0.05623805150389671,
-0.02572023496031761,
0.07615850865840912,
-0.04002625122666359,
0.09658872336149216,
-0.013434577733278275,
0.1262248456478119,
0.03165357932448387,
-0.003461721120402217,
0.04095018282532692,
0.02153131179511547,
0.16310946643352509,
0.060694754123687744,
-0.0012092667166143656,
0.09521124511957169,
-0.006586316041648388,
0.06439860910177231,
-0.01454867236316204,
0.10661143064498901,
0.10079221427440643,
-0.08901070058345795,
0.06011224165558815,
0.03907083347439766,
-0.10532654821872711,
-0.19599410891532898,
0.12348436564207077,
-0.04945871978998184,
0.13667936623096466,
-0.03799952566623688,
0.09634946286678314,
0.12337615340948105,
-0.14851191639900208,
0.03126435726881027,
-0.033705007284879684,
-0.11043503880500793,
-0.10254435986280441,
-0.14058968424797058,
-0.08315553516149521,
-0.14951203763484955,
0.013541432097554207,
-0.10318127274513245,
-0.002259930595755577,
0.08257725834846497,
0.005695746745914221,
-0.0026719137094914913,
0.13927261531352997,
-0.0013250933261588216,
-0.01729537732899189,
0.07296911627054214,
0.0213975477963686,
-0.01619580015540123,
-0.04196443781256676,
-0.05942520499229431,
0.03650446981191635,
0.03319038078188896,
0.07544063776731491,
-0.027328578755259514,
0.015510366298258305,
0.06008695811033249,
0.0075032939203083515,
-0.07614874839782715,
0.012451373040676117,
0.008402219973504543,
-0.02344728447496891,
0.0009689584840089083,
0.036923449486494064,
-0.024467188864946365,
-0.04675808548927307,
0.2838342487812042,
-0.05702243745326996,
-0.017895152792334557,
-0.11713790893554688,
0.15209127962589264,
0.015679679811000824,
-0.003222170053049922,
0.06270997226238251,
-0.12129414081573486,
0.006038219202309847,
0.14614684879779816,
0.11542361229658127,
0.0030360929667949677,
-0.02565278857946396,
0.000780292961280793,
-0.027234066277742386,
-0.042485665529966354,
0.1305771768093109,
0.08582593500614166,
-0.0028731715865433216,
-0.022376099601387978,
0.02726288139820099,
0.012949425727128983,
-0.04451185092329979,
-0.09483785182237625,
0.09228820353746414,
0.002094171242788434,
0.02896706759929657,
-0.021414417773485184,
0.09607537090778351,
-0.007761466782540083,
-0.18132880330085754,
0.03562406077980995,
-0.1318293958902359,
-0.1794937700033188,
-0.03744867071509361,
0.050479959696531296,
-0.019671322777867317,
0.04633232578635216,
0.018763547763228416,
0.0016176598146557808,
0.13198652863502502,
-0.005769798997789621,
-0.08465518802404404,
-0.06619071960449219,
0.04352771118283272,
-0.0671902745962143,
0.25473588705062866,
-0.0017804389353841543,
0.02165490947663784,
0.1185266450047493,
0.007178347557783127,
-0.1773214489221573,
0.018121058121323586,
0.07204939424991608,
-0.010446595028042793,
0.09316956251859665,
0.14462131261825562,
-0.011136230081319809,
0.08383318036794662,
0.05057914927601814,
-0.06699953973293304,
-0.03636815398931503,
-0.06353794783353806,
0.03292745724320412,
-0.10439269989728928,
0.023936772719025612,
-0.06395505368709564,
0.1628597229719162,
0.14572711288928986,
-0.06916340440511703,
-0.0224101934581995,
-0.07034825533628464,
0.030045751482248306,
0.027698485180735588,
0.11592889577150345,
0.025061191990971565,
-0.17413370311260223,
0.017222294583916664,
-0.004734685178846121,
0.0463193878531456,
-0.2215971052646637,
-0.08619657903909683,
0.03070194646716118,
-0.046795837581157684,
-0.06754317879676819,
0.13692520558834076,
0.05660642310976982,
0.014765764586627483,
-0.03595487400889397,
-0.10152004659175873,
-0.04782550781965256,
0.1468474119901657,
-0.17979532480239868,
-0.05279823765158653
] |
null | null | transformers |
# DPOB-NMTOB-7B
DPOB-NMTOB-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B)
* [paulml/OmniBeagleSquaredMBX-v3-7B-v2](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B-v2)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: eren23/dpo-binarized-NeutrixOmnibe-7B
        layer_range: [0, 32]
      - model: paulml/OmniBeagleSquaredMBX-v3-7B-v2
        layer_range: [0, 32]
merge_method: slerp
base_model: eren23/dpo-binarized-NeutrixOmnibe-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
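To reproduce the merge outside the LazyMergekit notebook, the configuration above can be passed to mergekit's `mergekit-yaml` command. The notebook-style sketch below is hedged: the install source, config filename, output path, and flags are assumptions and may differ across mergekit versions.

```python
# Hedged, notebook-style sketch (not from the original card): run the merge with mergekit.
# The config above is assumed to be saved as config.yaml; the output path is a placeholder.
!pip install -q git+https://github.com/arcee-ai/mergekit.git
!mergekit-yaml config.yaml ./DPOB-NMTOB-7B --copy-tokenizer
```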
## 💻 Usage
```python
# Install inference dependencies (notebook-style command, as in the original card)
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "paulml/DPOB-NMTOB-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Build a text-generation pipeline and sample a response
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "cc-by-nc-4.0", "tags": ["merge", "mergekit", "lazymergekit", "eren23/dpo-binarized-NeutrixOmnibe-7B", "paulml/OmniBeagleSquaredMBX-v3-7B-v2"], "base_model": ["eren23/dpo-binarized-NeutrixOmnibe-7B", "paulml/OmniBeagleSquaredMBX-v3-7B-v2"]} | text-generation | paulml/DPOB-NMTOB-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"eren23/dpo-binarized-NeutrixOmnibe-7B",
"paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"base_model:eren23/dpo-binarized-NeutrixOmnibe-7B",
"base_model:paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:56:12+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DPOB-NMTOB-7B
DPOB-NMTOB-7B is a merge of the following models using LazyMergekit:
* eren23/dpo-binarized-NeutrixOmnibe-7B
* paulml/OmniBeagleSquaredMBX-v3-7B-v2
## Configuration
## Usage
| [
"# DPOB-NMTOB-7B\n\nDPOB-NMTOB-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DPOB-NMTOB-7B\n\nDPOB-NMTOB-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2",
"## Configuration",
"## Usage"
] | [
161,
73,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #paulml/OmniBeagleSquaredMBX-v3-7B-v2 #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-paulml/OmniBeagleSquaredMBX-v3-7B-v2 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DPOB-NMTOB-7B\n\nDPOB-NMTOB-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* paulml/OmniBeagleSquaredMBX-v3-7B-v2## Configuration## Usage"
] | [
-0.0700288861989975,
0.13101939857006073,
-0.007630344945937395,
0.0096651092171669,
0.04472579434514046,
0.0464479960501194,
0.13631251454353333,
0.09762877225875854,
-0.01983976550400257,
0.05760360136628151,
0.06863360106945038,
0.1551894098520279,
0.03122086450457573,
0.1600867360830307,
-0.07474586367607117,
-0.18136487901210785,
0.04245267063379288,
0.03789912164211273,
-0.07865024358034134,
0.06048004329204559,
0.109719417989254,
-0.030764412134885788,
0.09091831743717194,
0.009625639766454697,
-0.06877560913562775,
0.022049643099308014,
-0.04652887210249901,
-0.017245030030608177,
0.07158546149730682,
0.0630403608083725,
0.04730702191591263,
0.060255151242017746,
-0.032662954181432724,
-0.16612491011619568,
0.02374090813100338,
0.014627468772232533,
-0.023998968303203583,
0.059778545051813126,
0.09894116967916489,
-0.0212873388081789,
0.05128368362784386,
-0.104557104408741,
0.017083704471588135,
0.04272499307990074,
-0.0712304338812828,
-0.10715372115373611,
-0.0980730727314949,
0.1134621649980545,
0.028242487460374832,
0.010204906575381756,
-0.011472372338175774,
0.12603454291820526,
0.022773195058107376,
0.08931642770767212,
0.15121078491210938,
-0.2915601432323456,
-0.010507572442293167,
0.14820167422294617,
0.05536716431379318,
-0.03287456929683685,
0.0347476489841938,
0.026282574981451035,
-0.0029520487878471613,
0.012636984698474407,
0.035955678671598434,
-0.09004991501569748,
0.12395207583904266,
-0.05380908027291298,
-0.1348847895860672,
0.00955455657094717,
0.12798947095870972,
0.034888334572315216,
-0.021974094212055206,
-0.09954207390546799,
-0.08279059082269669,
0.06420400738716125,
-0.038308240473270416,
-0.06807953864336014,
0.03211536630988121,
-0.029195306822657585,
0.03136579692363739,
-0.06580331921577454,
-0.015108968131244183,
-0.007331569213420153,
-0.09339793026447296,
0.1396394968032837,
0.02179851196706295,
-0.008537132292985916,
-0.031328920274972916,
0.06360634416341782,
-0.18985851109027863,
-0.10583727806806564,
-0.015589135698974133,
-0.05076167732477188,
0.012012019753456116,
-0.04027969762682915,
-0.07563763111829758,
-0.10459470003843307,
0.12014786154031754,
0.2476184368133545,
-0.08030465245246887,
0.03507530689239502,
0.025951988995075226,
0.032351553440093994,
-0.0017936811782419682,
-0.021304666996002197,
-0.12238951027393341,
-0.17399314045906067,
0.07572001218795776,
0.05100587010383606,
0.06740201264619827,
0.03513428568840027,
-0.08824377506971359,
-0.07207830995321274,
0.05836918577551842,
0.024864938110113144,
0.07395405322313309,
0.10658907890319824,
-0.044962674379348755,
-0.07142399251461029,
0.2007422298192978,
-0.08917311578989029,
0.002442786702886224,
-0.013299047015607357,
-0.03604812175035477,
0.00839983019977808,
0.06983791291713715,
0.036761168390512466,
-0.028778918087482452,
0.06921027600765228,
-0.046035367995500565,
-0.04741722717881203,
-0.018590044230222702,
-0.0980897843837738,
0.022650476545095444,
-0.03135940432548523,
-0.008401532657444477,
-0.12325900793075562,
-0.19936347007751465,
-0.005290307570248842,
0.013281824998557568,
-0.02059868909418583,
-0.021877815946936607,
-0.05227058380842209,
0.019809558987617493,
-0.01689460501074791,
0.0030675467569381,
0.0496266670525074,
0.021568870171904564,
0.004035433288663626,
0.044158872216939926,
0.05015050247311592,
-0.1307806521654129,
0.015050076879560947,
-0.06441865861415863,
0.09931931644678116,
-0.17603318393230438,
0.04833179712295532,
-0.0353863462805748,
-0.0025204590056091547,
-0.13537177443504333,
-0.015981953591108322,
-0.04840699955821037,
0.0023934331256896257,
0.0812489241361618,
0.11904480308294296,
-0.08278422057628632,
-0.07675757259130478,
0.13331717252731323,
-0.0933956503868103,
-0.1175377294421196,
0.07589209824800491,
0.023892855271697044,
-0.002949342131614685,
0.04645198956131935,
0.15192535519599915,
0.12957338988780975,
-0.017724089324474335,
-0.06539086252450943,
0.0020206659100949764,
0.006497013848274946,
0.027459044009447098,
0.06253118813037872,
-0.03934279829263687,
-0.03230287507176399,
0.03957280144095421,
0.0007082619122229517,
0.02938953973352909,
-0.01942293345928192,
-0.04091973602771759,
-0.05367165431380272,
-0.08614141494035721,
0.09781656414270401,
-0.06791789084672928,
0.012020932510495186,
-0.06856219470500946,
-0.03876421973109245,
0.052433937788009644,
0.12286354601383209,
-0.03946549445390701,
0.011969187296926975,
-0.06710757315158844,
0.08811390399932861,
-0.09674973785877228,
0.05261287838220596,
-0.12190650403499603,
-0.07056029886007309,
0.017154786735773087,
-0.08450411260128021,
0.016356268897652626,
-0.042766641825437546,
0.08314302563667297,
0.03143616020679474,
-0.06904980540275574,
-0.03679494932293892,
0.08786073327064514,
0.021557264029979706,
-0.01848018914461136,
-0.10289018601179123,
-0.04046658053994179,
-0.05348639562726021,
0.1691770851612091,
-0.10273962467908859,
0.04740164428949356,
-0.022286348044872284,
0.18910396099090576,
0.007356914691627026,
-0.0045986175537109375,
0.03484614938497543,
0.032575760036706924,
-0.018194524571299553,
-0.010351988486945629,
0.06778247654438019,
0.002237909007817507,
-0.1170487180352211,
0.0802091434597969,
-0.12742382287979126,
0.08849417418241501,
0.0938921794295311,
0.0002767735568340868,
-0.037086378782987595,
-0.0736382007598877,
0.0011992137879133224,
-0.06108984723687172,
0.09058669954538345,
-0.07919081300497055,
0.02712107077240944,
0.028353532776236534,
0.11707336455583572,
-0.06406093388795853,
-0.0411185584962368,
0.007107602898031473,
-0.03142275661230087,
-0.06865277886390686,
0.07483270019292831,
0.0032753057312220335,
-0.22366252541542053,
0.10336004197597504,
0.13050448894500732,
-0.010821541771292686,
0.10359756648540497,
0.03649303317070007,
-0.004894604440778494,
-0.07380706071853638,
0.015071433037519455,
0.02942427434027195,
-0.04485403001308441,
-0.054965995252132416,
0.05457787215709686,
0.07513195276260376,
0.01168370246887207,
0.06713118404150009,
-0.04197066277265549,
0.03210986405611038,
0.01959245093166828,
-0.010583128780126572,
0.07826518267393112,
0.1082029640674591,
0.005837312899529934,
0.07630987465381622,
0.01164825726300478,
-0.039135780185461044,
0.034858979284763336,
-0.00008084702130872756,
-0.08344265073537827,
0.14865002036094666,
-0.13600338995456696,
-0.23684118688106537,
-0.13448452949523926,
-0.07904940843582153,
-0.12941838800907135,
-0.009250223636627197,
0.03123481199145317,
0.015705013647675514,
-0.038788385689258575,
-0.08809051662683487,
0.033414579927921295,
-0.025603512302041054,
-0.021262390539050102,
-0.024540890008211136,
0.007260502781718969,
0.04667429253458977,
-0.11382077634334564,
-0.024856459349393845,
0.018755223602056503,
-0.01700948178768158,
0.06627119332551956,
-0.04787330701947212,
0.07984452694654465,
0.07949431985616684,
0.018070360645651817,
-0.014926539734005928,
-0.009370798245072365,
0.21098408102989197,
-0.019688451662659645,
0.04830764979124069,
0.17790792882442474,
-0.0410049706697464,
0.07368176430463791,
0.13005749881267548,
0.013891027309000492,
-0.04045775160193443,
-0.004576522391289473,
-0.0064500547014176846,
0.004392718430608511,
-0.18562743067741394,
-0.12061959505081177,
-0.047410666942596436,
0.026328466832637787,
0.053452182561159134,
0.05119827762246132,
0.043309710919857025,
0.07291257381439209,
-0.06523186713457108,
0.05073457956314087,
0.039429109543561935,
0.0739603191614151,
0.23343347012996674,
0.010569272562861443,
0.0894397720694542,
-0.0174289271235466,
-0.04723834618926048,
0.0561026968061924,
0.09815271943807602,
0.04011154919862747,
0.04782941937446594,
0.1471860557794571,
0.04410790279507637,
0.01719677448272705,
0.029024476185441017,
0.069687120616436,
-0.013971736654639244,
-0.000524777511600405,
-0.015978749841451645,
-0.09006130695343018,
0.03249302878975868,
0.014258300885558128,
0.019260631874203682,
0.041887518018484116,
-0.0335262157022953,
-0.013221386820077896,
0.06990453600883484,
0.10258617252111435,
0.06263595819473267,
-0.2616487145423889,
-0.024362929165363312,
0.017088154330849648,
0.03289616480469704,
-0.03393465653061867,
-0.05190405622124672,
-0.0012062949826940894,
-0.0681077167391777,
0.1517169028520584,
-0.030878160148859024,
0.06187208741903305,
-0.06566095352172852,
0.015980195254087448,
-0.019584525376558304,
0.1419631391763687,
0.001154324971139431,
0.05164266377687454,
-0.2470046579837799,
0.07918602973222733,
0.05329006537795067,
-0.00803426280617714,
0.004071385134011507,
0.04206324368715286,
0.017588134855031967,
0.10819041728973389,
0.07485869526863098,
-0.003883359720930457,
0.10411636531352997,
-0.03340093046426773,
-0.09257587790489197,
-0.012869314290583134,
0.08585511893033981,
-0.0407831110060215,
0.07414906471967697,
-0.008691729977726936,
-0.05966835096478462,
0.00434085913002491,
0.08413704484701157,
-0.18157140910625458,
-0.11570794880390167,
0.10645129531621933,
0.08324339240789413,
0.011210373602807522,
-0.06977478414773941,
-0.024452993646264076,
-0.15257219970226288,
0.2654778063297272,
-0.039662715047597885,
-0.06923391669988632,
-0.07801792025566101,
-0.011111072264611721,
0.13331103324890137,
-0.08193700760602951,
0.06958470493555069,
-0.03980129584670067,
0.07111210376024246,
-0.09207763522863388,
-0.13972827792167664,
0.056058336049318314,
-0.0898083820939064,
-0.08333522826433182,
-0.042794495820999146,
0.11267375200986862,
-0.0428658127784729,
-0.0018034280510619283,
0.009219508618116379,
0.015829240903258324,
-0.017148863524198532,
-0.041170116513967514,
0.02462667226791382,
0.10908957570791245,
0.05563029274344444,
0.10315833240747452,
-0.05571494251489639,
-0.08428090065717697,
-0.03019733726978302,
-0.007525186520069838,
0.1501307487487793,
0.2948251962661743,
-0.0073762210085988045,
0.0019953271839767694,
0.12508133053779602,
-0.050820015370845795,
-0.17622213065624237,
-0.07124331593513489,
0.04030128940939903,
-0.00104161212220788,
0.020305199548602104,
-0.10057050734758377,
0.04271165281534195,
0.12338295578956604,
0.004389097914099693,
0.12595227360725403,
-0.25765565037727356,
-0.12650901079177856,
0.1009855642914772,
0.04939362406730652,
0.06895678490400314,
-0.100592702627182,
-0.08478566259145737,
-0.08493136614561081,
-0.18890629708766937,
0.09807465225458145,
0.007197852246463299,
0.07355660200119019,
-0.03860250487923622,
0.05806640908122063,
0.02182409167289734,
-0.04118029773235321,
0.12419018149375916,
-0.023087365552783012,
0.021342841908335686,
-0.06562007963657379,
-0.05269318446516991,
0.03736886754631996,
-0.05978880077600479,
0.07428120076656342,
-0.09989577531814575,
0.04938378557562828,
0.0027599020395427942,
-0.03510601446032524,
-0.050793129950761795,
0.0709698498249054,
-0.037369582802057266,
-0.06642791628837585,
-0.019358031451702118,
0.04102646932005882,
0.0010237025562673807,
0.019797910004854202,
0.06796155869960785,
-0.036019206047058105,
0.0254160538315773,
0.22557762265205383,
0.07733332365751266,
-0.034823037683963776,
-0.0243686530739069,
-0.0045594098046422005,
-0.058086518198251724,
0.03837861120700836,
0.025376316159963608,
0.017681816592812538,
0.08833319693803787,
-0.0022308158222585917,
0.080185666680336,
0.035622525960206985,
-0.08388544619083405,
-0.052456922829151154,
0.06915348023176193,
-0.15600883960723877,
-0.0860283151268959,
-0.03402946516871452,
-0.030645975843071938,
-0.05506078153848648,
0.07173854112625122,
0.21493108570575714,
0.002850462682545185,
-0.03223983198404312,
0.03549510985612869,
0.005261322483420372,
-0.08167193830013275,
0.14953382313251495,
0.03429945558309555,
0.036531124264001846,
-0.060367267578840256,
0.05576138570904732,
0.03562692180275917,
-0.042467281222343445,
-0.020420586690306664,
0.054818958044052124,
-0.09209351986646652,
-0.07817131280899048,
-0.09356258809566498,
0.14942197501659393,
-0.042806003242731094,
-0.03831474110484123,
-0.07817164808511734,
-0.054098743945360184,
0.038955941796302795,
0.09228501468896866,
0.05317853018641472,
0.02265683002769947,
0.019529251381754875,
-0.03719807416200638,
-0.0406557060778141,
0.08212734013795853,
0.022026026621460915,
0.12020640075206757,
-0.07898896932601929,
0.03507108613848686,
-0.058244042098522186,
0.026343844830989838,
-0.019542155787348747,
0.031008241698145866,
-0.14263544976711273,
-0.05490558221936226,
-0.12743902206420898,
-0.024056825786828995,
-0.1195964440703392,
-0.012607625685632229,
-0.01327530201524496,
0.02879476733505726,
-0.007025564089417458,
0.007819782942533493,
-0.02190386690199375,
-0.07011666148900986,
-0.01397263165563345,
0.05014440044760704,
-0.07765541225671768,
-0.02227831445634365,
-0.007160739973187447,
-0.05399123579263687,
0.07350729405879974,
0.007438981905579567,
-0.0023013532627373934,
-0.03800078108906746,
-0.12328445166349411,
-0.06987374275922775,
0.039924073964357376,
0.016483129933476448,
0.010711519978940487,
-0.12408656626939774,
-0.009411566890776157,
0.01552923209965229,
-0.07172808796167374,
-0.025427954271435738,
0.101021409034729,
-0.09412825852632523,
0.0039042211137712,
-0.05128448083996773,
-0.02140476368367672,
-0.056653570383787155,
-0.01864459179341793,
0.03858174383640289,
0.04592482000589371,
0.10427823662757874,
-0.0420723594725132,
0.02807486243546009,
-0.16290900111198425,
-0.014400752261281013,
0.0009988532401621342,
-0.11458484828472137,
0.051016345620155334,
0.007328107953071594,
0.03575359657406807,
-0.011237443424761295,
0.14362099766731262,
-0.04661547765135765,
-0.1038476899266243,
0.003002523211762309,
-0.04002145677804947,
-0.03354838117957115,
0.023981530219316483,
0.1817852109670639,
0.06130991131067276,
-0.01854257471859455,
-0.07570558041334152,
0.0830281600356102,
0.04990590363740921,
0.06647617369890213,
0.02824581041932106,
0.09806525707244873,
0.010257161222398281,
0.06642604619264603,
0.08757482469081879,
-0.0358242429792881,
-0.028374776244163513,
-0.0005557897966355085,
0.035178352147340775,
0.06554385274648666,
-0.005033078137785196,
0.105176642537117,
0.13492031395435333,
-0.12880398333072662,
0.04585779085755348,
0.03216732665896416,
-0.021371180191636086,
-0.08572189509868622,
-0.19626478850841522,
-0.12165272235870361,
-0.08505824208259583,
-0.018107762560248375,
-0.11547882109880447,
-0.02849475108087063,
0.0038611942436546087,
0.014989721588790417,
-0.011906785890460014,
0.06588488072156906,
-0.04158971086144447,
-0.04747011139988899,
0.032540470361709595,
-0.017009537667036057,
-0.03106754831969738,
0.029595645144581795,
-0.049697939306497574,
0.019452307373285294,
0.03827403858304024,
0.02951568365097046,
-0.009127496741712093,
0.017679404467344284,
0.0447504036128521,
-0.03952758386731148,
-0.07805929332971573,
-0.024012817069888115,
0.04569143056869507,
0.014260200783610344,
0.08086098730564117,
0.02137523517012596,
-0.04042978957295418,
-0.011281074956059456,
0.07544073462486267,
-0.00941966101527214,
-0.11897312849760056,
-0.05030365660786629,
0.14393344521522522,
0.01429519522935152,
0.07741650938987732,
0.03049566224217415,
-0.06271576136350632,
0.00695673655718565,
0.13320046663284302,
0.28287357091903687,
-0.04429011046886444,
0.016614258289337158,
0.05363575369119644,
0.006204730365425348,
0.04212870076298714,
0.03944152221083641,
0.07052493840456009,
0.16862709820270538,
-0.021892406046390533,
0.05914447456598282,
-0.021960480138659477,
-0.008471331559121609,
-0.055875927209854126,
0.02699134312570095,
0.025591576471924782,
-0.007695808075368404,
0.03827335312962532,
0.06136376038193703,
-0.05897996947169304,
0.00041947118006646633,
-0.01767653413116932,
-0.11254875361919403,
-0.11401654779911041,
-0.06790714710950851,
-0.0018088612705469131,
-0.019767317920923233,
0.0574491061270237,
-0.039894066751003265,
-0.04595967382192612,
0.08572481572628021,
-0.044787466526031494,
-0.051480118185281754,
-0.08857391774654388,
0.005761673673987389,
-0.019479168578982353,
0.062068842351436615,
-0.011256330646574497,
0.07456950098276138,
0.1127789169549942,
0.014776607975363731,
-0.09150412678718567,
0.076192706823349,
0.014284505508840084,
-0.02975548431277275,
0.0826396569609642,
0.06732393056154251,
-0.037186916917562485,
0.09239601343870163,
0.03444669768214226,
-0.12607501447200775,
0.03274057060480118,
0.11921963095664978,
-0.035092148929834366,
-0.06981410086154938,
0.07934170216321945,
-0.07053455710411072,
0.12115667760372162,
0.136790931224823,
-0.03465310484170914,
0.014705094508826733,
-0.03464742377400398,
0.05016043409705162,
0.08580771833658218,
0.08376753330230713,
-0.04795388504862785,
-0.1629217267036438,
0.022112617269158363,
0.03649982437491417,
0.014795495197176933,
-0.21284253895282745,
-0.0692533403635025,
-0.09659704566001892,
-0.022215723991394043,
-0.06595844030380249,
0.05137772485613823,
0.10507724434137344,
0.006004386115819216,
-0.03886988386511803,
-0.08879377692937851,
-0.03232048079371452,
0.08299284428358078,
-0.135257750749588,
-0.10493389517068863
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
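The section above is left as a placeholder in the original card. As a hedged illustration only, a minimal way to load a GPT-2-style text-generation checkpoint from the Hub with 🤗 Transformers might look like the sketch below; the repository id is taken from this record's `id` field, and the prompt and generation settings are made-up assumptions for demonstration.

```python
# Minimal sketch (not from the original card): load the checkpoint with the
# standard transformers text-generation pipeline. The prompt and generation
# parameters are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "Mlxa/atd-gpt2-medium"  # repository id as listed in this record

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("Once upon a time", max_new_tokens=50)[0]["generated_text"])
```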
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
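The comment above lists the usual precision regimes (fp32, fp16/bf16 mixed precision, and so on). Purely as an illustration of how such a regime is requested with 🤗 Transformers, and not as a statement of how this model was actually trained, a bf16 mixed-precision setup could be declared like this:

```python
# Illustration only: requesting bf16 mixed precision via TrainingArguments.
# The output directory is a hypothetical placeholder, and nothing here is
# taken from this model's (undocumented) training run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",  # hypothetical path
    bf16=True,             # bf16 mixed precision; use fp16=True on GPUs without bf16 support
)
```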
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | Mlxa/atd-gpt2-medium | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T11:57:52+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #gpt2 #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #gpt2 #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
57,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #gpt2 #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05622259899973869,
0.16002345085144043,
-0.004987028427422047,
0.023115945979952812,
0.0962471067905426,
0.011845538392663002,
0.06785304099321365,
0.11496778577566147,
-0.020396295934915543,
0.11142492294311523,
0.03292480856180191,
0.0972127765417099,
0.11474913358688354,
0.16215258836746216,
0.004439093638211489,
-0.23455148935317993,
0.04782992601394653,
-0.12695099413394928,
-0.033447545021772385,
0.11785799264907837,
0.14491069316864014,
-0.10402194410562515,
0.07766910642385483,
-0.030544815585017204,
-0.009361269883811474,
-0.03290390968322754,
-0.06365230679512024,
-0.05152205005288124,
0.05037128925323486,
0.06932847946882248,
0.06591591984033585,
0.007509593386203051,
0.09122733771800995,
-0.2655104100704193,
0.02280162274837494,
0.07630051672458649,
-0.0015554219717159867,
0.07497020810842514,
0.048351652920246124,
-0.08209776133298874,
0.0788840726017952,
-0.05696587264537811,
0.14718368649482727,
0.08216129243373871,
-0.08924587815999985,
-0.1965435892343521,
-0.08464295417070389,
0.10284840315580368,
0.18357418477535248,
0.05158785358071327,
-0.024141347035765648,
0.10476154088973999,
-0.08419200032949448,
0.008797040209174156,
0.06024181470274925,
-0.06443428993225098,
-0.05412506312131882,
0.06934051215648651,
0.07975570857524872,
0.07967228442430496,
-0.13025140762329102,
-0.014651902951300144,
0.011243549175560474,
0.007594773545861244,
0.08504551649093628,
0.022028017789125443,
0.14595499634742737,
0.04393624886870384,
-0.13030564785003662,
-0.044304780662059784,
0.09771761298179626,
0.04345165938138962,
-0.053857799619436264,
-0.2537047266960144,
-0.024983759969472885,
-0.03927002474665642,
-0.03094942681491375,
-0.038562554866075516,
0.04431856796145439,
-0.011080716736614704,
0.08032315224409103,
-0.01118796318769455,
-0.08149448037147522,
-0.041395120322704315,
0.06544242054224014,
0.062143467366695404,
0.026896316558122635,
-0.01158317644149065,
0.00973866879940033,
0.1224486380815506,
0.10907839238643646,
-0.12763150036334991,
-0.05768941715359688,
-0.06755511462688446,
-0.08307720720767975,
-0.04300352931022644,
0.03337155282497406,
0.044020529836416245,
0.04436098039150238,
0.2466370165348053,
0.01108562108129263,
0.05453123152256012,
0.045806169509887695,
0.010608446784317493,
0.06787561625242233,
0.11606968939304352,
-0.062306761741638184,
-0.09178462624549866,
-0.029058339074254036,
0.09215214103460312,
0.006741520017385483,
-0.042814407497644424,
-0.060904473066329956,
0.06479041278362274,
0.012608112767338753,
0.12110785394906998,
0.08444269746541977,
0.0026690615341067314,
-0.07305197417736053,
-0.06963318586349487,
0.18848419189453125,
-0.1598394364118576,
0.047875016927719116,
0.031182926148176193,
-0.038971830159425735,
-0.0014042917173355818,
0.008752269670367241,
0.02394084818661213,
-0.020246321335434914,
0.08923295140266418,
-0.05574449151754379,
-0.03784004598855972,
-0.11079790443181992,
-0.03252100944519043,
0.030985163524746895,
0.0051483530551195145,
-0.027043871581554413,
-0.033837489783763885,
-0.09040277451276779,
-0.059588029980659485,
0.0922931432723999,
-0.07471107691526413,
-0.04984431713819504,
-0.013726521283388138,
-0.07691634446382523,
0.023329194635152817,
0.016799474135041237,
0.08357251435518265,
-0.02157396264374256,
0.0384126678109169,
-0.0560205839574337,
0.0631464347243309,
0.11269522458314896,
0.029363946989178658,
-0.053069718182086945,
0.05750001594424248,
-0.24315528571605682,
0.10326608270406723,
-0.07320205867290497,
0.050549428910017014,
-0.15059062838554382,
-0.026000602170825005,
0.044471126049757004,
0.00805877335369587,
-0.013138634152710438,
0.14088952541351318,
-0.21621745824813843,
-0.0323486253619194,
0.16741067171096802,
-0.0939871072769165,
-0.07602590322494507,
0.059108685702085495,
-0.05233629792928696,
0.10869261622428894,
0.04351044446229935,
-0.02232111617922783,
0.060673557221889496,
-0.14475463330745697,
-0.01067100279033184,
-0.04139741137623787,
-0.02402937039732933,
0.16397778689861298,
0.07567544281482697,
-0.06286642700433731,
0.08052356541156769,
0.024165838956832886,
-0.017831770703196526,
-0.04484899342060089,
-0.023361295461654663,
-0.10819391161203384,
0.009856974706053734,
-0.06032416597008705,
0.02424289658665657,
-0.025761527940630913,
-0.09367526322603226,
-0.02868773601949215,
-0.1802000105381012,
-0.009223134256899357,
0.0881323292851448,
-0.011722641065716743,
-0.021903391927480698,
-0.12039245665073395,
0.011948852799832821,
0.031212422996759415,
0.002984174294397235,
-0.13029038906097412,
-0.05838731303811073,
0.027675874531269073,
-0.16422230005264282,
0.03272955119609833,
-0.05597274377942085,
0.05056252330541611,
0.03445037454366684,
-0.03187771514058113,
-0.033117350190877914,
0.009550533257424831,
0.006354342680424452,
-0.010578392073512077,
-0.2502359449863434,
-0.02440580166876316,
-0.0219739843159914,
0.17386503517627716,
-0.21793730556964874,
0.04213962331414223,
0.07686693966388702,
0.14929872751235962,
0.006240781396627426,
-0.038500864058732986,
0.010139784775674343,
-0.08222103863954544,
-0.030560437589883804,
-0.0643099993467331,
-0.012082485482096672,
-0.03717579320073128,
-0.05608142167329788,
0.05165567249059677,
-0.16133594512939453,
-0.028727244585752487,
0.1057019829750061,
0.06860516220331192,
-0.14001330733299255,
-0.019125886261463165,
-0.04171464592218399,
-0.043496038764715195,
-0.05877087265253067,
-0.0552728995680809,
0.1185101792216301,
0.05596614256501198,
0.04696191847324371,
-0.06956122815608978,
-0.07775315642356873,
0.007865429855883121,
-0.017090093344449997,
-0.017978519201278687,
0.08920905739068985,
0.07311701774597168,
-0.12023317068815231,
0.09247473627328873,
0.10194233059883118,
0.09365488588809967,
0.108615942299366,
-0.017981963232159615,
-0.08929306268692017,
-0.04584396257996559,
0.02045595459640026,
0.013332244008779526,
0.14797501266002655,
-0.01403066236525774,
0.056954506784677505,
0.03922648727893829,
-0.01123172789812088,
0.012020308524370193,
-0.09384570270776749,
0.027314940467476845,
0.034342724829912186,
-0.020308034494519234,
0.03796098753809929,
-0.04001156985759735,
0.019826533272862434,
0.08712323755025864,
0.04676510766148567,
0.04415108636021614,
0.011758276261389256,
-0.04233846068382263,
-0.10904491692781448,
0.173858180642128,
-0.12615609169006348,
-0.24583272635936737,
-0.14115718007087708,
0.0015609683468937874,
0.04152948409318924,
-0.009671499952673912,
0.003867273684591055,
-0.07054664939641953,
-0.11710625886917114,
-0.0934595838189125,
0.018713686615228653,
0.04491026699542999,
-0.07426843047142029,
-0.0596279613673687,
0.059872306883335114,
0.03894329443573952,
-0.14430272579193115,
0.022237464785575867,
0.047419775277376175,
-0.09032250195741653,
-0.006925572175532579,
0.08398029953241348,
0.06729988008737564,
0.17764869332313538,
0.009659109637141228,
-0.021044570952653885,
0.03080335259437561,
0.21258224546909332,
-0.14283664524555206,
0.11252175271511078,
0.14021345973014832,
-0.09024007618427277,
0.08099348843097687,
0.1948828399181366,
0.039186809211969376,
-0.10478170961141586,
0.03259138762950897,
0.02489176020026207,
-0.028939135372638702,
-0.25018003582954407,
-0.0680207833647728,
0.002590036718174815,
-0.04892077296972275,
0.07092583924531937,
0.0918794497847557,
0.09946957975625992,
0.015428726561367512,
-0.09732488542795181,
-0.08017807453870773,
0.0468163788318634,
0.10640767961740494,
0.0070237633772194386,
-0.01532268337905407,
0.08905128389596939,
-0.03260866180062294,
0.018378758803009987,
0.0954233929514885,
0.00412675691768527,
0.17459604144096375,
0.05586163327097893,
0.17767499387264252,
0.07751350849866867,
0.06634163856506348,
0.019167855381965637,
0.0069374511949718,
0.02067388966679573,
0.017508454620838165,
-0.004214957356452942,
-0.08522020280361176,
-0.00457410141825676,
0.12029227614402771,
0.06321834027767181,
0.024303704500198364,
0.0137604009360075,
-0.03941800817847252,
0.08438141644001007,
0.17332784831523895,
0.0020201504230499268,
-0.18486954271793365,
-0.07240456342697144,
0.07921045273542404,
-0.0910051167011261,
-0.10552998632192612,
-0.03353073075413704,
0.03346012532711029,
-0.1747758537530899,
0.02097497321665287,
-0.017018353566527367,
0.10809773951768875,
-0.13855572044849396,
-0.018670624122023582,
0.06328251957893372,
0.07232730835676193,
-0.0028869258239865303,
0.06308864802122116,
-0.153975248336792,
0.1050168052315712,
0.016289174556732178,
0.06754438579082489,
-0.09747608006000519,
0.10138221830129623,
-0.006303760688751936,
-0.007241528946906328,
0.13875643908977509,
0.010596190579235554,
-0.05694379657506943,
-0.08987913280725479,
-0.10555228590965271,
-0.008462639525532722,
0.12933635711669922,
-0.15157614648342133,
0.0847775787115097,
-0.028662750497460365,
-0.043171048164367676,
0.0024383023846894503,
-0.1199452206492424,
-0.1302652359008789,
-0.1875755488872528,
0.058235347270965576,
-0.1366453617811203,
0.039557021111249924,
-0.10582595318555832,
-0.04340389743447304,
-0.028466427698731422,
0.2041483372449875,
-0.2317875325679779,
-0.0682469978928566,
-0.1541893482208252,
-0.08429346233606339,
0.14446710050106049,
-0.04730919376015663,
0.08914490789175034,
-0.0013825427740812302,
0.19013537466526031,
0.024473950266838074,
-0.02387205697596073,
0.10308998823165894,
-0.09543927758932114,
-0.19450686872005463,
-0.08603953570127487,
0.15582145750522614,
0.13931062817573547,
0.03702725097537041,
-0.004593946039676666,
0.029260434210300446,
-0.020000332966446877,
-0.12535293400287628,
0.025526588782668114,
0.1793687790632248,
0.07859015464782715,
0.023437971249222755,
-0.025896867737174034,
-0.10993997752666473,
-0.06524094194173813,
-0.0335373692214489,
0.02718053013086319,
0.18264614045619965,
-0.07421271502971649,
0.1900695115327835,
0.13626199960708618,
-0.05445687845349312,
-0.1955246478319168,
0.018216576427221298,
0.040417760610580444,
0.010847307741641998,
0.03138056397438049,
-0.2078717201948166,
0.09027513861656189,
0.0014845491386950016,
-0.05172133818268776,
0.141556978225708,
-0.174949511885643,
-0.1512570083141327,
0.06491631269454956,
0.0364508256316185,
-0.19348180294036865,
-0.117862768471241,
-0.08817066252231598,
-0.046907443553209305,
-0.17498233914375305,
0.10519181191921234,
0.016932250931859016,
0.009516867808997631,
0.03492651879787445,
0.02640140987932682,
0.011080757714807987,
-0.03873949125409126,
0.19461296498775482,
-0.02505207620561123,
0.029532426968216896,
-0.08079101145267487,
-0.06136554479598999,
0.0607450045645237,
-0.05577658861875534,
0.07896649837493896,
-0.020188091322779655,
0.012835816480219364,
-0.1100873053073883,
-0.0468425452709198,
-0.027396185323596,
0.017321845516562462,
-0.09195652604103088,
-0.09473495930433273,
-0.05146971344947815,
0.09373841434717178,
0.08845265954732895,
-0.036603908985853195,
-0.04043547809123993,
-0.07348548620939255,
0.0325477197766304,
0.17183002829551697,
0.17659065127372742,
0.038550034165382385,
-0.08084331452846527,
-0.005880105309188366,
-0.01188716571778059,
0.04436201974749565,
-0.22519725561141968,
0.06208868324756622,
0.04557957127690315,
0.015879612416028976,
0.11362850666046143,
-0.018783990293741226,
-0.16298477351665497,
-0.06594224274158478,
0.06143777072429657,
-0.06664001196622849,
-0.18599680066108704,
0.0032026967965066433,
0.058006007224321365,
-0.1646854728460312,
-0.037671029567718506,
0.042260222136974335,
-0.0045668939128518105,
-0.04300284758210182,
0.01627597212791443,
0.08071378618478775,
0.005054219625890255,
0.07112491130828857,
0.05733523517847061,
0.0842885971069336,
-0.10417009145021439,
0.07519911974668503,
0.08007751405239105,
-0.08229218423366547,
0.031453702598810196,
0.08910130709409714,
-0.061817802488803864,
-0.03069761022925377,
0.032593827694654465,
0.07753410935401917,
0.019773589447140694,
-0.041717879474163055,
0.008655321784317493,
-0.09745000302791595,
0.06339588761329651,
0.09504765272140503,
0.03549657016992569,
0.014742289669811726,
0.034356739372015,
0.04988397657871246,
-0.07460241764783859,
0.11766603589057922,
0.022336218506097794,
0.01780087500810623,
-0.044981084764003754,
-0.05459042266011238,
0.032110098749399185,
-0.022974027320742607,
-0.010163158178329468,
-0.03885438293218613,
-0.07015778869390488,
-0.018130742013454437,
-0.15929651260375977,
-0.014899281784892082,
-0.04085385054349899,
0.007158880587667227,
0.02551902085542679,
-0.03834335505962372,
0.007963370531797409,
0.012195355258882046,
-0.07085035741329193,
-0.061454467475414276,
-0.022903166711330414,
0.09224231541156769,
-0.16436699032783508,
0.025155464187264442,
0.08285263180732727,
-0.12099926173686981,
0.09775067120790482,
0.021939631551504135,
0.0031351554207503796,
0.028338242322206497,
-0.1542527824640274,
0.04096807911992073,
-0.024365095421671867,
0.01272035762667656,
0.04409142583608627,
-0.22033950686454773,
0.001463581225834787,
-0.03818526118993759,
-0.05954346805810928,
-0.010227864608168602,
-0.033079732209444046,
-0.11291328817605972,
0.09883669763803482,
0.008058897219598293,
-0.08219768106937408,
-0.030809206888079643,
0.03451729565858841,
0.08243680745363235,
-0.02608415111899376,
0.15152283012866974,
0.0016822130419313908,
0.07172226905822754,
-0.17519205808639526,
-0.021702464669942856,
-0.011611736379563808,
0.02207101881504059,
-0.014536668546497822,
-0.015496513806283474,
0.042471300810575485,
-0.02421419881284237,
0.19108575582504272,
-0.026401294395327568,
0.038726791739463806,
0.06405707448720932,
0.01593620702624321,
-0.014801506884396076,
0.10957890748977661,
0.05975057929754257,
0.02399693801999092,
0.022115202620625496,
0.007329683285206556,
-0.039842452853918076,
-0.014149460941553116,
-0.19538825750350952,
0.06474217027425766,
0.1377464383840561,
0.08781574666500092,
-0.01322576031088829,
0.07683692127466202,
-0.10024392604827881,
-0.12397097796201706,
0.11215250939130783,
-0.06283260136842728,
-0.007701667957007885,
-0.06531554460525513,
0.13346771895885468,
0.14944057166576385,
-0.18992236256599426,
0.06835456937551498,
-0.06228158622980118,
-0.05332518368959427,
-0.11744599789381027,
-0.1957325041294098,
-0.055616896599531174,
-0.056456826627254486,
-0.014700124971568584,
-0.048795297741889954,
0.07307228446006775,
0.05693497136235237,
0.012962869368493557,
0.003600025549530983,
0.0766802653670311,
-0.015357231721282005,
0.0008028073934838176,
0.03077360987663269,
0.06600049883127213,
0.013312965631484985,
-0.02929985709488392,
0.020537450909614563,
-0.007275243755429983,
0.04005419462919235,
0.06378308683633804,
0.038119763135910034,
-0.02801438421010971,
0.01591232419013977,
-0.03770609200000763,
-0.10940317064523697,
0.0409080907702446,
-0.028551526367664337,
-0.08112191408872604,
0.13721226155757904,
0.02428387477993965,
0.005870606284588575,
-0.02180131897330284,
0.24582624435424805,
-0.07231455296278,
-0.09001907706260681,
-0.1473579704761505,
0.10211005061864853,
-0.04095151647925377,
0.06560079753398895,
0.04110138490796089,
-0.10732010751962662,
0.013498948886990547,
0.12688814103603363,
0.15896959602832794,
-0.044884394854307175,
0.020156091079115868,
0.03252736106514931,
0.003683826420456171,
-0.04006262496113777,
0.05253688618540764,
0.0694650411605835,
0.14883354306221008,
-0.04907030612230301,
0.08928520232439041,
0.005485867150127888,
-0.10256236046552658,
-0.03822692111134529,
0.11808354407548904,
-0.017866896465420723,
0.018703164532780647,
-0.057248231023550034,
0.11889533698558807,
-0.059861693531274796,
-0.23005777597427368,
0.06317704170942307,
-0.0720362737774849,
-0.14286935329437256,
-0.021647587418556213,
0.07456772774457932,
-0.017636949196457863,
0.02658887766301632,
0.07326807081699371,
-0.07681973278522491,
0.19899281859397888,
0.038975972682237625,
-0.05729197710752487,
-0.05658522993326187,
0.0789351835846901,
-0.114089734852314,
0.2792985737323761,
0.01164181251078844,
0.04984506592154503,
0.10365619510412216,
-0.016686614602804184,
-0.13768579065799713,
0.015234606340527534,
0.09244892746210098,
-0.09004336595535278,
0.03869183734059334,
0.2132277488708496,
-0.002569539239630103,
0.1152428612112999,
0.07714667171239853,
-0.07265080511569977,
0.04592108353972435,
-0.1130065843462944,
-0.0718315914273262,
-0.086885966360569,
0.09441597014665604,
-0.07240451127290726,
0.14123490452766418,
0.12318195402622223,
-0.053516924381256104,
0.010368985123932362,
-0.031209774315357208,
0.04651070013642311,
0.007842876948416233,
0.10365527868270874,
0.010769560933113098,
-0.18099099397659302,
0.022656621411442757,
0.018202748149633408,
0.10856854915618896,
-0.17241089046001434,
-0.09672945737838745,
0.04725200682878494,
0.001958663808181882,
-0.059874359518289566,
0.1282012164592743,
0.057909298688173294,
0.04923510178923607,
-0.043742597103118896,
-0.017267800867557526,
-0.009560109116137028,
0.13584671914577484,
-0.10737434774637222,
-0.0021453071385622025
] |
null | null | null |
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn how to use this model and train your own, check out Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
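The card itself ships no usage snippet. As a hedged sketch, assuming the agent follows the small policy-gradient setup taught in Unit 4 (a PyTorch policy network acting in a Gymnasium environment), evaluating a REINFORCE-style policy on CartPole-v1 could look like the code below. The `Policy` architecture, hidden size, and checkpoint filename are assumptions for illustration, not details read from this repository.

```python
# Hedged sketch: greedy evaluation of a REINFORCE-style policy on CartPole-v1.
# The network layout and the (commented-out) checkpoint name are assumptions.
import gymnasium as gym
import torch
import torch.nn as nn
import torch.nn.functional as F

class Policy(nn.Module):
    def __init__(self, state_size=4, hidden_size=16, action_size=2):
        super().__init__()
        self.fc1 = nn.Linear(state_size, hidden_size)
        self.fc2 = nn.Linear(hidden_size, action_size)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        return F.softmax(self.fc2(x), dim=1)  # action probabilities

env = gym.make("CartPole-v1")
policy = Policy()
# policy.load_state_dict(torch.load("reinforce_cartpole.pt"))  # hypothetical checkpoint

state, _ = env.reset()
done, episode_return = False, 0.0
while not done:
    probs = policy(torch.from_numpy(state).float().unsqueeze(0))
    action = torch.argmax(probs, dim=1).item()  # greedy action for evaluation
    state, reward, terminated, truncated, _ = env.step(action)
    episode_return += reward
    done = terminated or truncated
print(f"Episode return: {episode_return}")
```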
| {"tags": ["CartPole-v1", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class"], "model-index": [{"name": "Reinforce-CartPole-v1", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "CartPole-v1", "type": "CartPole-v1"}, "metrics": [{"type": "mean_reward", "value": "458.90 +/- 59.57", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | arekpaterak/Reinforce-CartPole-v1 | [
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | 2024-02-12T11:58:00+00:00 | [] | [] | TAGS
#CartPole-v1 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us
|
# Reinforce Agent playing CartPole-v1
This is a trained model of a Reinforce agent playing CartPole-v1 .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL
| [
"# Reinforce Agent playing CartPole-v1\n This is a trained model of a Reinforce agent playing CartPole-v1 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
"TAGS\n#CartPole-v1 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n",
"# Reinforce Agent playing CartPole-v1\n This is a trained model of a Reinforce agent playing CartPole-v1 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
39,
54
] | [
"passage: TAGS\n#CartPole-v1 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n# Reinforce Agent playing CartPole-v1\n This is a trained model of a Reinforce agent playing CartPole-v1 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
0.007526164408773184,
-0.12498430907726288,
-0.0013541718944907188,
0.09601131081581116,
0.11848696321249008,
-0.04186001420021057,
0.11405468732118607,
0.05624859035015106,
0.09539441019296646,
0.04239490255713463,
0.13636724650859833,
0.06906966865062714,
-0.004102868959307671,
0.12412862479686737,
0.09840741008520126,
-0.26058563590049744,
0.07420794665813446,
-0.04403980076313019,
-0.009944677352905273,
0.10139261186122894,
0.07836852967739105,
-0.08325441926717758,
0.051592715084552765,
0.00009572553972247988,
-0.044259943068027496,
0.0321260429918766,
0.013628939166665077,
-0.053157225251197815,
0.1606452465057373,
-0.07313758134841919,
0.10494591295719147,
-0.03843724727630615,
0.14574295282363892,
-0.1126825287938118,
0.04758213832974434,
0.05111503228545189,
-0.04548581689596176,
0.03848232328891754,
-0.12538743019104004,
-0.06033875793218613,
0.026815801858901978,
-0.015865681692957878,
0.12249194830656052,
0.03647647053003311,
-0.1777559220790863,
-0.13461355865001678,
-0.0165896974503994,
0.12325166910886765,
0.1627800315618515,
0.00512364786118269,
0.014270431362092495,
0.16791965067386627,
-0.1761058121919632,
0.025937072932720184,
0.11400806158781052,
-0.37275227904319763,
-0.00034436015994288027,
0.2240462601184845,
0.06164427846670151,
0.1252165287733078,
-0.12646614015102386,
0.010440526530146599,
0.07403992861509323,
0.04368630796670914,
0.049784936010837555,
-0.015430688858032227,
-0.12260042130947113,
0.08455035835504532,
-0.1383819431066513,
-0.058066487312316895,
0.1495426446199417,
-0.019741326570510864,
-0.009476418606936932,
-0.016515808179974556,
-0.009238536469638348,
-0.050979889929294586,
-0.03430935740470886,
-0.11778499186038971,
0.10755524039268494,
0.04975730925798416,
0.0038771627005189657,
-0.04602450504899025,
-0.05612579360604286,
-0.09815777093172073,
-0.03123871050775051,
0.0372777059674263,
-0.013706400990486145,
0.01091629359871149,
0.027692900970578194,
0.09935613721609116,
-0.13446329534053802,
0.01825822703540325,
-0.028096558526158333,
-0.028040969744324684,
-0.1316804438829422,
-0.11984307318925858,
-0.026084421202540398,
0.004223645199090242,
0.03029833547770977,
0.20433813333511353,
0.020139509811997414,
0.059011414647102356,
-0.0022708347532898188,
0.09776382148265839,
0.029780851677060127,
0.13517548143863678,
-0.04466623440384865,
0.19488364458084106,
0.07711011171340942,
0.05364556983113289,
0.03204274922609329,
-0.05344729498028755,
-0.19369827210903168,
0.04861246794462204,
0.06659778952598572,
0.08274952322244644,
-0.1178959533572197,
0.0059632807970047,
-0.10316018015146255,
0.0028950648847967386,
-0.10474003106355667,
-0.0642905905842781,
-0.02892979420721531,
0.031841445714235306,
-0.10535725951194763,
0.028785312548279762,
0.025052599608898163,
0.04140377417206764,
0.0676041767001152,
-0.12253966927528381,
-0.07404746115207672,
-0.021733485162258148,
-0.12817098200321198,
-0.09923440217971802,
0.08802318572998047,
-0.026199497282505035,
-0.005110981408506632,
-0.1253623217344284,
-0.2661486268043518,
-0.05670225992798805,
0.06396034359931946,
-0.03231031447649002,
-0.08589376509189606,
-0.1633463054895401,
0.026403428986668587,
-0.07700273394584656,
0.05221332609653473,
0.04776721075177193,
-0.03665859252214432,
0.02023705095052719,
-0.07958202809095383,
0.12739010155200958,
0.049698662012815475,
0.00541001046076417,
-0.09916839748620987,
0.07882837951183319,
-0.3034103214740753,
-0.02581131085753441,
-0.15228183567523956,
0.0772043839097023,
-0.07893010973930359,
0.01308529730886221,
0.05044940114021301,
0.043790437281131744,
-0.016942394897341728,
0.16269747912883759,
-0.17043575644493103,
-0.05301272124052048,
0.026445282623171806,
-0.09261117875576019,
-0.09916394203901291,
0.07275339215993881,
-0.06339669227600098,
0.21263530850410461,
0.08751397579908371,
0.17006252706050873,
-0.011036526411771774,
-0.16256992518901825,
0.1207515075802803,
0.07522942125797272,
-0.1639646589756012,
0.004287737421691418,
0.061784300953149796,
-0.0016935690073296428,
0.02746843732893467,
-0.01872866041958332,
-0.07289361208677292,
0.06302516162395477,
-0.07825060933828354,
0.022581040859222412,
0.06258945167064667,
-0.09531243145465851,
0.23986859619617462,
-0.005434412509202957,
0.0862451046705246,
-0.025957979261875153,
-0.09802921861410141,
0.00908072479069233,
0.07164718210697174,
-0.0014321404742076993,
0.01703714393079281,
-0.14553219079971313,
0.23044352233409882,
-0.07965081930160522,
0.011176814325153828,
-0.11607582122087479,
-0.1256982982158661,
0.011873425915837288,
0.13336114585399628,
0.059921663254499435,
0.16569606959819794,
0.09518871456384659,
-0.032197169959545135,
0.017584815621376038,
-0.0023385772947221994,
-0.09040450304746628,
0.01580043137073517,
-0.0021571461111307144,
-0.12167251110076904,
-0.07353103160858154,
-0.08134473115205765,
0.12585052847862244,
-0.20988115668296814,
0.015492538921535015,
0.04099845886230469,
0.008103687316179276,
0.04467369243502617,
0.023746047168970108,
-0.013269703835248947,
-0.00007021807687124237,
0.03244573250412941,
-0.10098352283239365,
0.12937165796756744,
0.013381263241171837,
0.014676140621304512,
-0.006365173030644655,
-0.05572463944554329,
0.03720450773835182,
0.040439579635858536,
-0.11237845569849014,
-0.11330515146255493,
-0.009658765979111195,
-0.0015364213613793254,
0.02637762948870659,
-0.022321155294775963,
0.052120618522167206,
0.27587956190109253,
0.05387469753623009,
0.10401033610105515,
-0.05769326910376549,
0.015315087512135506,
-0.015322818420827389,
-0.07135670632123947,
0.06358719617128372,
0.025013601407408714,
0.08050397783517838,
-0.03531401976943016,
0.03759452700614929,
0.1675453782081604,
-0.015888912603259087,
0.11127935349941254,
-0.06545067578554153,
-0.03844274953007698,
-0.043109722435474396,
0.05627678707242012,
0.015021559782326221,
0.04564907029271126,
0.0000015355876712419558,
-0.08444724231958389,
-0.03503387048840523,
-0.03988509997725487,
-0.010637006722390652,
-0.12273643165826797,
-0.00499896751716733,
0.01265440508723259,
-0.021940499544143677,
0.04488934203982353,
0.07375624030828476,
-0.04849626496434212,
0.025821007788181305,
0.06070821359753609,
-0.10193055868148804,
0.08957115560770035,
0.015067169442772865,
-0.06946801394224167,
0.13769419491291046,
-0.07484805583953857,
-0.045293889939785004,
-0.1025395318865776,
-0.1568877100944519,
0.09384927153587341,
0.06704871356487274,
-0.05427970737218857,
-0.1503879576921463,
-0.0016851738328114152,
-0.008973666466772556,
0.09206123650074005,
-0.006399387493729591,
-0.12621140480041504,
0.01989075168967247,
0.08295059949159622,
-0.05633419007062912,
-0.09804849326610565,
-0.0075809285044670105,
-0.05280788615345955,
-0.17707788944244385,
-0.03888550028204918,
-0.06398582458496094,
-0.06734282523393631,
0.23586803674697876,
0.02017230913043022,
0.08274748176336288,
-0.044721852988004684,
0.04250151664018631,
-0.012231717817485332,
0.0006326579605229199,
0.10689259320497513,
-0.09043551236391068,
-0.017900818958878517,
-0.001320177922025323,
-0.024820495396852493,
-0.07327181100845337,
0.029733488336205482,
-0.04272191599011421,
-0.08249637484550476,
-0.1415451467037201,
-0.04993678629398346,
-0.011005163192749023,
0.10754310339689255,
0.07337497919797897,
0.0048001972027122974,
-0.11733713001012802,
0.062058478593826294,
0.13692134618759155,
0.031207585707306862,
0.004062763415277004,
0.028157465159893036,
0.14977529644966125,
-0.10706274956464767,
-0.022463621571660042,
-0.038119975477457047,
-0.054863203316926956,
0.004114252515137196,
0.016883620992302895,
0.08840765058994293,
0.1410384476184845,
0.11468084901571274,
0.047563645988702774,
0.0464191697537899,
0.06561273336410522,
0.1694946140050888,
0.059157438576221466,
-0.10448314249515533,
-0.044678982347249985,
-0.0040070898830890656,
-0.10903503000736237,
0.057307638227939606,
0.16030821204185486,
0.06326017528772354,
-0.14463356137275696,
0.021787412464618683,
-0.038982175290584564,
0.13649246096611023,
0.020638149231672287,
-0.2677258849143982,
-0.008139112964272499,
0.023630544543266296,
-0.0010347915813326836,
-0.012379839085042477,
0.10821118950843811,
-0.040134772658348083,
-0.233198344707489,
-0.12299054861068726,
0.010077533312141895,
0.031144635751843452,
-0.1509784311056137,
0.015542911365628242,
-0.14036494493484497,
0.08027976751327515,
-0.007007129956036806,
0.07418135553598404,
-0.025149788707494736,
0.15060245990753174,
-0.028731435537338257,
0.01628703810274601,
-0.07902143895626068,
-0.047717493027448654,
0.09898673743009567,
-0.0046631391160190105,
0.1931537538766861,
0.005480166990309954,
-0.023713182657957077,
-0.12098433077335358,
-0.05229806900024414,
-0.04967813938856125,
0.010598190128803253,
-0.05373382940888405,
0.0765683576464653,
-0.02441473677754402,
-0.0039579677395522594,
-0.010900177992880344,
0.08942947536706924,
-0.05291692912578583,
0.03636563941836357,
-0.11246588081121445,
-0.05034820735454559,
0.14550213515758514,
-0.09163831174373627,
-0.10174685716629028,
-0.16205860674381256,
0.14137998223304749,
0.15070600807666779,
0.058216437697410583,
-0.04001476243138313,
0.03867831453680992,
-0.019183965399861336,
-0.024241572245955467,
0.07880574464797974,
0.009653856977820396,
0.1324782371520996,
-0.08983246237039566,
0.014327390119433403,
0.14589735865592957,
-0.05275948345661163,
0.016191845759749413,
-0.02304735779762268,
0.12202176451683044,
0.04650457948446274,
0.06189403310418129,
0.018547222018241882,
0.06655703485012054,
0.06466961652040482,
-0.02262885868549347,
0.08456692099571228,
0.030712679028511047,
-0.18644161522388458,
0.058530256152153015,
-0.09805119782686234,
0.22581584751605988,
0.05066308751702309,
0.06047345697879791,
0.2993181645870209,
0.21986234188079834,
-0.05372472479939461,
0.1669820249080658,
0.044286344200372696,
-0.05891284719109535,
-0.21245966851711273,
-0.03684934973716736,
-0.030655447393655777,
0.09436552971601486,
0.15607263147830963,
-0.0981721356511116,
-0.04201313853263855,
-0.00972361396998167,
-0.032264553010463715,
0.020120708271861076,
-0.24663487076759338,
-0.01734781451523304,
0.14379777014255524,
0.10629188269376755,
0.2451348900794983,
-0.006132842972874641,
0.023609744384884834,
0.049030207097530365,
0.018605992197990417,
-0.02483358606696129,
-0.21013511717319489,
0.09079083055257797,
0.006071676965802908,
0.04935038834810257,
0.022885039448738098,
-0.006052911281585693,
0.04500092566013336,
-0.073696069419384,
0.08904470503330231,
-0.08561883866786957,
-0.08341272175312042,
0.2185351401567459,
-0.03945168852806091,
-0.00661163916811347,
0.12917985022068024,
-0.011526807211339474,
-0.1097102016210556,
-0.015364703722298145,
0.027403371408581734,
0.030678823590278625,
-0.030246863141655922,
-0.03609466925263405,
0.024012766778469086,
0.10202405601739883,
-0.04282205551862717,
0.04565315693616867,
0.10240072011947632,
-0.020902957767248154,
0.15945613384246826,
0.13205459713935852,
0.10420060157775879,
0.002927543595433235,
-0.06464727967977524,
0.014349685050547123,
-0.055471502244472504,
0.02962767891585827,
-0.17038846015930176,
-0.0070191239938139915,
0.055695805698633194,
0.04772466421127319,
0.0945243164896965,
0.11333164572715759,
-0.127106174826622,
0.0300484336912632,
0.028996523469686508,
-0.06286120414733887,
-0.06029998138546944,
-0.002275418024510145,
-0.016458535566926003,
-0.008173024281859398,
-0.09947093576192856,
0.07884971052408218,
-0.10555081814527512,
-0.03306307643651962,
0.05025126785039902,
-0.0607193186879158,
-0.12852220237255096,
-0.010904680006206036,
0.1252979338169098,
0.061709314584732056,
-0.05078592896461487,
0.14939077198505402,
0.06109785661101341,
-0.08055379986763,
0.037185851484537125,
0.027442200109362602,
-0.08008874952793121,
-0.10198270529508591,
-0.0004569833690766245,
0.31761088967323303,
0.06076094135642052,
-0.0329466350376606,
-0.11946453154087067,
-0.15002015233039856,
0.04840146750211716,
0.1035679280757904,
0.12359631806612015,
0.011757869273424149,
-0.05322748050093651,
0.02236519381403923,
-0.05275069922208786,
0.03814244270324707,
0.06910209357738495,
-0.03928454965353012,
-0.13761694729328156,
0.0077122850343585014,
0.026647454127669334,
0.10174071043729782,
-0.06771174818277359,
-0.09184598177671432,
-0.18085066974163055,
0.09208621084690094,
-0.03432070091366768,
-0.10890032351016998,
0.027215104550123215,
-0.017406610772013664,
0.014248576015233994,
0.07639352232217789,
-0.047281619161367416,
0.01244808267802,
-0.1517520695924759,
0.07082249224185944,
0.05706808716058731,
0.08926787972450256,
0.000014311663107946515,
-0.054843269288539886,
0.07618319988250732,
-0.05763502046465874,
0.06680037826299667,
-0.053477559238672256,
0.005539732985198498,
0.10781200975179672,
-0.23264040052890778,
-0.021164139732718468,
0.009476077742874622,
-0.04681631922721863,
0.08765807747840881,
-0.19047698378562927,
0.024190550670027733,
-0.08897756040096283,
-0.024605726823210716,
0.01802127994596958,
-0.1086471825838089,
-0.04306677728891373,
0.08475461602210999,
0.037119291722774506,
-0.031288959085941315,
-0.04612116143107414,
-0.019314980134367943,
-0.0914498046040535,
0.053634315729141235,
0.07442525774240494,
-0.0687926784157753,
0.08314394950866699,
-0.05507456883788109,
0.00841207429766655,
-0.052043743431568146,
0.06760627031326294,
-0.012366239912807941,
-0.12672528624534607,
-0.02123171091079712,
-0.044928714632987976,
0.11662110686302185,
-0.023402327671647072,
0.022080281749367714,
0.014599837362766266,
0.0323631577193737,
-0.012065601535141468,
0.05028461292386055,
0.1019197478890419,
0.05136820673942566,
0.014879679307341576,
0.02292765863239765,
0.055746350437402725,
0.0757644772529602,
-0.1134679913520813,
0.06457309424877167,
-0.02098844014108181,
-0.08620109409093857,
0.1013324111700058,
0.06909440457820892,
0.037490107119083405,
0.15593400597572327,
0.22674402594566345,
0.10539932548999786,
-0.03564648702740669,
-0.03126971051096916,
0.12967991828918457,
0.17799612879753113,
-0.07682197540998459,
0.015780627727508545,
-0.0020607721526175737,
-0.017265556380152702,
-0.09849067777395248,
-0.13722245395183563,
-0.060460351407527924,
-0.2453264594078064,
0.1078341007232666,
-0.03288164362311363,
-0.04169659689068794,
0.128489688038826,
0.027952738106250763,
0.03724630922079086,
0.08183616399765015,
-0.12909026443958282,
-0.013460557907819748,
0.07749562710523605,
-0.08914026618003845,
-0.033571500331163406,
-0.17521262168884277,
-0.06771576404571533,
-0.08741120994091034,
-0.15989220142364502,
-0.06844990700483322,
0.029948782175779343,
0.035394806414842606,
0.010386589914560318,
-0.039711855351924896,
-0.01962728053331375,
0.011063394136726856,
-0.0025537724141031504,
-0.04985455423593521,
-0.01753084547817707,
0.021317757666110992,
-0.11333847790956497,
-0.024336790665984154,
0.16320326924324036,
-0.03297848999500275,
-0.18396754562854767,
-0.0405106395483017,
0.2157316505908966,
0.025046708062291145,
0.0590171180665493,
-0.073721744120121,
-0.016323629766702652,
0.021523483097553253,
0.20813441276550293,
0.10171995311975479,
-0.10821312665939331,
0.015457749366760254,
-0.03655189648270607,
0.0013793212128803134,
-0.061893612146377563,
0.10775819420814514,
0.06519263982772827,
-0.07549984753131866,
-0.17567221820354462,
-0.04389495030045509,
-0.08628730475902557,
0.03370477631688118,
-0.14383791387081146,
-0.03786516562104225,
0.1168690100312233,
0.004516853019595146,
-0.053927481174468994,
0.07883694022893906,
-0.17713546752929688,
0.03441957011818886,
-0.04880853369832039,
-0.13215437531471252,
-0.09491758048534393,
-0.10123858600854874,
0.0027463934384286404,
0.08913854509592056,
0.15567956864833832,
-0.06151591241359711,
-0.07471925020217896,
-0.009579092264175415,
-0.028091613203287125,
-0.052700337022542953,
-0.07900123298168182,
0.059512585401535034,
0.0007560851518064737,
0.16147300601005554,
-0.07439453154802322,
0.09558981657028198,
0.09099138528108597,
-0.021246420219540596,
-0.00915549136698246,
0.032866667956113815,
-0.003863809397444129,
-0.07436864078044891,
-0.04970616102218628,
0.02312966249883175,
0.027639856562018394,
0.10846075415611267,
-0.030836544930934906,
-0.1934703141450882,
0.11230092495679855,
0.09140218049287796,
-0.04296138137578964,
-0.046487610787153244,
0.05351927503943443,
-0.07097935676574707,
0.1252279132604599,
0.03444884717464447,
-0.02163051813840866,
0.013762647286057472,
-0.06370721012353897,
0.08370721340179443,
0.11594565212726593,
-0.048265840858221054,
-0.08278503268957138,
-0.06164652109146118,
0.012770666740834713,
0.02961382456123829,
-0.13650155067443848,
-0.21160630881786346,
-0.10802312940359116,
-0.1383298933506012,
0.004740108735859394,
-0.04703504592180252,
0.08498300611972809,
0.12991970777511597,
0.09780163317918777,
-0.011416295543313026,
-0.004867587238550186,
0.018085451796650887,
0.13192623853683472,
-0.11232008039951324,
-0.08192373812198639
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
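As a rough sketch only (the section above is still a placeholder): assuming this repository stores a PEFT prompt-tuning adapter for a BLOOM base model, which the repository name suggests but the card does not confirm, loading could look like this. The prompt text and generation settings are purely illustrative.
```python
# Hedged sketch: assumes a PEFT prompt-tuning adapter on top of a BLOOM base model.
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter_id = "lourenswal/bloom_prompt_tuning_1707739144.36424"

peft_config = PeftConfig.from_pretrained(adapter_id)              # adapter metadata
base_model = AutoModelForCausalLM.from_pretrained(peft_config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)
model = PeftModel.from_pretrained(base_model, adapter_id)         # attaches the learned soft prompt

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```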
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | lourenswal/bloom_prompt_tuning_1707739144.36424 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T11:59:05+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | timm | # Model card for DERETFound_AMD_AREDS
DERETFound - Expertise-informed Generative AI Enables Ultra-High Data Efficiency for Building Generalist Medical Foundation Model
A model fine-tuned on the AREDS dataset, based on the DERETFound model for OCT images.
See the paper here:
https://www.researchgate.net/publication/377358144_Expertise-informed_Generative_AI_Enables_Ultra-High_Data_Efficiency_for_Building_Generalist_Medical_Foundation_Model
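The card does not include a usage snippet. Since the repository is tagged as a timm image-classification model, a minimal inference sketch might look like the following; the input path, preprocessing, and class labels are assumptions, not documented here.
```python
# Hedged sketch: assumes the checkpoint loads as a standard timm classification model.
import timm
import torch
from PIL import Image

model = timm.create_model("hf-hub:bitfount/DERETFound_AMD_AREDS", pretrained=True)
model.eval()

config = timm.data.resolve_data_config({}, model=model)    # preprocessing the model expects
transform = timm.data.create_transform(**config)

image = Image.open("example_scan.jpg").convert("RGB")       # placeholder input image
with torch.no_grad():
    logits = model(transform(image).unsqueeze(0))
print(logits.softmax(dim=-1))                                # class probabilities (labels undocumented)
```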
| {"license": "apache-2.0", "library_name": "timm", "tags": ["image-classification", "timm"]} | image-classification | bitfount/DERETFound_AMD_AREDS | [
"timm",
"pytorch",
"image-classification",
"license:apache-2.0",
"region:us"
] | 2024-02-12T11:59:53+00:00 | [] | [] | TAGS
#timm #pytorch #image-classification #license-apache-2.0 #region-us
| # Model card for DERETFound_AMD_AREDS
DERETFound - Expertise-informed Generative AI Enables Ultra-High Data Efficiency for Building Generalist Medical Foundation Model
A model fine-tuned on the AREDS dataset, based on the DERETFound model for OCT images.
See the paper here:
URL
| [
"# Model card for DERETFound_AMD_AREDS\nDERETFound - Expertise-informed Generative Model Enables Ultra-High Data Efficiency for Building Generalist Medical Foundation Model\n\nA model fine-tuned on the AREDS dataset DERETFound model for OCT images.\n\nSee the paper here:\nURL"
] | [
"TAGS\n#timm #pytorch #image-classification #license-apache-2.0 #region-us \n",
"# Model card for DERETFound_AMD_AREDS\nDERETFound - Expertise-informed Generative Model Enables Ultra-High Data Efficiency for Building Generalist Medical Foundation Model\n\nA model fine-tuned on the AREDS dataset DERETFound model for OCT images.\n\nSee the paper here:\nURL"
] | [
26,
73
] | [
"passage: TAGS\n#timm #pytorch #image-classification #license-apache-2.0 #region-us \n# Model card for DERETFound_AMD_AREDS\nDERETFound - Expertise-informed Generative Model Enables Ultra-High Data Efficiency for Building Generalist Medical Foundation Model\n\nA model fine-tuned on the AREDS dataset DERETFound model for OCT images.\n\nSee the paper here:\nURL"
] | [
-0.10108834505081177,
0.1350446492433548,
-0.004872956313192844,
0.021050525829195976,
0.07043695449829102,
0.04927537962794304,
0.08202826231718063,
0.09746013581752777,
-0.0033509256318211555,
0.004530325531959534,
0.10893746465444565,
0.025691235437989235,
0.01057365257292986,
0.07739411294460297,
-0.004360864404588938,
-0.17424531280994415,
0.044450029730796814,
0.12118005007505417,
0.007612985093146563,
0.012507883831858635,
0.09460052847862244,
-0.05256194621324539,
0.07061770558357239,
0.05410092696547508,
-0.05919298157095909,
0.028172558173537254,
-0.020562298595905304,
-0.023867731913924217,
0.06124061346054077,
-0.024104131385684013,
0.012441825121641159,
0.021146437153220177,
0.08089786022901535,
0.01343977078795433,
0.041601281613111496,
-0.016976434737443924,
-0.0788307934999466,
0.10449817776679993,
0.02727300301194191,
-0.06105591356754303,
0.1733671873807907,
-0.03644626960158348,
-0.0055647906847298145,
-0.02603803761303425,
-0.06762382388114929,
-0.20534810423851013,
0.006964376661926508,
0.05149121955037117,
0.058248963207006454,
0.04251428321003914,
0.03370952233672142,
0.01892811991274357,
-0.041499022394418716,
0.027877092361450195,
-0.06330139935016632,
-0.07560832053422928,
-0.051226213574409485,
0.2063477486371994,
-0.06339916586875916,
-0.001095872139558196,
0.0645865947008133,
0.1090748980641365,
0.0344286784529686,
0.01744706928730011,
0.13490423560142517,
-0.02597643993794918,
0.0025544497184455395,
0.012183409184217453,
-0.07024914771318436,
-0.04143369197845459,
0.21445602178573608,
0.008286450058221817,
0.01612144708633423,
-0.013175714761018753,
-0.02621477097272873,
0.12159091234207153,
-0.09467434138059616,
0.020572546869516373,
-0.009122860617935658,
0.007770242169499397,
0.022896163165569305,
-0.02015954628586769,
-0.07177145779132843,
-0.06009596586227417,
-0.13834494352340698,
-0.10732199251651764,
-0.0017414744943380356,
0.09799253940582275,
-0.11967502534389496,
0.09209573268890381,
-0.12589778006076813,
-0.0545111820101738,
-0.0020726078655570745,
-0.06540541350841522,
-0.010049638338387012,
-0.031369682401418686,
-0.016943853348493576,
-0.0392773412168026,
0.14717639982700348,
0.07485369592905045,
0.058985430747270584,
-0.022467199712991714,
0.012024478986859322,
0.007677546236664057,
0.089607834815979,
-0.017511805519461632,
-0.14061510562896729,
-0.010863837786018848,
-0.017350034788250923,
0.03138074651360512,
-0.0024219206534326077,
0.0437130331993103,
-0.049825020134449005,
-0.0017616207478567958,
0.013859910890460014,
-0.04005943238735199,
-0.101980060338974,
0.039607834070920944,
-0.0010830453829839826,
-0.05314379185438156,
0.19963762164115906,
0.04690868407487869,
-0.042166709899902344,
-0.025703366845846176,
-0.009475549682974815,
0.05798735097050667,
0.10522907227277756,
0.05626770108938217,
-0.008314219303429127,
0.06123487651348114,
-0.015413908287882805,
-0.01952216774225235,
-0.026501281186938286,
0.026423651725053787,
-0.032673317939043045,
-0.03018081933259964,
0.06457486003637314,
-0.14504647254943848,
-0.20514516532421112,
0.05245320871472359,
0.13108724355697632,
-0.05079274624586105,
-0.0003682641254272312,
0.05212445929646492,
0.04315468296408653,
-0.020530682057142258,
0.02493826486170292,
0.11572994291782379,
-0.03939744830131531,
0.009015493094921112,
-0.03748050332069397,
0.05869569256901741,
-0.2360955774784088,
0.04806741699576378,
-0.03356032073497772,
0.02256043814122677,
-0.06411472707986832,
-0.007762024644762278,
-0.03403841704130173,
0.005614170804619789,
-0.08210783451795578,
-0.041587986052036285,
-0.07959994673728943,
0.0064978101290762424,
0.01193937286734581,
0.06215612217783928,
-0.07913433760404587,
-0.003179119899868965,
0.03045705147087574,
-0.06491357088088989,
-0.12291865795850754,
0.04736430570483208,
-0.05672014132142067,
0.08476752042770386,
0.05669993907213211,
0.07471304386854172,
0.1313418745994568,
-0.1403791606426239,
0.0383063443005085,
-0.04501017928123474,
-0.10291073471307755,
-0.1307406723499298,
0.08704394847154617,
0.07614278048276901,
-0.2118523269891739,
0.04119792953133583,
-0.07760931551456451,
0.03538941964507103,
-0.044767897576093674,
-0.0507073849439621,
-0.06110382825136185,
-0.12743304669857025,
-0.05243943631649017,
0.016219789162278175,
0.05625980719923973,
-0.034580692648887634,
0.0640869215130806,
0.0005639398004859686,
0.1223052591085434,
0.022366393357515335,
-0.10379870980978012,
-0.08196047693490982,
0.006024351809173822,
-0.15903586149215698,
0.03290056809782982,
-0.11967450380325317,
-0.07405023276805878,
-0.06490933895111084,
-0.1371755450963974,
0.011253686621785164,
-0.0328560434281826,
0.031208807602524757,
0.05155863240361214,
-0.0443447083234787,
0.0348614864051342,
0.07139276713132858,
0.042790282517671585,
-0.009041784331202507,
-0.040252625942230225,
0.05553489923477173,
-0.07324330508708954,
0.03036889247596264,
-0.08483554422855377,
-0.015363154001533985,
-0.0031634988263249397,
0.06045587360858917,
0.03600378707051277,
0.028400108218193054,
0.09668907523155212,
-0.05093088373541832,
-0.005406022537499666,
-0.036898646503686905,
0.07871712744235992,
-0.024673717096447945,
-0.011374481953680515,
0.0031011307146400213,
0.02778675965964794,
0.14237439632415771,
0.07850158959627151,
-0.026934612542390823,
0.0004432875430211425,
-0.09887349605560303,
-0.04551158845424652,
-0.0009820186533033848,
-0.14123712480068207,
-0.07499482482671738,
0.05748222395777702,
-0.03376049920916557,
0.08084822446107864,
0.04189866781234741,
0.04317152127623558,
0.01999267376959324,
0.014166251756250858,
-0.07004552334547043,
-0.018473241478204727,
0.24282827973365784,
-0.20441175997257233,
0.018058927729725838,
0.2271028608083725,
0.020554494112730026,
0.10782624036073685,
0.009080881252884865,
-0.08156665414571762,
-0.02416330575942993,
-0.09778517484664917,
0.003921256400644779,
0.2019464373588562,
-0.12852370738983154,
0.00415895925834775,
0.08183202147483826,
-0.042270272970199585,
0.07006758451461792,
-0.0733083039522171,
0.010815261863172054,
0.03294851630926132,
0.013707469217479229,
-0.04116223007440567,
0.055358752608299255,
-0.07397834211587906,
0.11868279427289963,
-0.07877585291862488,
-0.1507343202829361,
0.08585593104362488,
0.02860095165669918,
-0.011330986395478249,
0.06057804822921753,
0.020080357789993286,
-0.18502603471279144,
-0.036426715552806854,
0.02172543667256832,
-0.014778847806155682,
-0.01771061308681965,
-0.006866126321256161,
-0.07627063244581223,
-0.08104980736970901,
-0.04219641163945198,
-0.1366717368364334,
0.04419415071606636,
-0.023069407790899277,
0.029506364837288857,
-0.015923216938972473,
0.06240817531943321,
-0.08568104356527328,
-0.05748792365193367,
-0.0510142482817173,
-0.002606137888506055,
0.15407681465148926,
-0.12378131598234177,
0.11591565608978271,
0.07577870786190033,
0.028335364535450935,
0.03977245092391968,
0.008094921708106995,
0.12058181315660477,
-0.03546183928847313,
0.04486459121108055,
0.16384261846542358,
0.09398265928030014,
0.01602827198803425,
0.09891083836555481,
0.07035206258296967,
-0.11349774897098541,
0.05064457282423973,
0.057037103921175,
-0.11431337893009186,
-0.09051481634378433,
-0.10672036558389664,
-0.0541565902531147,
-0.07312951236963272,
0.09544450044631958,
0.05326305702328682,
0.10377810895442963,
0.1584293246269226,
0.07910792529582977,
0.044768109917640686,
-0.017405318096280098,
0.023183152079582214,
0.08833648264408112,
0.03184167295694351,
0.05005837231874466,
-0.07967417687177658,
-0.10574550181627274,
0.10763770341873169,
0.11820622533559799,
0.19918319582939148,
0.06914332509040833,
0.012441644445061684,
0.0432782806456089,
0.026010150089859962,
0.03339202702045441,
0.1131511703133583,
0.007286374922841787,
-0.017165301367640495,
-0.049517273902893066,
-0.01767166145145893,
0.0575902946293354,
0.011854211799800396,
-0.03967615216970444,
-0.09592282772064209,
0.033986203372478485,
0.06998468935489655,
0.06612313538789749,
0.055459942668676376,
0.05751765891909599,
-0.24423424899578094,
0.07398270070552826,
0.02545776404440403,
0.06934673339128494,
-0.1128830686211586,
0.039052270352840424,
-0.017768649384379387,
0.0016289283521473408,
0.10456082224845886,
-0.07654020935297012,
0.0898103266954422,
-0.05424756929278374,
0.055177606642246246,
0.056512702256441116,
-0.05483578145503998,
0.01260891929268837,
0.05255720391869545,
-0.2971818745136261,
0.0916801169514656,
-0.005188547074794769,
-0.005121838767081499,
-0.1474629044532776,
-0.04249090328812599,
0.14378656446933746,
0.11330953985452652,
0.1512017697095871,
0.013343053869903088,
-0.026651417836546898,
0.06943848729133606,
-0.12775400280952454,
0.05082809180021286,
-0.022850703448057175,
-0.1007351204752922,
-0.0017073149792850018,
0.044303543865680695,
-0.00004401123078423552,
-0.02643541619181633,
0.021803028881549835,
-0.1994394212961197,
-0.06137772276997566,
0.051502298563718796,
0.11055748164653778,
-0.031730715185403824,
-0.013462420552968979,
-0.04314345866441727,
-0.20139488577842712,
0.00279819848947227,
0.08348909765481949,
-0.09622669965028763,
-0.1554672122001648,
0.08004387468099594,
0.0338427871465683,
0.001768733374774456,
-0.0015058505814522505,
0.007950058206915855,
0.062374014407396317,
0.02220836654305458,
-0.1898108422756195,
0.05122524872422218,
-0.10105579346418381,
0.05094864219427109,
-0.07188500463962555,
0.002606484806165099,
-0.014364932663738728,
0.01414838619530201,
0.020206404849886894,
0.03272232785820961,
0.002359574194997549,
-0.06047966703772545,
0.18001863360404968,
-0.053858645260334015,
0.064224012196064,
0.07718273252248764,
-0.0037907923106104136,
-0.12640394270420074,
0.010714506730437279,
0.029772132635116577,
0.06649398803710938,
-0.07455450296401978,
-0.025577237829566002,
0.019390832632780075,
0.052541300654411316,
0.002434883266687393,
-0.19991619884967804,
0.008422939106822014,
0.06013435125350952,
-0.008406098932027817,
0.037657011300325394,
-0.25036540627479553,
0.19201143085956573,
0.11560267210006714,
-0.04070166125893593,
0.12085022032260895,
-0.13082176446914673,
-0.04458474740386009,
0.17311500012874603,
0.15671491622924805,
0.2822648882865906,
-0.1722131222486496,
0.008332222700119019,
0.002991106826812029,
-0.24474455416202545,
0.1885797083377838,
-0.18843187391757965,
0.025285637006163597,
-0.03373825550079346,
0.07822912931442261,
0.0004789732920471579,
-0.038167767226696014,
0.14099550247192383,
0.05000779777765274,
0.1426737755537033,
-0.07183089107275009,
-0.04056265577673912,
0.11221770197153091,
-0.035268329083919525,
0.17252101004123688,
0.010420533828437328,
0.011019911617040634,
-0.10445883870124817,
-0.03023689240217209,
0.00001825348772399593,
-0.03516165539622307,
0.04672786220908165,
-0.1099538654088974,
-0.013306516222655773,
0.06919640302658081,
-0.01834193989634514,
-0.015144732780754566,
0.0794413685798645,
-0.0018900763243436813,
-0.12359786033630371,
0.032224781811237335,
0.06061238795518875,
-0.04605277627706528,
-0.10784850269556046,
-0.08276116847991943,
-0.06002534180879593,
0.09542622417211533,
-0.12079548090696335,
0.05611187219619751,
0.12710167467594147,
-0.019959338009357452,
0.10014607757329941,
-0.009585767984390259,
0.042321085929870605,
0.035372305661439896,
0.10629913210868835,
-0.052122682332992554,
-0.07551223039627075,
-0.004767723847180605,
0.0916266143321991,
0.06682297587394714,
0.11592280864715576,
0.11121486127376556,
-0.022897426038980484,
0.02673128992319107,
-0.038436368107795715,
0.003960721660405397,
-0.03120112419128418,
0.05944579839706421,
0.03647160902619362,
-0.020885087549686432,
-0.09137305617332458,
0.05022880434989929,
0.08037394285202026,
0.01799142360687256,
-0.01580898091197014,
-0.13890127837657928,
-0.10043293982744217,
-0.10124759376049042,
-0.07074092328548431,
0.2682841718196869,
-0.12274467200040817,
-0.11543720215559006,
0.011555030941963196,
-0.14053355157375336,
0.04769646376371384,
0.047346942126750946,
0.06736234575510025,
-0.038410432636737823,
-0.09242360293865204,
-0.0626889020204544,
-0.07577817142009735,
0.0505046583712101,
-0.03593822568655014,
0.0937412902712822,
-0.09073323011398315,
-0.15823984146118164,
-0.05142975598573685,
0.057828668504953384,
-0.11044541746377945,
-0.0041547915898263454,
0.007560988422483206,
-0.0052565219812095165,
-0.3648691177368164,
0.025748465210199356,
-0.0007589291781187057,
0.000011146647011628374,
0.03869282081723213,
0.018063364550471306,
0.008411859162151814,
0.006151337176561356,
-0.07964062690734863,
0.03279333561658859,
0.06859949976205826,
0.04832098260521889,
-0.002539008390158415,
-0.08120903372764587,
0.008904081769287586,
0.0278286375105381,
0.10959132015705109,
-0.001373060280457139,
-0.04922565072774887,
-0.0570853054523468,
-0.03649687021970749,
-0.11935416609048843,
0.07014910131692886,
0.10074497759342194,
-0.015542539767920971,
-0.07702406495809555,
0.005934626329690218,
0.038920361548662186,
-0.09911037236452103,
0.003970846068114042,
0.0010430844267830253,
-0.03031894750893116,
-0.02755272015929222,
-0.09097129851579666,
0.015414787456393242,
-0.02143988385796547,
-0.03973168879747391,
0.03923819959163666,
0.07819241285324097,
0.0947299674153328,
-0.011310371570289135,
-0.0775054320693016,
-0.09067521244287491,
0.00840833317488432,
-0.025893189013004303,
-0.07585926353931427,
-0.19222508370876312,
-0.005194506607949734,
0.022015344351530075,
-0.015524211339652538,
0.2772175967693329,
-0.037737879902124405,
-0.12082842737436295,
-0.04120047017931938,
0.12533076107501984,
0.07064048200845718,
-0.0786217600107193,
0.14954929053783417,
0.026500917971134186,
-0.007036754861474037,
-0.051647547632455826,
0.05501392483711243,
0.10817182064056396,
0.04679058492183685,
0.08110145479440689,
0.03900148347020149,
0.007851704955101013,
-0.056807197630405426,
0.04284299165010452,
0.041909027844667435,
-0.004921941552311182,
-0.0010102541418746114,
0.010782686062157154,
0.022879818454384804,
0.02759484201669693,
-0.11363280564546585,
0.17610695958137512,
-0.055240027606487274,
-0.0034533541183918715,
0.024724995717406273,
-0.004804154857993126,
0.022328291088342667,
-0.19652795791625977,
-0.0731559470295906,
-0.01977468468248844,
0.012441088445484638,
-0.08184082061052322,
-0.06342485547065735,
0.15359164774417877,
0.1090528592467308,
-0.030977100133895874,
0.1068425104022026,
-0.08036547899246216,
0.07582592964172363,
0.12557744979858398,
-0.054304089397192,
-0.052725765854120255,
-0.0005340344505384564,
0.04720211401581764,
-0.029301941394805908,
0.07836552709341049,
-0.000682393612805754,
-0.008419795893132687,
0.03433169052004814,
-0.03417610004544258,
0.03291497007012367,
-0.004622556269168854,
-0.08188677579164505,
-0.04998306185007095,
-0.08671633154153824,
0.1003974974155426,
0.06245763227343559,
0.035964008420705795,
0.03642873838543892,
0.04375223442912102,
-0.006964957807213068,
-0.06216883286833763,
-0.17111806571483612,
0.05853267386555672,
-0.08576313406229019,
0.034278396517038345,
0.03276921063661575,
0.01090660598129034,
0.04570246487855911,
0.13717858493328094,
0.11615872383117676,
-0.13994254171848297,
-0.04323326796293259,
-0.02349189482629299,
0.011178234592080116,
0.03612798452377319,
0.10668228566646576,
0.024253807961940765,
0.10577207058668137,
-0.05717845633625984,
-0.07145781069993973,
-0.10331214219331741,
-0.02975328639149666,
-0.0758575052022934,
-0.030434302985668182,
0.1197015792131424,
-0.04121224954724312,
-0.026977557688951492,
0.07989685237407684,
-0.05580839887261391,
0.08735701441764832,
-0.009653056971728802,
0.044212862849235535,
-0.08814902603626251,
-0.03656701743602753,
-0.05448112264275551,
0.032415784895420074,
-0.011135491542518139,
-0.1455225944519043,
0.10658538341522217,
-0.022212965413928032,
-0.04498746246099472,
-0.10528945177793503,
0.017171181738376617,
0.050768546760082245,
0.01299330871552229,
0.17471933364868164,
0.032106366008520126,
0.03941112011671066,
0.05903363972902298,
-0.08542954176664352,
-0.11311835050582886,
0.17735131084918976,
-0.06434524059295654,
0.07653582096099854,
0.03769739344716072,
-0.025287998840212822,
0.0006956082070246339,
-0.06779806315898895,
0.05265225097537041,
0.025836488232016563,
-0.06674999743700027,
0.06991570442914963,
-0.020100077614188194,
-0.07505619525909424,
0.10329066216945648,
-0.019149350002408028,
0.07968247681856155,
0.05546658858656883,
-0.017219169065356255,
-0.007364541757851839,
-0.05464257672429085,
0.09526226669549942,
0.03672493249177933,
-0.06823329627513885,
-0.060524821281433105,
0.026955096051096916,
0.03971374034881592,
-0.10320404917001724,
-0.02124539017677307,
-0.033673759549856186,
-0.07346346229314804,
-0.1179715022444725,
0.006831474602222443,
-0.0031622382812201977,
-0.005776181351393461,
0.23137758672237396,
0.05114620178937912,
-0.050697747617959976,
0.1270836591720581,
-0.04713927581906319,
0.017685458064079285,
-0.0682072788476944,
-0.021473420783877373
] |
null | null | transformers |
# merged-dpo-binarized-NeutrixOmnibe-7B
merged-dpo-binarized-NeutrixOmnibe-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B)
* [Kukedlc/NeuTrixOmniBe-7B-model-remix](https://huggingface.co/Kukedlc/NeuTrixOmniBe-7B-model-remix)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: eren23/dpo-binarized-NeutrixOmnibe-7B
layer_range: [0, 32]
- model: Kukedlc/NeuTrixOmniBe-7B-model-remix
layer_range: [0, 32]
merge_method: slerp
base_model: eren23/dpo-binarized-NeutrixOmnibe-7B
parameters:
t:
- filter: self_attn
value: [0.2, 0.7, 0.8, 0.7, 1]
- filter: mlp
value: [0.8, 0.3, 0.2, 0.3, 0]
- value: 0.45
dtype: bfloat16
```
## 💻 Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "eren23/merged-dpo-binarized-NeutrixOmnibe-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "apache-2.0", "tags": ["merge", "mergekit", "lazymergekit", "eren23/dpo-binarized-NeutrixOmnibe-7B", "Kukedlc/NeuTrixOmniBe-7B-model-remix"], "base_model": ["eren23/dpo-binarized-NeutrixOmnibe-7B", "Kukedlc/NeuTrixOmniBe-7B-model-remix"]} | text-generation | eren23/merged-dpo-binarized-NeutrixOmnibe-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"eren23/dpo-binarized-NeutrixOmnibe-7B",
"Kukedlc/NeuTrixOmniBe-7B-model-remix",
"base_model:eren23/dpo-binarized-NeutrixOmnibe-7B",
"base_model:Kukedlc/NeuTrixOmniBe-7B-model-remix",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T12:05:18+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# merged-dpo-binarized-NeutrixOmnibe-7B
merged-dpo-binarized-NeutrixOmnibe-7B is a merge of the following models using LazyMergekit:
* eren23/dpo-binarized-NeutrixOmnibe-7B
* Kukedlc/NeuTrixOmniBe-7B-model-remix
## Configuration
## Usage
| [
"# merged-dpo-binarized-NeutrixOmnibe-7B\n\nmerged-dpo-binarized-NeutrixOmnibe-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* Kukedlc/NeuTrixOmniBe-7B-model-remix",
"## Configuration",
"## Usage"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# merged-dpo-binarized-NeutrixOmnibe-7B\n\nmerged-dpo-binarized-NeutrixOmnibe-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* Kukedlc/NeuTrixOmniBe-7B-model-remix",
"## Configuration",
"## Usage"
] | [
152,
90,
4,
3
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #eren23/dpo-binarized-NeutrixOmnibe-7B #Kukedlc/NeuTrixOmniBe-7B-model-remix #base_model-eren23/dpo-binarized-NeutrixOmnibe-7B #base_model-Kukedlc/NeuTrixOmniBe-7B-model-remix #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# merged-dpo-binarized-NeutrixOmnibe-7B\n\nmerged-dpo-binarized-NeutrixOmnibe-7B is a merge of the following models using LazyMergekit:\n* eren23/dpo-binarized-NeutrixOmnibe-7B\n* Kukedlc/NeuTrixOmniBe-7B-model-remix## Configuration## Usage"
] | [
0.006481424439698458,
0.07899288088083267,
-0.007263829931616783,
-0.012710833922028542,
0.03473547473549843,
0.05685809254646301,
0.16127575933933258,
0.1233762875199318,
0.0278469230979681,
0.06110852211713791,
0.02671724371612072,
0.12109290063381195,
0.0372239351272583,
0.09983449429273605,
-0.03625025600194931,
-0.1868487447500229,
0.07877863198518753,
0.009591524489223957,
-0.03939344733953476,
0.06967917084693909,
0.0867411345243454,
-0.023633034899830818,
0.054913390427827835,
-0.003188937436789274,
-0.058052029460668564,
0.015539632178843021,
-0.017055872827768326,
-0.03197711333632469,
0.06198666989803314,
0.058317624032497406,
0.07695756107568741,
0.05775193125009537,
-0.040374353528022766,
-0.21812818944454193,
0.0190576184540987,
0.03429816663265228,
-0.03788651153445244,
0.04089811071753502,
0.06190371513366699,
-0.030024537816643715,
0.14569191634655,
-0.14017315208911896,
0.018752509728074074,
0.033823657780885696,
-0.1076359897851944,
-0.03549046441912651,
-0.13430313766002655,
0.14420482516288757,
0.07126133888959885,
0.0420190654695034,
-0.027435261756181717,
0.11531611531972885,
-0.026668457314372063,
0.06862810999155045,
0.15271635353565216,
-0.23097571730613708,
-0.041440144181251526,
0.08353564143180847,
0.08807314187288284,
-0.015308540314435959,
0.0035231427755206823,
0.03413357213139534,
-0.0024620969779789448,
-0.017607368528842926,
-0.011770584620535374,
-0.09694493561983109,
0.09288710355758667,
-0.07532456517219543,
-0.12917004525661469,
0.01746484450995922,
0.048393599689006805,
0.03392716124653816,
0.006585234310477972,
-0.10213980823755264,
-0.07449133694171906,
0.05050879716873169,
-0.03572072461247444,
-0.07607576996088028,
0.008557146415114403,
-0.05276035517454147,
0.04415467754006386,
-0.08266539126634598,
0.007250209804624319,
0.0018801020924001932,
-0.07929742336273193,
0.1908281147480011,
0.033740002661943436,
-0.03433120623230934,
-0.01761550083756447,
0.03024129383265972,
-0.06593997031450272,
-0.11748584359884262,
0.003070191480219364,
-0.03556482866406441,
-0.0015590398106724024,
-0.028399096801877022,
-0.030041184276342392,
-0.09947369247674942,
0.12330903112888336,
0.2440124750137329,
-0.06245068833231926,
0.03557107597589493,
0.004258114844560623,
0.014917214401066303,
-0.004851342644542456,
-0.027201391756534576,
-0.12298768013715744,
-0.18808069825172424,
0.05519392713904381,
0.0798773393034935,
0.08253701776266098,
-0.0014302830677479506,
-0.060187142342329025,
-0.05682744085788727,
0.042433448135852814,
0.05756203085184097,
0.09362465888261795,
0.06700640916824341,
-0.04961354285478592,
-0.07034678012132645,
0.16327105462551117,
-0.07365216314792633,
0.02192324958741665,
-0.03195149451494217,
-0.018479129299521446,
0.031226936727762222,
0.011449120007455349,
0.045526497066020966,
-0.041740596294403076,
0.07630033791065216,
-0.07266570627689362,
-0.01682044006884098,
-0.024565069004893303,
-0.07876330614089966,
0.03399646282196045,
-0.07575514167547226,
-0.03410430625081062,
-0.10622399300336838,
-0.09350524097681046,
-0.03082968480885029,
-0.0009525084169581532,
-0.041360992938280106,
-0.016145342960953712,
-0.04692581295967102,
0.011996311135590076,
0.01952207274734974,
0.014986072666943073,
-0.04896746948361397,
0.009458719752728939,
-0.014293895103037357,
0.0036985124461352825,
0.05115381255745888,
-0.11025147885084152,
0.020513731986284256,
-0.08270788937807083,
0.11371772736310959,
-0.19224199652671814,
0.05036243423819542,
-0.09317135065793991,
-0.0042287264950573444,
-0.14077329635620117,
-0.01407076045870781,
-0.02213006094098091,
-0.01844826713204384,
0.08400284498929977,
0.10729257017374039,
-0.08811485022306442,
-0.07169122993946075,
0.10322322696447372,
-0.08157429844141006,
-0.09444136172533035,
0.07428467273712158,
0.019679974764585495,
0.026507386937737465,
0.03369010239839554,
0.21350540220737457,
0.12186647206544876,
0.00451485114172101,
-0.10934856534004211,
-0.06259728223085403,
0.0033299627248197794,
0.10997346043586731,
0.03449484333395958,
-0.0389477014541626,
-0.016721539199352264,
0.022987347096204758,
0.030386999249458313,
0.05240510031580925,
-0.011738471686840057,
-0.037601348012685776,
-0.01956787146627903,
-0.04981962963938713,
0.1548609733581543,
-0.06449571996927261,
0.001403728616423905,
-0.0488138385117054,
-0.04383828863501549,
0.11975185573101044,
0.11973012983798981,
-0.025573402643203735,
0.0383358895778656,
-0.06093525141477585,
0.10450347512960434,
-0.025966953486204147,
0.03257891908288002,
-0.10215829312801361,
-0.104578398168087,
0.008296963758766651,
-0.09692294895648956,
0.03475164622068405,
-0.09100618958473206,
0.030187997967004776,
0.0012607424287125468,
-0.0809352844953537,
-0.035502709448337555,
0.013768509961664677,
0.04164648428559303,
-0.031211182475090027,
-0.11067569255828857,
-0.03380390629172325,
-0.018189435824751854,
0.20134684443473816,
-0.07103928178548813,
0.02553475834429264,
-0.029344644397497177,
0.23129644989967346,
-0.003610900603234768,
-0.001333951367996633,
0.02111191861331463,
0.06271941214799881,
-0.016709744930267334,
-0.001713076257146895,
0.054896753281354904,
-0.03901967778801918,
-0.13733786344528198,
0.05414937809109688,
-0.09255479276180267,
-0.025862274691462517,
0.07916924357414246,
0.07778342813253403,
-0.037554528564214706,
-0.006187497638165951,
0.00626242533326149,
-0.07259540259838104,
0.09849530458450317,
-0.030031153932213783,
0.0386931449174881,
0.04903949052095413,
0.056219879537820816,
-0.03438207879662514,
-0.02173122763633728,
0.004720275290310383,
-0.0344674177467823,
-0.04807889461517334,
0.07114153355360031,
0.005949009675532579,
-0.2823849022388458,
0.0921473577618599,
0.1588006466627121,
-0.018132127821445465,
0.1542937457561493,
0.028184877708554268,
-0.01647820509970188,
-0.08235139399766922,
-0.005908608436584473,
0.047469403594732285,
-0.01202141959220171,
-0.017188692465424538,
0.0449599027633667,
0.03960219398140907,
-0.0012721805833280087,
0.042009420692920685,
-0.04362154379487038,
0.013740114867687225,
-0.03566483035683632,
-0.00958119984716177,
0.09841830283403397,
0.10021357238292694,
0.015183819457888603,
0.06352578103542328,
0.03475271537899971,
-0.029754307121038437,
0.03692733496427536,
-0.00690701138228178,
-0.040100935846567154,
0.12231207638978958,
-0.1593119204044342,
-0.22526682913303375,
-0.13993631303310394,
-0.025818008929491043,
-0.16009879112243652,
-0.03808680921792984,
0.03282355144619942,
0.019853919744491577,
-0.005536946933716536,
-0.05801672115921974,
0.08568570017814636,
0.006859902758151293,
-0.062523752450943,
-0.06591648608446121,
0.004916398320347071,
0.039165955036878586,
-0.09952691942453384,
-0.019285758957266808,
0.03068183735013008,
-0.05208780989050865,
0.06514477729797363,
-0.03763803467154503,
0.11406466364860535,
0.0896456316113472,
0.013325990177690983,
-0.07370147854089737,
-0.011085412465035915,
0.14519400894641876,
-0.08789270371198654,
0.05213211849331856,
0.11159849166870117,
-0.03169390186667442,
0.06879174709320068,
0.155056431889534,
0.01347765140235424,
-0.013596528209745884,
0.004303209483623505,
0.06179263815283775,
0.02232164517045021,
-0.18565167486667633,
-0.10624860972166061,
-0.038300707936286926,
0.05770610272884369,
0.010508309118449688,
0.03435149043798447,
0.08710943907499313,
0.04374688118696213,
-0.08495565503835678,
0.013355333358049393,
0.027513807639479637,
0.06059354543685913,
0.23399190604686737,
0.00772554986178875,
0.08904816210269928,
-0.026202000677585602,
-0.07330641895532608,
0.041525498032569885,
0.055599819868803024,
0.07913210988044739,
0.05570923164486885,
0.16257153451442719,
0.05469905957579613,
0.08802780508995056,
0.05082092434167862,
0.03737546131014824,
0.015109339728951454,
-0.002608630806207657,
0.008028481155633926,
-0.07462407648563385,
0.008883398026227951,
0.020377373322844505,
0.04059780016541481,
0.010921808890998363,
0.0007673325599171221,
-0.03392266854643822,
0.09541372209787369,
0.11852468550205231,
0.08572545647621155,
-0.1668611764907837,
-0.012491580098867416,
0.022152714431285858,
0.001529455534182489,
-0.054304223507642746,
-0.03742679953575134,
-0.02712145261466503,
-0.09305806457996368,
0.15507161617279053,
-0.05137168988585472,
0.05476291850209236,
-0.06353209912776947,
0.010232396423816681,
-0.025857647880911827,
0.09227513521909714,
-0.009478727355599403,
0.057430241256952286,
-0.22003374993801117,
0.08619443327188492,
0.021972907707095146,
-0.009616806171834469,
0.010713297873735428,
-0.0016912383725866675,
0.04269392043352127,
0.08130055665969849,
0.07124917954206467,
0.02410365454852581,
0.05491655692458153,
-0.0536041185259819,
-0.0922565683722496,
-0.03739207610487938,
0.10287529230117798,
-0.06096870079636574,
0.10849250108003616,
-0.009142052382230759,
-0.038551267236471176,
-0.023918025195598602,
0.11923201382160187,
-0.2018805891275406,
-0.1300021857023239,
0.10531812161207199,
0.036744192242622375,
0.03568039834499359,
-0.0807282105088234,
-0.02544992044568062,
-0.021402545273303986,
0.22065091133117676,
0.01579168438911438,
-0.0403987392783165,
-0.0775178074836731,
-0.07035169750452042,
0.18058879673480988,
-0.09439559280872345,
0.05635137856006622,
-0.024229703471064568,
0.02865063212811947,
-0.08837705850601196,
-0.13900262117385864,
0.06957229971885681,
-0.049465566873550415,
-0.12025603652000427,
-0.027711059898138046,
0.09968562424182892,
-0.004970150534063578,
0.020763901993632317,
0.001579047180712223,
0.057414017617702484,
0.03720782697200775,
-0.03753810375928879,
0.0042141578160226345,
0.12242883443832397,
-0.013649865053594112,
0.11309285461902618,
-0.04692031443119049,
-0.01711849868297577,
-0.10448908060789108,
-0.021000802516937256,
0.129139244556427,
0.2583800256252289,
0.006399169098585844,
0.046132899820804596,
0.08014579117298126,
-0.07659880816936493,
-0.20782704651355743,
-0.08147991448640823,
0.039983075112104416,
-0.006699507590383291,
0.027262935414910316,
-0.1377975195646286,
0.018204648047685623,
0.1502668261528015,
0.005884448066353798,
0.10326816886663437,
-0.2476445436477661,
-0.11958387494087219,
0.041141510009765625,
0.03570038825273514,
0.06465469300746918,
-0.12855903804302216,
-0.08663518726825714,
-0.0916094183921814,
-0.1665789633989334,
0.1336306482553482,
-0.05395038053393364,
0.08945540338754654,
-0.06550706177949905,
0.02034606970846653,
0.021682582795619965,
-0.001185036962851882,
0.12455824762582779,
-0.029532156884670258,
0.003983908332884312,
-0.08283063024282455,
-0.086481973528862,
0.08356781303882599,
-0.07380977272987366,
0.03078642673790455,
-0.0846424251794815,
0.014239290729165077,
-0.0344148725271225,
-0.009213733486831188,
-0.07630517333745956,
0.0769721120595932,
-0.06977827847003937,
-0.03714864328503609,
-0.041926126927137375,
0.08051609992980957,
0.020836342126131058,
0.015958931297063828,
0.20490668714046478,
-0.005734411999583244,
0.08896125108003616,
0.159647136926651,
0.05488399788737297,
0.02986622229218483,
0.011115710251033306,
0.002247728407382965,
-0.060644861310720444,
0.02894163876771927,
0.05753706768155098,
0.010812204331159592,
0.09075938910245895,
0.0045409672893583775,
0.09537707269191742,
0.015530862845480442,
-0.10972701758146286,
-0.03562226518988609,
0.09746336936950684,
-0.11938299238681793,
-0.13443611562252045,
-0.04040868207812309,
-0.05835403501987457,
-0.06386149674654007,
0.02600144036114216,
0.22407793998718262,
-0.011897863820195198,
-0.050881125032901764,
0.029961463063955307,
0.03134514391422272,
-0.09132224321365356,
0.11598562449216843,
0.028407501056790352,
0.029586689546704292,
-0.04488489776849747,
0.06723980605602264,
0.09478466957807541,
-0.07712461054325104,
0.001495574600994587,
0.09730365872383118,
-0.0802672877907753,
-0.08743579685688019,
-0.0874745175242424,
0.13652706146240234,
-0.08114998042583466,
-0.029125114902853966,
-0.07049451023340225,
-0.07446355372667313,
-0.008316350169479847,
0.0989895686507225,
0.028474632650613785,
0.020910708233714104,
0.04355313256382942,
-0.03196315839886665,
-0.045064546167850494,
0.08408930897712708,
0.009634257294237614,
0.12544648349285126,
-0.05112403631210327,
0.012470480985939503,
-0.04122103750705719,
0.010567681863904,
-0.023942554369568825,
0.0371161513030529,
-0.1412726789712906,
-0.06182045117020607,
-0.1914307028055191,
-0.04313558712601662,
-0.13979408144950867,
-0.03314583748579025,
-0.008316779509186745,
0.04966561496257782,
-0.008481177501380444,
-0.02652900293469429,
-0.025122806429862976,
-0.09947288036346436,
-0.03555105999112129,
0.0751546174287796,
-0.06586570292711258,
-0.028522904962301254,
0.03007100149989128,
-0.06177002936601639,
0.0751381367444992,
0.018235808238387108,
0.051627639681100845,
0.010430492460727692,
-0.03906274959445,
-0.04797583445906639,
-0.00419715978205204,
0.022814160212874413,
-0.00032128774910233915,
-0.15493592619895935,
-0.049380917102098465,
-0.01718912646174431,
-0.03313445299863815,
0.00584567291662097,
-0.007825164124369621,
-0.10930432379245758,
-0.04131878912448883,
-0.05672965571284294,
-0.05882732942700386,
-0.04482918232679367,
0.00299432803876698,
0.03839072957634926,
0.03958314284682274,
0.11570271849632263,
-0.052556250244379044,
0.055814675986766815,
-0.10947675257921219,
-0.016875123605132103,
-0.008219153620302677,
-0.0937754213809967,
0.09207192063331604,
-0.042458292096853256,
0.030209409072995186,
-0.002566227223724127,
0.08868733793497086,
-0.04949003458023071,
-0.06888587772846222,
0.011022319085896015,
-0.09567346423864365,
-0.054055556654930115,
0.043147217482328415,
0.16883695125579834,
0.07751820981502533,
-0.04064437747001648,
-0.07700426131486893,
0.038687996566295624,
-0.02653094381093979,
0.06567975878715515,
0.06411685794591904,
0.11247708648443222,
0.04551446810364723,
0.048212844878435135,
0.09546087682247162,
-0.03302180394530296,
0.02730276621878147,
0.006407182198017836,
0.015538147650659084,
0.06200997531414032,
0.005820737686008215,
0.07435426861047745,
0.10246776789426804,
-0.14773523807525635,
0.0741102397441864,
0.015388667583465576,
-0.05094742029905319,
-0.0540766566991806,
-0.13019070029258728,
-0.10638127475976944,
-0.10411621630191803,
-0.0021699306089431047,
-0.10987445712089539,
0.0006277553620748222,
0.00812614243477583,
0.00785371009260416,
-0.015428513288497925,
0.11694923043251038,
-0.02857966348528862,
-0.06418181210756302,
0.08229809999465942,
0.0023072545882314444,
-0.04233364388346672,
0.008101441897451878,
-0.02938859909772873,
0.0136089026927948,
0.024433724582195282,
0.024679552763700485,
0.006241311319172382,
-0.03286581113934517,
0.05629802122712135,
-0.041113242506980896,
-0.10155877470970154,
0.02268991619348526,
0.03087770566344261,
0.041804857552051544,
0.10532369464635849,
0.0573023296892643,
-0.02868845872581005,
-0.012453199364244938,
0.11222820729017258,
-0.00791923888027668,
-0.11087950319051743,
-0.03385430574417114,
0.17145180702209473,
0.025428319349884987,
0.05371563881635666,
-0.0013824492925778031,
-0.08473064750432968,
-0.008275356143712997,
0.09009461104869843,
0.22548982501029968,
-0.03630347177386284,
0.01121012307703495,
0.02695528417825699,
0.014867018908262253,
0.04091772437095642,
0.02458774298429489,
0.051759421825408936,
0.12981480360031128,
-0.028904953971505165,
0.03721142187714577,
-0.023933449760079384,
-0.046700138598680496,
-0.0949677973985672,
-0.005066905170679092,
0.03763393312692642,
-0.023931436240673065,
0.003813034389168024,
0.06878066062927246,
-0.10546865314245224,
-0.04758370667695999,
0.0011765295639634132,
-0.14386609196662903,
-0.13990364968776703,
-0.07611856609582901,
0.048159629106521606,
-0.00842998269945383,
0.07451566308736801,
0.0030264335218816996,
-0.04007803648710251,
0.09031684696674347,
-0.005068937316536903,
-0.03718426451086998,
-0.061532940715551376,
0.014102213084697723,
-0.021672973409295082,
0.04471174255013466,
-0.010125783272087574,
0.04898564890027046,
0.11287246644496918,
0.017355335876345634,
-0.0876096859574318,
0.08259587734937668,
0.03117964044213295,
-0.02815091796219349,
0.05434555187821388,
0.12141972780227661,
-0.0035049142315983772,
0.04975191876292229,
0.07830701023340225,
-0.10629161447286606,
0.0568704828619957,
0.11612004786729813,
-0.008048124611377716,
-0.0733405202627182,
0.09592842310667038,
-0.09134037047624588,
0.14956249296665192,
0.20200513303279877,
-0.04692286252975464,
0.03175673261284828,
-0.01807934045791626,
-0.019238894805312157,
0.08881688863039017,
0.11194134503602982,
-0.050627510994672775,
-0.1363867074251175,
-0.0007353169494308531,
-0.00996345840394497,
0.02624703012406826,
-0.2178886979818344,
-0.06033246964216232,
-0.07666267454624176,
-0.017267176881432533,
-0.06543910503387451,
0.08204393833875656,
0.09157898277044296,
-0.010417400859296322,
-0.016628695651888847,
-0.1976376324892044,
-0.026601115241646767,
0.084436796605587,
-0.11533164978027344,
-0.07115159928798676
] |
null | null | diffusers |
# YouTube Thumbnail Suggestion Model Card
<!-- Provide a quick summary of what the model is/does. [Optional] -->
Presenting our groundbreaking generative model, specifically engineered for YouTube content. This innovative model generates visually captivating and realistic thumbnails based on input prompts. Designed to elevate content aesthetics and viewer engagement, this tool represents a significant advancement in custom thumbnail creation for YouTube videos.
# Table of Contents
- [Model Details](#model-details)
- [Model Description](#model-description)
- [Uses](#uses)
- [Direct Use](#direct-use)
- [Downstream Use [Optional]](#downstream-use-optional)
- [Out-of-Scope Use](#out-of-scope-use)
- [Bias, Risks, and Limitations](#bias-risks-and-limitations)
- [Recommendations](#recommendations)
- [Training Details](#training-details)
- [Training Data](#training-data)
- [Training Procedure](#training-procedure)
- [Preprocessing](#preprocessing)
- [Speeds, Sizes, Times](#speeds-sizes-times)
- [Evaluation](#evaluation)
- [Testing Data, Factors & Metrics](#testing-data-factors--metrics)
- [Testing Data](#testing-data)
- [Factors](#factors)
- [Metrics](#metrics)
- [Results](#results)
- [Model Examination](#model-examination)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications [optional]](#technical-specifications-optional)
- [Model Architecture and Objective](#model-architecture-and-objective)
- [Compute Infrastructure](#compute-infrastructure)
- [Hardware](#hardware)
- [Software](#software)
- [Citation](#citation)
- [Glossary [optional]](#glossary-optional)
- [More Information [optional]](#more-information-optional)
- [Model Card Authors [optional]](#model-card-authors-optional)
- [Model Card Contact](#model-card-contact)
- [How to Get Started with the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
<!-- Provide a longer summary of what this model is/does. -->
The model is meticulously crafted for YouTube content creators. This innovative model, tailored for thumbnail generation, ingeniously suggests visually striking thumbnails based on user-input prompts. With a focus on creativity and customization, this tool empowers users to enhance their video presence by effortlessly generating eye-catching and contextually relevant thumbnails, optimizing visual appeal and audience engagement on YouTube.
- **Developed by:** MagicalAPI Co.
- **Model type:** Diffusion-based text-to-image generative model
- **Language(s):** en
- **License:** openrail++
- **Base Model:** Base model is enhanced from [SD-XL 1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0)
- **Resources for more information:** You can find out more details in our [GitHub (coming soon)]()
# Uses
This model is mainly designed for people creating videos and content for YouTube. For the best results, describe your intended thumbnail by listing the necessary elements or the scene you want, separated by commas; for example:
"The office is modern and vibrant, filled with young, diverse professionals (various descents and genders) engaged in creative work. There are brainstorming sessions happening."
## Downstream Use
The base model is designed for generating YouTube thumbnails across various use cases and categories; specifically, you can also fine-tune it with the particular images and captions needed for your own channel.
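As a rough illustration of what "your own images and captions" could look like in practice, the sketch below organizes channel thumbnails using the Hugging Face `datasets` ImageFolder convention; the folder name, file names, and the `text` caption column are illustrative assumptions, and the resulting dataset can then be fed to whichever text-to-image fine-tuning recipe (full fine-tuning or LoRA) you prefer.

```python
# Hypothetical layout (all names are assumptions):
# my_channel_thumbnails/
# ├── metadata.jsonl   # one JSON object per line, e.g. {"file_name": "ep01.png", "text": "bold red arrow, shocked face, gaming setup"}
# ├── ep01.png
# └── ep02.png
from datasets import load_dataset

# ImageFolder picks up metadata.jsonl and exposes its extra columns alongside the images.
dataset = load_dataset("imagefolder", data_dir="my_channel_thumbnails", split="train")

sample = dataset[0]
print(sample["image"].size)  # PIL image of the thumbnail
print(sample["text"])        # its caption, used as the training prompt
```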
## Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
It is crucial to emphasize that this tool should never be employed for the creation, promotion, or endorsement of abusive, violent, or pornographic materials, as its purpose is to contribute to a constructive and enriching online environment.
# Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
# Training Details
## Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
More information on training data needed
## Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
### Preprocessing
More information needed
### Speeds, Sizes, Times
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
More information needed
# Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
## Testing Data, Factors & Metrics
### Testing Data
<!-- This should link to a Data Card if possible. -->
More information needed
### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
More information needed
### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
More information needed
**APA:**
More information needed
# Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
<!-- This section provides another layer of transparency and accountability. Whose views is this model card representing? How many voices were included in its construction? Etc. -->
MagicalAPI
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
More information needed
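Until official starter code is published here, the following is a minimal inference sketch rather than an official recipe: it assumes the `diffusers` `StableDiffusionXLPipeline` class listed in this card's tags, a CUDA GPU, and illustrative generation settings (fp16, 30 steps, and a 1280x720 canvas to match YouTube's 16:9 thumbnail size).

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Repository id taken from this model card; generation settings are assumptions.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "magicalapi/YouTube_Thumbnail_Suggestion",
    torch_dtype=torch.float16,
).to("cuda")

prompt = (
    "The office is modern and vibrant, filled with young, diverse professionals "
    "(various descents and genders) engaged in creative work. "
    "There are brainstorming sessions happening."
)

image = pipe(
    prompt,
    num_inference_steps=30,
    width=1280,   # 16:9, the native YouTube thumbnail resolution
    height=720,
).images[0]
image.save("thumbnail.png")
```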
</details>
| {"language": ["en"], "license": "openrail++", "tags": ["text-to-image", "stable-diffusion", "thumbnail-generator", "youtube"]} | text-to-image | magicalapi/YouTube_Thumbnail_Suggestion | [
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"thumbnail-generator",
"youtube",
"en",
"arxiv:1910.09700",
"license:openrail++",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | 2024-02-12T12:09:01+00:00 | [
"1910.09700"
] | [
"en"
] | TAGS
#diffusers #safetensors #text-to-image #stable-diffusion #thumbnail-generator #youtube #en #arxiv-1910.09700 #license-openrail++ #endpoints_compatible #diffusers-StableDiffusionXLPipeline #region-us
|
# YouTube Thumbnail Suggestion Model Card
Presenting our groundbreaking generative model, specifically engineered for YouTube content. This innovative model generates visually captivating and realistic thumbnails based on input prompts. Designed to elevate content aesthetics and viewer engagement, this tool represents a significant advancement in custom thumbnail creation for YouTube videos.
# Table of Contents
- Model Details
- Model Description
- Uses
- Direct Use
- [Downstream Use [Optional]](#downstream-use-optional)
- Out-of-Scope Use
- Bias, Risks, and Limitations
- Recommendations
- Training Details
- Training Data
- Training Procedure
- Preprocessing
- Speeds, Sizes, Times
- Evaluation
- Testing Data, Factors & Metrics
- Testing Data
- Factors
- Metrics
- Results
- Model Examination
- Environmental Impact
- [Technical Specifications [optional]](#technical-specifications-optional)
- Model Architecture and Objective
- Compute Infrastructure
- Hardware
- Software
- Citation
- [Glossary [optional]](#glossary-optional)
- [More Information [optional]](#more-information-optional)
- [Model Card Authors [optional]](#model-card-authors-optional)
- Model Card Contact
- How to Get Started with the Model
# Model Details
## Model Description
The model is meticulously crafted for YouTube content creators. This innovative model, tailored for thumbnail generation, ingeniously suggests visually striking thumbnails based on user-input prompts. With a focus on creativity and customization, this tool empowers users to enhance their video presence by effortlessly generating eye-catching and contextually relevant thumbnails, optimizing visual appeal and audience engagement on YouTube.
- Developed by: MagicalAPI Co.
- Model type: Diffusion-based text-to-image generative model
- Language(s): en
- License: openrail++
- Base Model: Base model is enhanced from SD-XL 1.0
- Resources for more information: You can find out more details in our [GitHub (coming soon)]()
# Uses
This model is mainly designed for people creating videos and content for YouTube. For the best results, describe your intended thumbnail by listing the necessary elements or the scene you want, separated by commas; for example:
"The office is modern and vibrant, filled with young, diverse professionals (various descents and genders) engaged in creative work. There are brainstorming sessions happening."
## Downstream Use
The base model is designed for generating YouTube thumbnails across various use cases and categories; specifically, you can also fine-tune it with the particular images and captions needed for your own channel.
## Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
It is crucial to emphasize that this tool should never be employed for the creation, promotion, or endorsement of abusive, violent, or pornographic materials, as its purpose is to contribute to a constructive and enriching online environment.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
# Training Details
## Training Data
More information on training data needed
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: More information needed
- Hours used: More information needed
- Cloud Provider: More information needed
- Compute Region: More information needed
- Carbon Emitted: More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
BibTeX:
More information needed
APA:
More information needed
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
MagicalAPI
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
More information needed
</details>
| [
"# Youtube Thumbnail Suggestion Model Card\n\n\nPresenting our groundbreaking generative model, specifically engineered for YouTube content. This innovative model, generates visually captivating and realistic thumbnails based on input prompts. Designed to elevate content aesthetics and viewer engagement, this tool represents a significant advancement in custom thumbnail creation for YouTube videos.",
"# Table of Contents\n\n- Model Details\n - Model Description\n- Uses\n - Direct Use\n - [Downstream Use [Optional]](#downstream-use-optional)\n - Out-of-Scope Use\n- Bias, Risks, and Limitations\n - Recommendations\n- Training Details\n - Training Data\n - Training Procedure\n - Preprocessing\n - Speeds, Sizes, Times\n- Evaluation\n - Testing Data, Factors & Metrics\n - Testing Data\n - Factors\n - Metrics\n - Results\n- Model Examination\n- Environmental Impact\n- [Technical Specifications [optional]](#technical-specifications-optional)\n - Model Architecture and Objective\n - Compute Infrastructure\n - Hardware\n - Software\n- Citation\n- [Glossary [optional]](#glossary-optional)\n- [More Information [optional]](#more-information-optional)\n- [Model Card Authors [optional]](#model-card-authors-optional)\n- Model Card Contact\n- How to Get Started with the Model",
"# Model Details",
"## Model Description\n\n\n\n\n\nThe model is meticulously crafted for YouTube content creators. This innovative model, tailored for thumbnail generation, ingeniously suggests visually striking thumbnails based on user-input prompts. With a focus on creativity and customization, this tool empowers users to enhance their video presence by effortlessly generating eye-catching and contextually relevant thumbnails, optimizing visual appeal and audience engagement on the YouTube.\n\n- Developed by: MagicalAPI Co.\n- Model type: Diffusion-based text-to-image generative model\n- Language(s): en\n- License: openrail++\n- Base Model: Base model is enhanced from SD-XL 1.0 \n- Resources for more information: You can find out more details in our [Github (comming soon)]()",
"# Uses\n\nThis model is mainly designed for those creating videos and contents on YouTube, for better experiencing, describe your intended thumbnail by pointing on necessary elements or situation, separating by comma; for example: \n\"The office is modern and vibrant, filled with young, diverse professionals (various descents and genders) engaged in creative work. There are brainstorming sessions happening.",
"## Downstream Use\n\nThe base model is designed for generating Youtube thumbnails in various usecases and categories; specifically, you can also finetune the model with the particular images and captions needed for your own channel.",
"## Out-of-Scope Use\n\nThe model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.\n\nIt is crucial to emphasize that this tool should never be employed for the creation, promotion, or endorsement of abusive, violent, or pornographic materials, as its purpose is to contribute to a constructive and enriching online environment.",
"# Bias, Risks, and Limitations\n\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.",
"## Recommendations",
"# Training Details",
"## Training Data\n\n\n\nMore information on training data needed",
"## Training Procedure",
"### Preprocessing\n\nMore information needed",
"### Speeds, Sizes, Times\n\n\n\nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n\n\n\nMore information needed",
"### Factors\n\n\n\nMore information needed",
"### Metrics\n\n\n\nMore information needed",
"## Results \n\nMore information needed",
"# Model Examination\n\nMore information needed",
"# Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n\nMore information needed",
"## Compute Infrastructure\n\nMore information needed",
"### Hardware\n\nMore information needed",
"### Software\n\nMore information needed\n\nBibTeX:\n\nMore information needed\n\nAPA:\n\nMore information needed",
"# Glossary [optional]\n\n\n\nMore information needed",
"# More Information [optional]\n\nMore information needed",
"# Model Card Authors [optional]\n\n\n\nMagicalAPI",
"# Model Card Contact\n\nMore information needed",
"# How to Get Started with the Model\n\nUse the code below to get started with the model.\n\n<details>\n<summary> Click to expand </summary>\n\nMore information needed\n\n</details>"
] | [
"TAGS\n#diffusers #safetensors #text-to-image #stable-diffusion #thumbnail-generator #youtube #en #arxiv-1910.09700 #license-openrail++ #endpoints_compatible #diffusers-StableDiffusionXLPipeline #region-us \n",
"# Youtube Thumbnail Suggestion Model Card\n\n\nPresenting our groundbreaking generative model, specifically engineered for YouTube content. This innovative model, generates visually captivating and realistic thumbnails based on input prompts. Designed to elevate content aesthetics and viewer engagement, this tool represents a significant advancement in custom thumbnail creation for YouTube videos.",
"# Table of Contents\n\n- Model Details\n - Model Description\n- Uses\n - Direct Use\n - [Downstream Use [Optional]](#downstream-use-optional)\n - Out-of-Scope Use\n- Bias, Risks, and Limitations\n - Recommendations\n- Training Details\n - Training Data\n - Training Procedure\n - Preprocessing\n - Speeds, Sizes, Times\n- Evaluation\n - Testing Data, Factors & Metrics\n - Testing Data\n - Factors\n - Metrics\n - Results\n- Model Examination\n- Environmental Impact\n- [Technical Specifications [optional]](#technical-specifications-optional)\n - Model Architecture and Objective\n - Compute Infrastructure\n - Hardware\n - Software\n- Citation\n- [Glossary [optional]](#glossary-optional)\n- [More Information [optional]](#more-information-optional)\n- [Model Card Authors [optional]](#model-card-authors-optional)\n- Model Card Contact\n- How to Get Started with the Model",
"# Model Details",
"## Model Description\n\n\n\n\n\nThe model is meticulously crafted for YouTube content creators. This innovative model, tailored for thumbnail generation, ingeniously suggests visually striking thumbnails based on user-input prompts. With a focus on creativity and customization, this tool empowers users to enhance their video presence by effortlessly generating eye-catching and contextually relevant thumbnails, optimizing visual appeal and audience engagement on the YouTube.\n\n- Developed by: MagicalAPI Co.\n- Model type: Diffusion-based text-to-image generative model\n- Language(s): en\n- License: openrail++\n- Base Model: Base model is enhanced from SD-XL 1.0 \n- Resources for more information: You can find out more details in our [Github (comming soon)]()",
"# Uses\n\nThis model is mainly designed for those creating videos and contents on YouTube, for better experiencing, describe your intended thumbnail by pointing on necessary elements or situation, separating by comma; for example: \n\"The office is modern and vibrant, filled with young, diverse professionals (various descents and genders) engaged in creative work. There are brainstorming sessions happening.",
"## Downstream Use\n\nThe base model is designed for generating Youtube thumbnails in various usecases and categories; specifically, you can also finetune the model with the particular images and captions needed for your own channel.",
"## Out-of-Scope Use\n\nThe model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.\n\nIt is crucial to emphasize that this tool should never be employed for the creation, promotion, or endorsement of abusive, violent, or pornographic materials, as its purpose is to contribute to a constructive and enriching online environment.",
"# Bias, Risks, and Limitations\n\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.",
"## Recommendations",
"# Training Details",
"## Training Data\n\n\n\nMore information on training data needed",
"## Training Procedure",
"### Preprocessing\n\nMore information needed",
"### Speeds, Sizes, Times\n\n\n\nMore information needed",
"# Evaluation",
"## Testing Data, Factors & Metrics",
"### Testing Data\n\n\n\nMore information needed",
"### Factors\n\n\n\nMore information needed",
"### Metrics\n\n\n\nMore information needed",
"## Results \n\nMore information needed",
"# Model Examination\n\nMore information needed",
"# Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n\nMore information needed",
"## Compute Infrastructure\n\nMore information needed",
"### Hardware\n\nMore information needed",
"### Software\n\nMore information needed\n\nBibTeX:\n\nMore information needed\n\nAPA:\n\nMore information needed",
"# Glossary [optional]\n\n\n\nMore information needed",
"# More Information [optional]\n\nMore information needed",
"# Model Card Authors [optional]\n\n\n\nMagicalAPI",
"# Model Card Contact\n\nMore information needed",
"# How to Get Started with the Model\n\nUse the code below to get started with the model.\n\n<details>\n<summary> Click to expand </summary>\n\nMore information needed\n\n</details>"
] | [
78,
84,
230,
3,
186,
87,
48,
103,
87,
5,
3,
9,
4,
8,
12,
3,
11,
8,
7,
8,
5,
8,
68,
9,
10,
8,
6,
19,
11,
10,
12,
7,
44
] | [
"passage: TAGS\n#diffusers #safetensors #text-to-image #stable-diffusion #thumbnail-generator #youtube #en #arxiv-1910.09700 #license-openrail++ #endpoints_compatible #diffusers-StableDiffusionXLPipeline #region-us \n# Youtube Thumbnail Suggestion Model Card\n\n\nPresenting our groundbreaking generative model, specifically engineered for YouTube content. This innovative model, generates visually captivating and realistic thumbnails based on input prompts. Designed to elevate content aesthetics and viewer engagement, this tool represents a significant advancement in custom thumbnail creation for YouTube videos.# Table of Contents\n\n- Model Details\n - Model Description\n- Uses\n - Direct Use\n - [Downstream Use [Optional]](#downstream-use-optional)\n - Out-of-Scope Use\n- Bias, Risks, and Limitations\n - Recommendations\n- Training Details\n - Training Data\n - Training Procedure\n - Preprocessing\n - Speeds, Sizes, Times\n- Evaluation\n - Testing Data, Factors & Metrics\n - Testing Data\n - Factors\n - Metrics\n - Results\n- Model Examination\n- Environmental Impact\n- [Technical Specifications [optional]](#technical-specifications-optional)\n - Model Architecture and Objective\n - Compute Infrastructure\n - Hardware\n - Software\n- Citation\n- [Glossary [optional]](#glossary-optional)\n- [More Information [optional]](#more-information-optional)\n- [Model Card Authors [optional]](#model-card-authors-optional)\n- Model Card Contact\n- How to Get Started with the Model# Model Details",
"passage: ## Model Description\n\n\n\n\n\nThe model is meticulously crafted for YouTube content creators. This innovative model, tailored for thumbnail generation, ingeniously suggests visually striking thumbnails based on user-input prompts. With a focus on creativity and customization, this tool empowers users to enhance their video presence by effortlessly generating eye-catching and contextually relevant thumbnails, optimizing visual appeal and audience engagement on the YouTube.\n\n- Developed by: MagicalAPI Co.\n- Model type: Diffusion-based text-to-image generative model\n- Language(s): en\n- License: openrail++\n- Base Model: Base model is enhanced from SD-XL 1.0 \n- Resources for more information: You can find out more details in our [Github (comming soon)]()# Uses\n\nThis model is mainly designed for those creating videos and contents on YouTube, for better experiencing, describe your intended thumbnail by pointing on necessary elements or situation, separating by comma; for example: \n\"The office is modern and vibrant, filled with young, diverse professionals (various descents and genders) engaged in creative work. There are brainstorming sessions happening.## Downstream Use\n\nThe base model is designed for generating Youtube thumbnails in various usecases and categories; specifically, you can also finetune the model with the particular images and captions needed for your own channel.## Out-of-Scope Use\n\nThe model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.\n\nIt is crucial to emphasize that this tool should never be employed for the creation, promotion, or endorsement of abusive, violent, or pornographic materials, as its purpose is to contribute to a constructive and enriching online environment.# Bias, Risks, and Limitations\n\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.## Recommendations# Training Details## Training Data\n\n\n\nMore information on training data needed## Training Procedure### Preprocessing\n\nMore information needed### Speeds, Sizes, Times\n\n\n\nMore information needed# Evaluation## Testing Data, Factors & Metrics### Testing Data\n\n\n\nMore information needed### Factors\n\n\n\nMore information needed### Metrics\n\n\n\nMore information needed## Results \n\nMore information needed# Model Examination\n\nMore information needed# Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed# Technical Specifications [optional]## Model Architecture and Objective\n\nMore information needed"
] | [
-0.036039650440216064,
0.05731457471847534,
-0.003012764500454068,
-0.007430827245116234,
0.08013906329870224,
-0.026038961485028267,
0.043547507375478745,
0.08625368773937225,
-0.040142547339200974,
0.034345850348472595,
-0.011153936386108398,
0.006113331764936447,
0.06446348875761032,
0.08240874111652374,
0.009888103231787682,
-0.23677420616149902,
0.05046742409467697,
-0.03650321066379547,
-0.06218254566192627,
0.11289385706186295,
0.145999938249588,
-0.052481845021247864,
0.06444747000932693,
0.020816653966903687,
-0.11563460528850555,
-0.04531281441450119,
0.002436290495097637,
-0.029378412291407585,
0.04169706255197525,
0.06529979407787323,
0.0789024755358696,
0.01771470159292221,
0.014234127476811409,
-0.21109922230243683,
0.021919965744018555,
0.08646224439144135,
0.0037655094638466835,
0.011424059979617596,
0.08393467962741852,
0.01505138911306858,
0.12590919435024261,
-0.06426818668842316,
0.08953231573104858,
0.08263224363327026,
-0.09101913869380951,
-0.05026805028319359,
-0.026407085359096527,
0.06286317110061646,
0.10260163247585297,
0.026809604838490486,
-0.04467243701219559,
0.07813066244125366,
-0.06217694282531738,
0.043032437562942505,
0.05308210477232933,
-0.004821295849978924,
-0.07080203294754028,
0.03119109570980072,
0.10289132595062256,
0.012788250111043453,
-0.09237873554229736,
0.04058472812175751,
0.00015160627663135529,
-0.03237899765372276,
0.05138905346393585,
-0.026096869260072708,
0.15438562631607056,
-0.07684309780597687,
-0.08022411167621613,
-0.0025616995990276337,
0.15305741131305695,
0.09265275299549103,
-0.0552622526884079,
-0.21227087080478668,
-0.02017279341816902,
0.12303245067596436,
-0.057238269597291946,
-0.032620612531900406,
0.011842175386846066,
-0.011245873756706715,
0.08361503481864929,
-0.03784769028425217,
-0.08034168183803558,
0.011854629963636398,
0.0724518746137619,
0.08256210386753082,
0.012153161689639091,
0.01184908952564001,
-0.07483580708503723,
0.053825631737709045,
0.0021199844777584076,
-0.0981212705373764,
-0.002079477533698082,
-0.07930823415517807,
-0.020725008100271225,
-0.0314653180539608,
-0.023084571585059166,
-0.0720822811126709,
0.0035489238798618317,
0.09417729824781418,
0.07203968614339828,
0.08590827137231827,
-0.00903744250535965,
0.03198426961898804,
0.07127352058887482,
0.06575322896242142,
0.025962451472878456,
-0.11720053851604462,
0.021302470937371254,
0.09781178832054138,
-0.03253898769617081,
-0.04877735674381256,
-0.03485933691263199,
0.058544233441352844,
-0.02474491484463215,
0.019137747585773468,
0.13014671206474304,
0.06824859976768494,
-0.049744416028261185,
-0.05631108209490776,
0.13909144699573517,
-0.10723063349723816,
0.030082404613494873,
-0.017777524888515472,
-0.020630063489079475,
0.03160218894481659,
-0.013639281503856182,
0.04300495982170105,
-0.03781487047672272,
0.022793419659137726,
-0.038044434040784836,
-0.001736610196530819,
-0.142009899020195,
-0.03743012621998787,
0.055218737572431564,
-0.027523450553417206,
-0.01690339297056198,
-0.05887778848409653,
-0.11149787157773972,
-0.06912559270858765,
0.08714699000120163,
-0.028593633323907852,
-0.06353643536567688,
0.0026594442315399647,
-0.019586918875575066,
0.011535434052348137,
0.04606381058692932,
-0.043360888957977295,
0.0024851886555552483,
0.021722404286265373,
-0.06972818821668625,
-0.007493842393159866,
0.003788162022829056,
0.04309934750199318,
-0.03614868223667145,
0.05327082425355911,
-0.2374870479106903,
0.12939661741256714,
-0.046414852142333984,
0.018748510628938675,
-0.06462770700454712,
0.001319865696132183,
-0.018817508593201637,
0.06195741519331932,
-0.09176401793956757,
0.11238731443881989,
-0.21867601573467255,
-0.030936192721128464,
0.07343151420354843,
-0.12721675634384155,
-0.01969267800450325,
0.055106811225414276,
-0.05226929113268852,
0.13751082122325897,
0.08317199349403381,
0.11314745247364044,
0.0803072452545166,
-0.04127047210931778,
0.0485672727227211,
-0.0650327056646347,
-0.02443212829530239,
0.18012985587120056,
0.038913242518901825,
-0.02483065240085125,
0.024999022483825684,
0.02291218563914299,
-0.10035088658332825,
-0.021754123270511627,
0.0009617293253540993,
-0.044293709099292755,
0.030308952555060387,
-0.010190744884312153,
0.010222576558589935,
-0.014092253521084785,
-0.05488666146993637,
-0.027332166209816933,
-0.13479205965995789,
0.048617683351039886,
0.0533437542617321,
-0.02649720013141632,
0.032893646508455276,
-0.08429714292287827,
0.05484186112880707,
0.005027699284255505,
-0.011406381614506245,
-0.15374155342578888,
-0.06056185066699982,
0.030373435467481613,
-0.08595291525125504,
0.061832718551158905,
0.003543984144926071,
0.008278237655758858,
0.0038954950869083405,
-0.03359615430235863,
-0.013484190218150616,
-0.10114723443984985,
0.028049474582076073,
0.0015427693724632263,
-0.15776358544826508,
0.005726870149374008,
-0.040785178542137146,
0.056570857763290405,
-0.15179777145385742,
0.02136952616274357,
0.06924407184123993,
0.10828839242458344,
0.07062855362892151,
-0.09022300690412521,
-0.04088014364242554,
-0.023891963064670563,
-0.0020670746453106403,
-0.020125649869441986,
0.03382711857557297,
-0.027891362085938454,
-0.006622829008847475,
0.0657336488366127,
-0.12199734151363373,
-0.02183254435658455,
0.06806883215904236,
-0.09792467951774597,
-0.07814612984657288,
-0.024306146427989006,
0.003764817491173744,
-0.028663214296102524,
-0.02683628536760807,
-0.04315325617790222,
0.07893707603216171,
0.013395902700722218,
0.06014120206236839,
-0.08949148654937744,
-0.026995385065674782,
0.02465827390551567,
-0.008859261870384216,
-0.026581455022096634,
0.012590939179062843,
0.10601460933685303,
-0.10475234687328339,
0.04510055482387543,
0.10376804322004318,
0.09316150099039078,
0.14690548181533813,
-0.023295417428016663,
-0.059892430901527405,
0.008100559934973717,
-0.0026821657083928585,
-0.028075065463781357,
0.12503516674041748,
-0.011704321950674057,
0.008078200742602348,
0.02486126311123371,
0.010112017393112183,
0.028224796056747437,
-0.07398487627506256,
0.030679740011692047,
0.05586344748735428,
0.025007417425513268,
-0.0629412904381752,
-0.02822764962911606,
0.012275688350200653,
0.05864386633038521,
0.037796810269355774,
0.06497171521186829,
0.014412648975849152,
-0.05427543818950653,
-0.1126737967133522,
0.09734321385622025,
-0.09743797779083252,
-0.28583744168281555,
-0.11129255592823029,
0.04206538945436478,
-0.022280342876911163,
-0.01039324700832367,
0.00993242859840393,
-0.03596264868974686,
-0.07463018596172333,
-0.10238978266716003,
0.08772103488445282,
-0.06016291305422783,
-0.08892650902271271,
-0.04742839187383652,
0.07765617966651917,
0.02425120770931244,
-0.0778733640909195,
0.009749256074428558,
0.023320935666561127,
-0.03335689380764961,
-0.01798388734459877,
0.017623327672481537,
0.09029379487037659,
0.06013796105980873,
-0.009210125543177128,
-0.018249496817588806,
-0.0312952995300293,
0.18161596357822418,
-0.13190364837646484,
0.141398087143898,
0.16840225458145142,
-0.0902915745973587,
0.08081114292144775,
0.176774799823761,
0.021213578060269356,
-0.044144511222839355,
0.0033991336822509766,
0.06385230273008347,
-0.015810485929250717,
-0.12235903739929199,
-0.06902213394641876,
-0.02845695987343788,
-0.11267109215259552,
0.044625140726566315,
0.058170147240161896,
0.1101667732000351,
0.0461842343211174,
-0.07956214994192123,
-0.014081239700317383,
0.10213026404380798,
0.09277472645044327,
0.07232341915369034,
-0.017771050333976746,
0.06270182132720947,
-0.025623787194490433,
0.018888264894485474,
0.04259258508682251,
0.012652909383177757,
0.2801576852798462,
0.059394896030426025,
0.09549291431903839,
0.09982538223266602,
0.012059064581990242,
0.06440100818872452,
-0.02593797631561756,
0.011469350196421146,
0.002143469173461199,
-0.030315179377794266,
-0.05401218682527542,
-0.017436198890209198,
0.09645513445138931,
0.05075839161872864,
-0.05509769171476364,
-0.013383535668253899,
0.03455447405576706,
0.061735548079013824,
0.17623993754386902,
-0.060858145356178284,
-0.10707874596118927,
-0.0024369489401578903,
0.05402391403913498,
-0.07976403832435608,
-0.11049798130989075,
0.02635827474296093,
0.03225836530327797,
-0.1359308809041977,
0.05261450260877609,
-0.015009382739663124,
0.1341230571269989,
-0.07503212243318558,
-0.006796092726290226,
0.06784997135400772,
-0.027078155428171158,
0.003932538907974958,
0.05262312293052673,
-0.12572260200977325,
0.09093308448791504,
0.0027654683217406273,
0.047548748552799225,
-0.039945200085639954,
0.04726430028676987,
-0.006351328454911709,
0.08150142431259155,
0.1264857053756714,
0.022712044417858124,
-0.04551299288868904,
-0.04645025357604027,
-0.019671203568577766,
0.016394222155213356,
0.1209530457854271,
-0.12142765522003174,
0.06544718891382217,
-0.05655364319682121,
0.03295411169528961,
-0.017857838422060013,
-0.10910442471504211,
-0.21310479938983917,
-0.1903056502342224,
0.04063422977924347,
-0.11795572191476822,
0.02576291933655739,
-0.09541578590869904,
-0.014487016946077347,
-0.03351216018199921,
0.14629974961280823,
-0.16428586840629578,
-0.07230256497859955,
-0.13477137684822083,
-0.09425334632396698,
0.06584915518760681,
-0.025905916467308998,
0.0950915515422821,
0.00007830059621483088,
0.16434617340564728,
-0.04500513896346092,
-0.06867476552724838,
0.06398966163396835,
-0.08981205523014069,
-0.11369025707244873,
-0.08977796882390976,
0.11415883898735046,
0.11258009821176529,
0.01599472016096115,
0.011103284545242786,
0.0022120848298072815,
0.007907962426543236,
-0.06876851618289948,
-0.060282930731773376,
0.13962964713573456,
0.013377228751778603,
-0.025729576125741005,
-0.03324991464614868,
-0.027720537036657333,
-0.06959302723407745,
-0.009673522785305977,
0.005876463837921619,
0.10061866044998169,
-0.035136669874191284,
0.15986140072345734,
0.07721977680921555,
-0.08259773254394531,
-0.1850699484348297,
0.07815856486558914,
0.07444173097610474,
-0.024850238114595413,
0.06285370141267776,
-0.163298562169075,
0.09174038469791412,
-0.017454857006669044,
-0.014696450904011726,
0.07833501696586609,
-0.08538242429494858,
-0.11167369782924652,
0.007533274590969086,
0.03733339533209801,
-0.052815645933151245,
-0.08220373094081879,
-0.03675002604722977,
-0.02642311155796051,
-0.15638360381126404,
0.10169259458780289,
0.00985671952366829,
-0.03388412296772003,
0.05193959176540375,
0.04099452495574951,
0.0116733992472291,
-0.0018390445038676262,
0.12110935151576996,
-0.020412476733326912,
0.05643332749605179,
-0.06764163821935654,
-0.07827942073345184,
0.05320639908313751,
-0.06977712363004684,
0.029978040605783463,
-0.016771629452705383,
-0.023721439763903618,
-0.13006001710891724,
-0.04497751593589783,
-0.05396314710378647,
0.05173235014081001,
-0.06998354196548462,
-0.06775985658168793,
-0.056059520691633224,
0.07430966198444366,
0.05441275238990784,
-0.002688901498913765,
0.025083471089601517,
-0.11087138950824738,
-0.026530779898166656,
0.04559013247489929,
0.16360104084014893,
-0.04291906952857971,
-0.18557152152061462,
-0.009611845016479492,
-0.012049112468957901,
0.12349213659763336,
-0.15643039345741272,
0.05037399381399155,
0.027872011065483093,
-0.03518053516745567,
0.13263684511184692,
-0.02188810706138611,
-0.08536545932292938,
-0.002415696159005165,
0.07926277071237564,
-0.0016900165937840939,
-0.12151157110929489,
-0.03141620755195618,
0.0957849770784378,
-0.08028772473335266,
-0.07160287350416183,
0.07292275875806808,
-0.03135577216744423,
-0.028162552043795586,
-0.02180153876543045,
0.07014572620391846,
0.04301173985004425,
-0.035970576107501984,
0.022879336029291153,
0.046961233019828796,
-0.08882980048656464,
0.06861814856529236,
0.035319313406944275,
-0.17649084329605103,
0.01117175817489624,
0.051887866109609604,
-0.06986628472805023,
-0.04068387672305107,
0.032997388392686844,
0.05382242798805237,
0.028802718967199326,
-0.09092110395431519,
0.04100698605179787,
-0.11670470982789993,
0.02574079856276512,
0.07023919373750687,
0.010759911499917507,
-0.01006914209574461,
0.0036274949088692665,
0.02632412314414978,
-0.04952900856733322,
0.05904165282845497,
0.02158161625266075,
-0.0008999034762382507,
-0.119463711977005,
-0.06534227728843689,
0.07227110117673874,
-0.025811338797211647,
-0.022446388378739357,
-0.02221451699733734,
-0.030246548354625702,
-0.016248473897576332,
-0.034859925508499146,
0.007131906226277351,
-0.08484350144863129,
0.030366528779268265,
0.023937378078699112,
0.0071520134806632996,
0.00007400382310152054,
0.011604231782257557,
-0.0518743135035038,
-0.059754058718681335,
0.014451267197728157,
0.05629393830895424,
-0.08002667129039764,
0.061925359070301056,
0.09236400574445724,
-0.09053255617618561,
0.040600039064884186,
-0.05445728078484535,
-0.022994358092546463,
0.010185047052800655,
-0.09886486828327179,
0.025619400665163994,
-0.03348889946937561,
-0.0007367143407464027,
-0.0467752106487751,
-0.07359036803245544,
0.030285948887467384,
-0.010950440540909767,
-0.03373316302895546,
-0.04523588716983795,
0.02749643847346306,
-0.09338193386793137,
0.08040539175271988,
-0.005537188611924648,
-0.09296143054962158,
-0.031214315444231033,
0.021316587924957275,
0.05012361332774162,
-0.007033290341496468,
0.10640761256217957,
0.007288696244359016,
0.03625454753637314,
-0.14744830131530762,
-0.021364882588386536,
0.051265038549900055,
0.051399558782577515,
-0.0023572957143187523,
-0.06163270026445389,
0.014553908258676529,
-0.03193666413426399,
0.14820122718811035,
0.032874953001737595,
-0.016150787472724915,
0.051183946430683136,
-0.01034304965287447,
-0.030328253284096718,
0.06475469470024109,
0.022823479026556015,
0.004740063101053238,
0.01814337447285652,
-0.04227437078952789,
-0.06567370146512985,
-0.017886919900774956,
-0.14474646747112274,
0.0035061570815742016,
0.1960698366165161,
0.029705766588449478,
0.010192368179559708,
0.06256142258644104,
0.009097859263420105,
-0.08614898473024368,
0.1232798844575882,
-0.012368831783533096,
0.024372413754463196,
-0.05478420853614807,
0.0785481184720993,
0.15605708956718445,
-0.09981876611709595,
0.0961974561214447,
-0.0055357059463858604,
-0.012099653482437134,
-0.045604854822158813,
-0.21936258673667908,
-0.04906647652387619,
-0.0806695967912674,
0.022834133356809616,
-0.05574730783700943,
0.03301650658249855,
0.06571146845817566,
-0.0010212212800979614,
-0.01724725216627121,
0.09332305192947388,
-0.08457496017217636,
-0.10196061432361603,
0.09587369114160538,
0.02772526815533638,
0.03018432855606079,
0.046229325234889984,
0.012650185264647007,
0.014609016478061676,
0.04228830337524414,
0.05441775172948837,
0.05851629003882408,
-0.06686944514513016,
0.02416367270052433,
-0.04764523357152939,
-0.07373316586017609,
0.038474395871162415,
-0.019185803830623627,
-0.03102186508476734,
0.15840522944927216,
0.0013911351561546326,
-0.020593004301190376,
-0.0371733158826828,
0.15342721343040466,
0.011405066587030888,
-0.02863839641213417,
-0.1291339099407196,
-0.03171253204345703,
-0.055021096020936966,
0.045754220336675644,
-0.010770108550786972,
-0.0893896073102951,
0.026611588895320892,
0.14332319796085358,
0.1797778457403183,
-0.016967520117759705,
-0.014865044504404068,
-0.002449512481689453,
0.00006727594882249832,
-0.027891622856259346,
0.08581462502479553,
-0.052382588386535645,
0.18311655521392822,
-0.029501749202609062,
0.039313904941082,
-0.022870324552059174,
-0.07708890736103058,
-0.02101316675543785,
0.07841900736093521,
0.00896571483463049,
0.00011069304309785366,
-0.08666259050369263,
0.1328219473361969,
0.00950547680258751,
-0.15046949684619904,
0.09217561036348343,
-0.0265054814517498,
-0.04925599321722984,
0.019730467349290848,
0.057594142854213715,
-0.05149248614907265,
0.001995774917304516,
0.04223053902387619,
-0.07827390730381012,
0.1387672871351242,
0.013742130249738693,
-0.030608167871832848,
-0.0019844602793455124,
0.10360434651374817,
-0.051976945251226425,
0.1744418740272522,
0.002551381941884756,
0.10158006846904755,
0.0363750159740448,
0.013966400176286697,
-0.06320083141326904,
0.010088946670293808,
0.049637287855148315,
-0.01712808385491371,
-0.0023993374779820442,
0.17729580402374268,
0.030833378434181213,
0.11141794919967651,
0.10157924145460129,
-0.035372696816921234,
0.05308292806148529,
-0.14722739160060883,
-0.07110384851694107,
-0.06790581345558167,
0.10278573632240295,
-0.11210156977176666,
0.12781929969787598,
0.08265876024961472,
-0.007314896211028099,
-0.007109250873327255,
-0.06264720857143402,
0.007965643890202045,
0.040274642407894135,
0.10478910803794861,
-0.011487843468785286,
-0.10937120020389557,
-0.0028631454333662987,
-0.012916295789182186,
0.04473203420639038,
-0.12448254972696304,
-0.08868931233882904,
-0.0038903318345546722,
-0.02071775123476982,
0.005536675453186035,
0.08177627623081207,
0.09150877594947815,
-0.013821443542838097,
-0.049218203872442245,
-0.12436100840568542,
0.049375858157873154,
0.10897219181060791,
-0.043826617300510406,
0.015095680952072144
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_1
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a hedged usage sketch follows these metrics):
- Loss: 0.5626
- Mean Iou: 0.4261
- Mean Accuracy: 0.7046
- Overall Accuracy: 0.9598
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4247
- Accuracy Undropoff: 0.9846
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.3192
- Iou Undropoff: 0.9590
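
The numbers above are semantic-segmentation metrics on the held-out evaluation split. As a minimal, hedged sketch (not an official snippet from this repository), the checkpoint could be loaded for inference with the standard `transformers` SegFormer classes; the image path below is a placeholder, and if the repository does not ship a preprocessor config, the base `nvidia/mit-b0` processor could be used instead.

```python
# Hedged sketch: load the fine-tuned SegFormer checkpoint and predict a per-pixel mask.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

ckpt = "sam1120/dropoff-utcustom-train-SF-RGB-b0_1"
processor = SegformerImageProcessor.from_pretrained(ckpt)   # assumes a preprocessor config is present in the repo
model = SegformerForSemanticSegmentation.from_pretrained(ckpt).eval()

image = Image.open("example.png").convert("RGB")             # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits                          # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]                       # per-pixel class ids
```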
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
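
Although this section is left open, the dataset identifier in the summary (sam1120/dropoff-utcustom-TRAIN) suggests a minimal way to inspect the data. Split and feature names are not documented here, so the hedged sketch below only prints whatever the hub provides.

```python
# Hedged sketch: inspect the training dataset named in this card.
from datasets import load_dataset

ds = load_dataset("sam1120/dropoff-utcustom-TRAIN")
print(ds)  # lists the available splits and their features
```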
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows this list):
- learning_rate: 9e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
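
For orientation only, these settings map fairly directly onto the standard `transformers` `TrainingArguments`. The sketch below is a hypothetical reconstruction under that assumption, not the exact script used to train this model; the output directory is a placeholder and the dataset/model wiring is omitted. Note that this API uses AdamW by default, so it approximates rather than reproduces the reported Adam configuration.

```python
# Hedged sketch: expressing the listed hyperparameters with transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b0_1",  # placeholder output directory
    learning_rate=9e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```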
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1029 | 3.33 | 10 | 1.0852 | 0.1637 | 0.3955 | 0.4522 | nan | 0.3333 | 0.4577 | 0.0 | 0.0410 | 0.4501 |
| 1.0856 | 6.67 | 20 | 1.0764 | 0.1911 | 0.5086 | 0.5025 | nan | 0.5153 | 0.5019 | 0.0 | 0.0761 | 0.4972 |
| 1.0755 | 10.0 | 30 | 1.0611 | 0.2252 | 0.6367 | 0.5749 | nan | 0.7045 | 0.5688 | 0.0 | 0.1104 | 0.5652 |
| 1.0285 | 13.33 | 40 | 1.0382 | 0.2622 | 0.7487 | 0.6568 | nan | 0.8494 | 0.6479 | 0.0 | 0.1420 | 0.6445 |
| 0.9935 | 16.67 | 50 | 1.0151 | 0.2893 | 0.7814 | 0.7201 | nan | 0.8486 | 0.7141 | 0.0 | 0.1580 | 0.7099 |
| 0.9927 | 20.0 | 60 | 0.9834 | 0.3160 | 0.7963 | 0.7816 | nan | 0.8124 | 0.7801 | 0.0 | 0.1735 | 0.7744 |
| 0.938 | 23.33 | 70 | 0.9585 | 0.3308 | 0.8084 | 0.8127 | nan | 0.8036 | 0.8131 | 0.0 | 0.1860 | 0.8065 |
| 0.9169 | 26.67 | 80 | 0.9376 | 0.3457 | 0.8169 | 0.8376 | nan | 0.7943 | 0.8396 | 0.0 | 0.2048 | 0.8324 |
| 0.8814 | 30.0 | 90 | 0.9003 | 0.3624 | 0.8086 | 0.8691 | nan | 0.7421 | 0.8750 | 0.0 | 0.2220 | 0.8651 |
| 0.8618 | 33.33 | 100 | 0.8894 | 0.3669 | 0.8184 | 0.8761 | nan | 0.7550 | 0.8817 | 0.0 | 0.2287 | 0.8720 |
| 0.8388 | 36.67 | 110 | 0.8618 | 0.3774 | 0.8096 | 0.8926 | nan | 0.7187 | 0.9006 | 0.0 | 0.2431 | 0.8892 |
| 0.8878 | 40.0 | 120 | 0.8269 | 0.3929 | 0.7937 | 0.9140 | nan | 0.6618 | 0.9257 | 0.0 | 0.2671 | 0.9116 |
| 0.8066 | 43.33 | 130 | 0.8074 | 0.4014 | 0.7955 | 0.9225 | nan | 0.6562 | 0.9348 | 0.0 | 0.2839 | 0.9202 |
| 0.8084 | 46.67 | 140 | 0.7919 | 0.4023 | 0.7932 | 0.9248 | nan | 0.6487 | 0.9376 | 0.0 | 0.2844 | 0.9226 |
| 0.7415 | 50.0 | 150 | 0.7707 | 0.4068 | 0.7850 | 0.9309 | nan | 0.6249 | 0.9451 | 0.0 | 0.2913 | 0.9290 |
| 0.7508 | 53.33 | 160 | 0.7326 | 0.4154 | 0.7660 | 0.9415 | nan | 0.5735 | 0.9585 | 0.0 | 0.3063 | 0.9400 |
| 0.7312 | 56.67 | 170 | 0.7126 | 0.4196 | 0.7636 | 0.9449 | nan | 0.5646 | 0.9625 | 0.0 | 0.3155 | 0.9435 |
| 0.6442 | 60.0 | 180 | 0.6869 | 0.4255 | 0.7500 | 0.9509 | nan | 0.5296 | 0.9704 | 0.0 | 0.3268 | 0.9497 |
| 0.6633 | 63.33 | 190 | 0.6765 | 0.4286 | 0.7524 | 0.9525 | nan | 0.5328 | 0.9719 | 0.0 | 0.3343 | 0.9513 |
| 0.7247 | 66.67 | 200 | 0.6557 | 0.4307 | 0.7335 | 0.9568 | nan | 0.4886 | 0.9785 | 0.0 | 0.3364 | 0.9558 |
| 0.6133 | 70.0 | 210 | 0.6369 | 0.4298 | 0.7279 | 0.9573 | nan | 0.4761 | 0.9796 | 0.0 | 0.3330 | 0.9564 |
| 0.6309 | 73.33 | 220 | 0.6309 | 0.4298 | 0.7437 | 0.9547 | nan | 0.5123 | 0.9752 | 0.0 | 0.3356 | 0.9536 |
| 0.6373 | 76.67 | 230 | 0.6094 | 0.4276 | 0.7197 | 0.9577 | nan | 0.4585 | 0.9808 | 0.0 | 0.3262 | 0.9568 |
| 0.8436 | 80.0 | 240 | 0.6195 | 0.4341 | 0.7438 | 0.9569 | nan | 0.5101 | 0.9776 | 0.0 | 0.3463 | 0.9559 |
| 0.6172 | 83.33 | 250 | 0.6207 | 0.4323 | 0.7384 | 0.9570 | nan | 0.4987 | 0.9782 | 0.0 | 0.3409 | 0.9560 |
| 0.6048 | 86.67 | 260 | 0.5949 | 0.4272 | 0.7136 | 0.9586 | nan | 0.4449 | 0.9824 | 0.0 | 0.3237 | 0.9578 |
| 0.7887 | 90.0 | 270 | 0.6007 | 0.4308 | 0.7282 | 0.9580 | nan | 0.4760 | 0.9803 | 0.0 | 0.3353 | 0.9571 |
| 0.605 | 93.33 | 280 | 0.5883 | 0.4284 | 0.7157 | 0.9589 | nan | 0.4489 | 0.9825 | 0.0 | 0.3271 | 0.9581 |
| 0.5964 | 96.67 | 290 | 0.5872 | 0.4277 | 0.7134 | 0.9590 | nan | 0.4439 | 0.9828 | 0.0 | 0.3251 | 0.9581 |
| 0.6097 | 100.0 | 300 | 0.5903 | 0.4300 | 0.7240 | 0.9582 | nan | 0.4669 | 0.9810 | 0.0 | 0.3325 | 0.9573 |
| 0.5886 | 103.33 | 310 | 0.5710 | 0.4250 | 0.7035 | 0.9594 | nan | 0.4227 | 0.9843 | 0.0 | 0.3162 | 0.9586 |
| 0.6079 | 106.67 | 320 | 0.5695 | 0.4277 | 0.7112 | 0.9594 | nan | 0.4390 | 0.9835 | 0.0 | 0.3245 | 0.9586 |
| 0.8054 | 110.0 | 330 | 0.5746 | 0.4308 | 0.7237 | 0.9588 | nan | 0.4657 | 0.9816 | 0.0 | 0.3344 | 0.9579 |
| 0.5496 | 113.33 | 340 | 0.5631 | 0.4285 | 0.7129 | 0.9595 | nan | 0.4424 | 0.9835 | 0.0 | 0.3269 | 0.9587 |
| 0.6271 | 116.67 | 350 | 0.5761 | 0.4302 | 0.7214 | 0.9589 | nan | 0.4608 | 0.9819 | 0.0 | 0.3326 | 0.9580 |
| 0.5511 | 120.0 | 360 | 0.5626 | 0.4261 | 0.7046 | 0.9598 | nan | 0.4247 | 0.9846 | 0.0 | 0.3192 | 0.9590 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_1", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_1 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:09:15+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_1
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5626
* Mean Iou: 0.4261
* Mean Accuracy: 0.7046
* Overall Accuracy: 0.9598
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.4247
* Accuracy Undropoff: 0.9846
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.3192
* Iou Undropoff: 0.9590
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 9e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10685057938098907,
0.03626430779695511,
-0.001935673295520246,
0.11635380983352661,
0.17187130451202393,
0.027859587222337723,
0.11530084162950516,
0.11533981561660767,
-0.10984227806329727,
0.03149544820189476,
0.10524952411651611,
0.143252432346344,
0.01654832810163498,
0.09448825567960739,
-0.019710686057806015,
-0.3064554035663605,
-0.02497927099466324,
0.032077379524707794,
-0.0844716802239418,
0.12509827315807343,
0.06434638053178787,
-0.16338524222373962,
0.09151484072208405,
-0.0040703509002923965,
-0.2204686552286148,
0.016166582703590393,
-0.005493758711963892,
-0.030991580337285995,
0.1595882922410965,
0.023075681179761887,
0.1162770614027977,
0.00943325087428093,
0.11400919407606125,
-0.2004028707742691,
0.0177561417222023,
0.05652876943349838,
-0.004191652871668339,
0.06761171668767929,
0.06547973304986954,
0.0025243882555514574,
0.15156961977481842,
-0.10634521394968033,
0.0674901083111763,
0.001579924370162189,
-0.14478418231010437,
-0.2124166041612625,
-0.0751550942659378,
0.02231522649526596,
0.07801754772663116,
0.09615685790777206,
-0.005620093550533056,
0.11418497562408447,
-0.09215269982814789,
0.11302897334098816,
0.2677992880344391,
-0.24256062507629395,
-0.08603566139936447,
0.038885246962308884,
0.002372533082962036,
0.06614953279495239,
-0.13294051587581635,
0.007953571155667305,
0.032668136060237885,
0.04610496014356613,
0.11620113253593445,
-0.03284338861703873,
-0.101409912109375,
0.02692420780658722,
-0.13881252706050873,
-0.03254663944244385,
0.05572222173213959,
0.053394194692373276,
-0.020368918776512146,
-0.03148704394698143,
-0.0676790252327919,
-0.1815362423658371,
-0.06611555069684982,
0.01311821211129427,
0.06654393672943115,
-0.06034168601036072,
-0.11456529796123505,
-0.015056446194648743,
-0.11003416031599045,
-0.08572620153427124,
-0.05005105957388878,
0.12854772806167603,
0.03410562872886658,
0.019352080300450325,
-0.03423167020082474,
0.12678693234920502,
-0.027194691821932793,
-0.14028315246105194,
0.017776047810912132,
0.030542144551873207,
-0.042405419051647186,
-0.031596340239048004,
-0.04922625422477722,
-0.06509648263454437,
-0.013548120856285095,
0.10681724548339844,
-0.05883427709341049,
0.06821789592504501,
0.03543945029377937,
0.05088932067155838,
-0.11458641290664673,
0.19146117568016052,
-0.06730221211910248,
-0.007298177573829889,
-0.036600541323423386,
0.05908145010471344,
0.0043907309882342815,
-0.02243092469871044,
-0.10571130365133286,
0.004728015512228012,
0.0698428750038147,
-0.00863322988152504,
-0.08780333399772644,
0.07071921229362488,
-0.03927977755665779,
-0.011372084729373455,
0.0008907864685170352,
-0.07625842839479446,
0.04624326899647713,
-0.0006231636507436633,
-0.08423244208097458,
-0.029023591428995132,
0.05135698243975639,
0.013989760540425777,
0.013417830690741539,
0.16561472415924072,
-0.08720922470092773,
0.0630008727312088,
-0.11391830444335938,
-0.1005091592669487,
0.0006727701984345913,
-0.08795957267284393,
0.0363733172416687,
-0.07745591551065445,
-0.15050405263900757,
-0.009602072648704052,
0.0712776929140091,
-0.04039333015680313,
0.0037425514310598373,
-0.05333979055285454,
-0.09146621823310852,
0.0031692148186266422,
-0.008615581318736076,
0.16386833786964417,
-0.06535213440656662,
0.12305673211812973,
0.037510260939598083,
0.07248316705226898,
-0.06562100350856781,
0.039912428706884384,
-0.08563850075006485,
0.01988103985786438,
-0.2224932163953781,
0.04311473295092583,
-0.051310792565345764,
0.06823128461837769,
-0.05971883237361908,
-0.12216491997241974,
0.007363898679614067,
0.002198630478233099,
0.09234175831079483,
0.1064518466591835,
-0.22506582736968994,
-0.07562999427318573,
0.1481783390045166,
-0.07299895584583282,
-0.0986948311328888,
0.1127622202038765,
-0.06419426947832108,
0.012417944148182869,
0.061098065227270126,
0.19991451501846313,
0.053843773901462555,
-0.1367926448583603,
0.021609637886285782,
-0.015871532261371613,
0.04890686273574829,
-0.02798183262348175,
0.050245851278305054,
0.022397510707378387,
0.08821465075016022,
0.019241036847233772,
-0.06581854820251465,
0.06778544932603836,
-0.1236887201666832,
-0.09655162692070007,
-0.02543337270617485,
-0.08594772219657898,
0.04241475462913513,
0.0907178595662117,
0.06137683242559433,
-0.10548804700374603,
-0.07827889919281006,
0.09143070131540298,
0.07605335861444473,
-0.06874499469995499,
0.03947000950574875,
-0.06556328386068344,
0.044134289026260376,
-0.01738561876118183,
-0.03630686178803444,
-0.17513135075569153,
-0.0254961010068655,
-0.02173396572470665,
0.03430168703198433,
0.030381524935364723,
0.022684089839458466,
0.09147888422012329,
0.08866636455059052,
-0.07124453783035278,
-0.025394883006811142,
-0.06514831632375717,
0.0025242348201572895,
-0.12216109782457352,
-0.22858741879463196,
-0.04355696588754654,
-0.00815774966031313,
0.08776155859231949,
-0.21201197803020477,
0.02407139725983143,
0.02383519895374775,
0.08827987313270569,
0.025346165522933006,
-0.031486958265304565,
-0.05262213200330734,
0.07697917520999908,
-0.01048391591757536,
-0.0657806470990181,
0.0699291080236435,
-0.005575011018663645,
-0.06857788562774658,
-0.055325381457805634,
-0.11398743838071823,
0.16212861239910126,
0.1343807429075241,
-0.14744463562965393,
-0.09236977249383926,
-0.010949798859655857,
-0.06368359923362732,
-0.0333402119576931,
-0.04263182729482651,
0.038925040513277054,
0.1805073618888855,
-0.00012792288907803595,
0.13281702995300293,
-0.0612633116543293,
-0.035053350031375885,
0.029012465849518776,
-0.027209581807255745,
0.027399972081184387,
0.1295827329158783,
0.12509913742542267,
-0.06315279752016068,
0.12457630038261414,
0.12523476779460907,
-0.08054209500551224,
0.14924952387809753,
-0.0337057001888752,
-0.08058784157037735,
-0.018124591559171677,
-0.01498951856046915,
-0.008144272491335869,
0.1774190068244934,
-0.15068712830543518,
-0.017481965944170952,
-0.004701419733464718,
0.014108945615589619,
0.015137489885091782,
-0.25119826197624207,
-0.056064672768116,
0.03871864080429077,
-0.04418681934475899,
-0.010122931562364101,
-0.024731511250138283,
-0.004263777751475573,
0.10469487309455872,
-0.006829763762652874,
-0.07522596418857574,
0.0009397919639013708,
-0.007544955238699913,
-0.04874071478843689,
0.20733216404914856,
-0.05867748707532883,
-0.11888998746871948,
-0.09063097834587097,
-0.07728288322687149,
-0.03643593192100525,
0.0032626588363200426,
0.05800759047269821,
-0.10886978358030319,
-0.018631264567375183,
-0.05953642725944519,
0.018379520624876022,
0.00653329212218523,
0.03581196814775467,
-0.0010796820279210806,
-0.008305853232741356,
0.05599924176931381,
-0.09695360064506531,
-0.009751099161803722,
-0.06640910357236862,
-0.05221163481473923,
0.0540371835231781,
0.0599382221698761,
0.1480494737625122,
0.13528180122375488,
-0.026023129001259804,
0.01946074888110161,
-0.032555028796195984,
0.25728389620780945,
-0.09593669325113297,
-0.027019720524549484,
0.1184641644358635,
-0.012993209064006805,
0.05643979832530022,
0.10674209147691727,
0.08224165439605713,
-0.10914762318134308,
-0.0021797730587422848,
0.06342672556638718,
-0.05210611969232559,
-0.15560463070869446,
-0.01492993999272585,
-0.05806518718600273,
-0.02997620962560177,
0.07656913250684738,
0.02729225903749466,
-0.0039429329335689545,
0.055973973125219345,
0.0485488697886467,
0.0422152541577816,
-0.024820512160658836,
0.05030658096075058,
0.08844783157110214,
0.031934622675180435,
0.10927179455757141,
-0.04496106877923012,
-0.06661403179168701,
0.03137355297803879,
0.003460187464952469,
0.2442261427640915,
-0.01649155281484127,
0.09702381491661072,
0.07324043661355972,
0.1625985950231552,
-0.012576618231832981,
0.04870334640145302,
-0.016265448182821274,
-0.06828615814447403,
-0.019438516348600388,
-0.0442563071846962,
-0.017400460317730904,
0.009829409420490265,
-0.052364785224199295,
0.03955406695604324,
-0.12583647668361664,
0.00892991479486227,
0.06750801205635071,
0.24912838637828827,
0.029018839821219444,
-0.3183880150318146,
-0.06565827131271362,
-0.0056962138041853905,
-0.010914385318756104,
-0.009167463518679142,
0.006561817601323128,
0.15286700427532196,
-0.08080077916383743,
0.05641217902302742,
-0.08472999930381775,
0.08544473350048065,
-0.036699049174785614,
0.05069311335682869,
0.07709122449159622,
0.07387557625770569,
-0.004402882419526577,
0.05627644807100296,
-0.2844226062297821,
0.30184653401374817,
0.0019412569236010313,
0.08472838252782822,
-0.06408196687698364,
-0.03190705552697182,
0.03334270417690277,
0.08087810128927231,
0.08644286543130875,
-0.015275918878614902,
-0.02278541401028633,
-0.21447885036468506,
-0.021816356107592583,
0.03074919618666172,
0.12930479645729065,
-0.017003260552883148,
0.10420249402523041,
-0.009626083076000214,
-0.0053878407925367355,
0.07411561906337738,
-0.0013217201922088861,
-0.032375771552324295,
-0.09013956040143967,
-0.02635461464524269,
-0.025063641369342804,
-0.04987993463873863,
-0.0583735927939415,
-0.10665848106145859,
-0.1148701012134552,
0.11142092943191528,
0.01925581507384777,
-0.013891641981899738,
-0.1200677752494812,
0.09841684997081757,
0.07931535691022873,
-0.07563216239213943,
0.04060979560017586,
0.031621385365724564,
0.0567694790661335,
0.03328625485301018,
-0.057950254529714584,
0.11829067021608353,
-0.059832677245140076,
-0.15997421741485596,
-0.0567440502345562,
0.09150678664445877,
0.05084343999624252,
0.05709720030426979,
-0.024659184738993645,
0.016479352489113808,
-0.017775993794202805,
-0.09203257411718369,
0.05514802038669586,
-0.044719912111759186,
0.06380371004343033,
0.01107920054346323,
-0.020117932930588722,
0.05097772553563118,
-0.056350190192461014,
-0.012211905792355537,
0.14652156829833984,
0.28519129753112793,
-0.08895742893218994,
0.013015838339924812,
0.018084168434143066,
-0.06592965126037598,
-0.19147610664367676,
0.08019894361495972,
0.05769340693950653,
0.000294802593998611,
0.08597011864185333,
-0.16697928309440613,
0.09797414392232895,
0.10399558395147324,
0.0006753257475793362,
0.11591461300849915,
-0.3673976957798004,
-0.12822552025318146,
0.08067037910223007,
0.1909283697605133,
0.07638444751501083,
-0.1555989533662796,
0.0011150550562888384,
-0.0020477990619838238,
-0.14684085547924042,
0.0915546864271164,
-0.0774684026837349,
0.1356402039527893,
-0.019881173968315125,
0.08665145933628082,
0.016510190442204475,
-0.06163441762328148,
0.12189887464046478,
-0.003514062613248825,
0.14098508656024933,
-0.06979281455278397,
-0.03949557989835739,
0.05547715723514557,
-0.03780345246195793,
-0.012532943859696388,
-0.046548739075660706,
0.027314508333802223,
-0.061505403369665146,
-0.011533861048519611,
-0.10497517883777618,
0.01234265137463808,
-0.03862180560827255,
-0.0665336400270462,
-0.046126287430524826,
0.043440841138362885,
0.04481000825762749,
-0.004333494696766138,
0.15160907804965973,
-0.009838461875915527,
0.11446475982666016,
0.048616085201501846,
0.059357717633247375,
-0.0625092014670372,
-0.10644620656967163,
-0.017419226467609406,
0.009137879125773907,
0.04812520742416382,
-0.13459797203540802,
0.014520320110023022,
0.15316778421401978,
0.05033844709396362,
0.12206333130598068,
0.08668199926614761,
-0.032152093946933746,
0.032478272914886475,
0.06915321201086044,
-0.15726125240325928,
-0.11348368972539902,
0.002647005720064044,
-0.0666738972067833,
-0.07283763587474823,
0.05314626917243004,
0.07690896093845367,
-0.07524632662534714,
0.01250340323895216,
-0.006433835253119469,
0.00623489823192358,
-0.06747746467590332,
0.20530791580677032,
0.05628814920783043,
0.04134016111493111,
-0.10373891890048981,
0.07344047725200653,
0.018822547048330307,
-0.08832450956106186,
-0.001273069647140801,
0.091876320540905,
-0.06927075237035751,
-0.02488582767546177,
0.08068958669900894,
0.19198188185691833,
-0.0760442391037941,
-0.022711295634508133,
-0.15018387138843536,
-0.10663120448589325,
0.0696946233510971,
0.1859212964773178,
0.10009218007326126,
-0.0066929589956998825,
-0.05261373147368431,
0.04718083515763283,
-0.11769203096628189,
0.07785926759243011,
0.02375919185578823,
0.08150064945220947,
-0.149414524435997,
0.18238325417041779,
0.011515012942254543,
0.05549228936433792,
-0.0262591689825058,
0.03299567103385925,
-0.11901942640542984,
0.04042082652449608,
-0.11323829740285873,
-0.03656476363539696,
-0.015819627791643143,
0.004982688929885626,
-0.013605719432234764,
-0.06250959634780884,
-0.06261712312698364,
0.004969872068613768,
-0.12757059931755066,
-0.022049032151699066,
0.04597632214426994,
0.0226356890052557,
-0.12621454894542694,
-0.039179492741823196,
0.027957504615187645,
-0.0634961947798729,
0.05570286884903908,
0.0362657755613327,
0.014641729183495045,
0.06589099019765854,
-0.1722085177898407,
-0.021717514842748642,
0.06956680864095688,
-0.00652063125744462,
0.06333942711353302,
-0.03564884886145592,
-0.026136288419365883,
-0.02971090003848076,
0.08751439303159714,
0.01264885812997818,
0.06254559755325317,
-0.13715514540672302,
0.005597017705440521,
-0.03272739425301552,
-0.09302836656570435,
-0.05889085307717323,
0.053976111114025116,
0.06266439706087112,
0.03673491254448891,
0.1626386046409607,
-0.08305823802947998,
0.04478827118873596,
-0.21874047815799713,
-0.016379784792661667,
0.0019039716571569443,
-0.10824413597583771,
-0.08191505819559097,
-0.07234217971563339,
0.08313220739364624,
-0.07516274601221085,
0.1098691001534462,
0.03700150176882744,
0.06477745622396469,
0.031236430630087852,
-0.03266031667590141,
-0.0035936387721449137,
0.03474467247724533,
0.21087779104709625,
0.010864556767046452,
-0.033186934888362885,
0.08911871165037155,
0.07923591136932373,
0.09979009628295898,
0.13633733987808228,
0.22690275311470032,
0.15538907051086426,
-0.025031035766005516,
0.0895581841468811,
0.05219664424657822,
-0.06445913761854172,
-0.17318131029605865,
0.037061262875795364,
-0.052631620317697525,
0.0983063206076622,
-0.06133917346596718,
0.20385372638702393,
0.08528558909893036,
-0.18248873949050903,
0.06686433404684067,
-0.04619051143527031,
-0.1014547124505043,
-0.07912042737007141,
-0.036780521273612976,
-0.06989338994026184,
-0.14807648956775665,
0.025808745995163918,
-0.10289522260427475,
0.043146952986717224,
0.15061259269714355,
0.010142462328076363,
-0.013186412863433361,
0.21496398746967316,
0.03361218795180321,
0.03597399592399597,
0.05797773599624634,
0.01461877766996622,
-0.02994365803897381,
-0.09197406470775604,
-0.060405321419239044,
0.01852886751294136,
-0.02963108941912651,
0.018435228615999222,
-0.06895328313112259,
-0.07708732038736343,
0.026520729064941406,
0.005244007334113121,
-0.09371335804462433,
0.023878421634435654,
0.020549390465021133,
0.09170868992805481,
0.026401499286293983,
0.006316365208476782,
0.016879843547940254,
-0.02882014960050583,
0.24586759507656097,
-0.09307842701673508,
-0.08082079142332077,
-0.08126507699489594,
0.21629635989665985,
0.031283751130104065,
0.00038532153121195734,
0.008487934246659279,
-0.08194472640752792,
0.009446374140679836,
0.22900176048278809,
0.17176134884357452,
-0.13262340426445007,
-0.010672206990420818,
0.00022125232499092817,
0.0014990816125646234,
-0.030007947236299515,
0.11874573677778244,
0.12176579236984253,
0.05226140096783638,
-0.1143321618437767,
-0.053072117269039154,
-0.05319010466337204,
-0.01884022355079651,
-0.026799771934747696,
0.04893719404935837,
0.06688852608203888,
0.021754715591669083,
-0.07005330920219421,
0.07569805532693863,
-0.058277588337659836,
-0.14415089786052704,
0.1064227893948555,
-0.22888155281543732,
-0.1564146727323532,
-0.007347749080508947,
0.12239295989274979,
0.0041984254494309425,
0.06011074781417847,
-0.04187969118356705,
0.0014577142428606749,
0.04891619831323624,
-0.0053495257161557674,
-0.07872701436281204,
-0.1040332019329071,
0.08433252573013306,
-0.1156696304678917,
0.21760204434394836,
-0.05983545631170273,
0.03409330174326897,
0.11338185518980026,
0.06366975605487823,
-0.05050501599907875,
0.05644633620977402,
0.04183590039610863,
-0.12415521591901779,
-0.004520765971392393,
0.12383025139570236,
-0.03830607607960701,
0.05426766723394394,
0.032783906906843185,
-0.13324357569217682,
0.03243369236588478,
-0.056396275758743286,
-0.04118708148598671,
-0.027750611305236816,
-0.050637196749448776,
-0.06415977329015732,
0.1156311109662056,
0.20928682386875153,
-0.008331581018865108,
0.023522816598415375,
-0.08692961931228638,
0.015440301969647408,
0.06543692201375961,
0.04765712842345238,
-0.07815010845661163,
-0.2155001014471054,
0.006871576886624098,
0.07048681378364563,
-0.04139501601457596,
-0.20594075322151184,
-0.11099885404109955,
0.03691459447145462,
-0.054492104798555374,
-0.07167309522628784,
0.09040647745132446,
0.0892757773399353,
0.05621092766523361,
-0.055385008454322815,
-0.10468907654285431,
-0.05855516716837883,
0.17021453380584717,
-0.14701642096042633,
-0.0777096375823021
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_2
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a hedged IoU-computation sketch follows these metrics):
- Loss: 0.6222
- Mean Iou: 0.4086
- Mean Accuracy: 0.6638
- Overall Accuracy: 0.9583
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3408
- Accuracy Undropoff: 0.9869
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.2682
- Iou Undropoff: 0.9576
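
The Mean IoU and per-class IoU figures above follow the usual intersection-over-union definition. The NumPy sketch below illustrates that computation on toy masks; the class ids (0 = dropoff, 1 = undropoff) are illustrative assumptions, and this is not the exact evaluation code behind this card.

```python
# Hedged sketch: per-class IoU and mean IoU from predicted vs. reference masks.
import numpy as np

def per_class_iou(pred: np.ndarray, ref: np.ndarray, num_classes: int) -> np.ndarray:
    ious = np.zeros(num_classes)
    for c in range(num_classes):
        intersection = np.logical_and(pred == c, ref == c).sum()
        union = np.logical_or(pred == c, ref == c).sum()
        ious[c] = intersection / union if union > 0 else float("nan")
    return ious

# Toy 2x3 masks with two classes (0 = dropoff, 1 = undropoff; assumed ids).
pred = np.array([[0, 1, 1], [1, 1, 0]])
ref  = np.array([[0, 1, 1], [1, 1, 1]])
ious = per_class_iou(pred, ref, num_classes=2)
print(ious, np.nanmean(ious))  # per-class IoU and their mean
```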
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1857 | 3.33 | 10 | 1.1215 | 0.0852 | 0.2039 | 0.0995 | nan | 0.3183 | 0.0894 | 0.0 | 0.1663 | 0.0893 |
| 1.1597 | 6.67 | 20 | 1.1165 | 0.1437 | 0.3568 | 0.2326 | nan | 0.4930 | 0.2206 | 0.0 | 0.2108 | 0.2203 |
| 1.1528 | 10.0 | 30 | 1.1040 | 0.1938 | 0.5140 | 0.3464 | nan | 0.6978 | 0.3301 | 0.0 | 0.2517 | 0.3297 |
| 1.0852 | 13.33 | 40 | 1.0896 | 0.2243 | 0.6289 | 0.4219 | nan | 0.8561 | 0.4018 | 0.0 | 0.2717 | 0.4011 |
| 1.0388 | 16.67 | 50 | 1.0511 | 0.2700 | 0.6748 | 0.5498 | nan | 0.8120 | 0.5377 | 0.0 | 0.2744 | 0.5357 |
| 1.0426 | 20.0 | 60 | 1.0089 | 0.3147 | 0.6787 | 0.6640 | nan | 0.6949 | 0.6625 | 0.0 | 0.2857 | 0.6583 |
| 0.9621 | 23.33 | 70 | 0.9921 | 0.3374 | 0.7060 | 0.7392 | nan | 0.6695 | 0.7424 | 0.0 | 0.2760 | 0.7361 |
| 0.925 | 26.67 | 80 | 0.9464 | 0.3591 | 0.7031 | 0.7964 | nan | 0.6007 | 0.8054 | 0.0 | 0.2807 | 0.7965 |
| 0.8872 | 30.0 | 90 | 0.8993 | 0.3858 | 0.7074 | 0.8676 | nan | 0.5316 | 0.8831 | 0.0 | 0.2888 | 0.8686 |
| 0.8751 | 33.33 | 100 | 0.8974 | 0.3896 | 0.7177 | 0.8817 | nan | 0.5379 | 0.8976 | 0.0 | 0.2866 | 0.8822 |
| 0.8571 | 36.67 | 110 | 0.8501 | 0.4028 | 0.7162 | 0.9122 | nan | 0.5011 | 0.9312 | 0.0 | 0.2953 | 0.9131 |
| 0.8866 | 40.0 | 120 | 0.8434 | 0.4072 | 0.7240 | 0.9252 | nan | 0.5032 | 0.9448 | 0.0 | 0.2963 | 0.9254 |
| 0.8127 | 43.33 | 130 | 0.7922 | 0.4142 | 0.7089 | 0.9404 | nan | 0.4548 | 0.9629 | 0.0 | 0.3025 | 0.9402 |
| 0.8062 | 46.67 | 140 | 0.7917 | 0.4123 | 0.7103 | 0.9432 | nan | 0.4548 | 0.9658 | 0.0 | 0.2943 | 0.9425 |
| 0.7512 | 50.0 | 150 | 0.7646 | 0.4142 | 0.7059 | 0.9478 | nan | 0.4404 | 0.9713 | 0.0 | 0.2955 | 0.9470 |
| 0.7554 | 53.33 | 160 | 0.7497 | 0.4161 | 0.7001 | 0.9510 | nan | 0.4248 | 0.9754 | 0.0 | 0.2981 | 0.9502 |
| 0.7468 | 56.67 | 170 | 0.7326 | 0.4177 | 0.6989 | 0.9535 | nan | 0.4195 | 0.9782 | 0.0 | 0.3005 | 0.9527 |
| 0.6506 | 60.0 | 180 | 0.7184 | 0.4173 | 0.6992 | 0.9541 | nan | 0.4196 | 0.9789 | 0.0 | 0.2987 | 0.9533 |
| 0.6761 | 63.33 | 190 | 0.7037 | 0.4142 | 0.6884 | 0.9546 | nan | 0.3964 | 0.9805 | 0.0 | 0.2886 | 0.9539 |
| 0.7245 | 66.67 | 200 | 0.6960 | 0.4122 | 0.6821 | 0.9553 | nan | 0.3824 | 0.9818 | 0.0 | 0.2820 | 0.9545 |
| 0.6514 | 70.0 | 210 | 0.6755 | 0.4104 | 0.6705 | 0.9573 | nan | 0.3559 | 0.9852 | 0.0 | 0.2746 | 0.9566 |
| 0.6433 | 73.33 | 220 | 0.6804 | 0.4180 | 0.6954 | 0.9556 | nan | 0.4100 | 0.9809 | 0.0 | 0.2991 | 0.9548 |
| 0.6686 | 76.67 | 230 | 0.6608 | 0.4107 | 0.6694 | 0.9578 | nan | 0.3531 | 0.9858 | 0.0 | 0.2749 | 0.9571 |
| 0.9091 | 80.0 | 240 | 0.6701 | 0.4160 | 0.6922 | 0.9557 | nan | 0.4031 | 0.9813 | 0.0 | 0.2930 | 0.9549 |
| 0.6346 | 83.33 | 250 | 0.6725 | 0.4166 | 0.6904 | 0.9563 | nan | 0.3987 | 0.9821 | 0.0 | 0.2944 | 0.9555 |
| 0.6303 | 86.67 | 260 | 0.6460 | 0.4090 | 0.6670 | 0.9576 | nan | 0.3481 | 0.9858 | 0.0 | 0.2702 | 0.9569 |
| 0.8923 | 90.0 | 270 | 0.6550 | 0.4131 | 0.6799 | 0.9568 | nan | 0.3760 | 0.9837 | 0.0 | 0.2832 | 0.9561 |
| 0.6334 | 93.33 | 280 | 0.6468 | 0.4100 | 0.6708 | 0.9572 | nan | 0.3566 | 0.9851 | 0.0 | 0.2734 | 0.9565 |
| 0.6242 | 96.67 | 290 | 0.6483 | 0.4106 | 0.6728 | 0.9572 | nan | 0.3607 | 0.9848 | 0.0 | 0.2754 | 0.9565 |
| 0.7401 | 100.0 | 300 | 0.6470 | 0.4129 | 0.6796 | 0.9569 | nan | 0.3755 | 0.9838 | 0.0 | 0.2825 | 0.9561 |
| 0.6148 | 103.33 | 310 | 0.6242 | 0.4081 | 0.6633 | 0.9582 | nan | 0.3397 | 0.9868 | 0.0 | 0.2668 | 0.9575 |
| 0.6345 | 106.67 | 320 | 0.6287 | 0.4093 | 0.6670 | 0.9579 | nan | 0.3478 | 0.9862 | 0.0 | 0.2708 | 0.9573 |
| 0.8711 | 110.0 | 330 | 0.6396 | 0.4130 | 0.6782 | 0.9572 | nan | 0.3720 | 0.9843 | 0.0 | 0.2826 | 0.9565 |
| 0.5812 | 113.33 | 340 | 0.6266 | 0.4101 | 0.6689 | 0.9580 | nan | 0.3517 | 0.9861 | 0.0 | 0.2731 | 0.9573 |
| 0.6503 | 116.67 | 350 | 0.6384 | 0.4130 | 0.6775 | 0.9573 | nan | 0.3706 | 0.9845 | 0.0 | 0.2824 | 0.9566 |
| 0.5923 | 120.0 | 360 | 0.6222 | 0.4086 | 0.6638 | 0.9583 | nan | 0.3408 | 0.9869 | 0.0 | 0.2682 | 0.9576 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_2", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_2 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:23+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_2
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6222
* Mean Iou: 0.4086
* Mean Accuracy: 0.6638
* Overall Accuracy: 0.9583
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3408
* Accuracy Undropoff: 0.9869
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.2682
* Iou Undropoff: 0.9576
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10615549236536026,
0.034549228847026825,
-0.002015189966186881,
0.11596225947141647,
0.1724904179573059,
0.028324618935585022,
0.11595147848129272,
0.1152583584189415,
-0.10936398804187775,
0.03132639452815056,
0.10492962598800659,
0.14256051182746887,
0.015924304723739624,
0.09558205306529999,
-0.01994134671986103,
-0.30676260590553284,
-0.02533341571688652,
0.031449608504772186,
-0.08570817857980728,
0.12532007694244385,
0.0652659684419632,
-0.16204430162906647,
0.09204914420843124,
-0.0032646451145410538,
-0.21997234225273132,
0.015687011182308197,
-0.0048119365237653255,
-0.031015774235129356,
0.1604495495557785,
0.023432228714227676,
0.11681889742612839,
0.009655559435486794,
0.11462084203958511,
-0.20346853137016296,
0.01757376827299595,
0.05628377944231033,
-0.00414658198133111,
0.0672760084271431,
0.06494361907243729,
0.0006157298339530826,
0.15117046236991882,
-0.10607803612947464,
0.06793180853128433,
0.0027102846652269363,
-0.14507928490638733,
-0.21229998767375946,
-0.07366984337568283,
0.022155851125717163,
0.07778377830982208,
0.0958961471915245,
-0.005320859141647816,
0.1115812212228775,
-0.09126447886228561,
0.11303499341011047,
0.26949888467788696,
-0.24267087876796722,
-0.08610785007476807,
0.03877318277955055,
0.0009902161546051502,
0.06702575832605362,
-0.1319909244775772,
0.008605538867413998,
0.033527690917253494,
0.04700677469372749,
0.1176837757229805,
-0.03355768695473671,
-0.09903495758771896,
0.026140781119465828,
-0.13851803541183472,
-0.03379746153950691,
0.05401745066046715,
0.05272345989942551,
-0.020843109115958214,
-0.0316823348402977,
-0.06813337653875351,
-0.1834120899438858,
-0.06605791300535202,
0.013045327737927437,
0.0663665235042572,
-0.05953836068511009,
-0.11440175771713257,
-0.015355829149484634,
-0.11048942804336548,
-0.08516010642051697,
-0.04989388957619667,
0.1279105693101883,
0.0341605469584465,
0.019266396760940552,
-0.03436417877674103,
0.126907080411911,
-0.027227502316236496,
-0.13966375589370728,
0.01740981824696064,
0.03052498772740364,
-0.04339626431465149,
-0.03191659599542618,
-0.0492953397333622,
-0.06416106224060059,
-0.012302081100642681,
0.10844457149505615,
-0.060757625848054886,
0.06745901703834534,
0.03637773171067238,
0.05109861493110657,
-0.1133987307548523,
0.18974751234054565,
-0.06683192402124405,
-0.00808845553547144,
-0.03678957000374794,
0.057970453053712845,
0.004003545735031366,
-0.02246517315506935,
-0.10548961907625198,
0.004951023496687412,
0.07057365030050278,
-0.008169595152139664,
-0.0873979777097702,
0.06975660473108292,
-0.03963833302259445,
-0.011401748284697533,
-0.001129724201746285,
-0.07583677023649216,
0.045580677688121796,
-0.0005305470549501479,
-0.08303745090961456,
-0.029412169009447098,
0.05155031010508537,
0.014563435688614845,
0.01364288292825222,
0.1653718799352646,
-0.08648357540369034,
0.0640522763133049,
-0.11212904006242752,
-0.10033898055553436,
0.00046756278607062995,
-0.08666474372148514,
0.03568391874432564,
-0.07835130393505096,
-0.14968322217464447,
-0.009310989640653133,
0.07248072326183319,
-0.040435757488012314,
0.002608849434182048,
-0.05331356078386307,
-0.09044177830219269,
0.002556283725425601,
-0.008747199550271034,
0.16355140507221222,
-0.06515469402074814,
0.12268954515457153,
0.03723317012190819,
0.0722278356552124,
-0.0642285943031311,
0.03942720219492912,
-0.08507174998521805,
0.020098529756069183,
-0.22180622816085815,
0.04384911432862282,
-0.05074900761246681,
0.06702230125665665,
-0.060603998601436615,
-0.12201602011919022,
0.007716446649283171,
0.0025147623382508755,
0.09148777276277542,
0.10622569918632507,
-0.2242623269557953,
-0.07656379044055939,
0.14966335892677307,
-0.0737258642911911,
-0.09891897439956665,
0.11332890391349792,
-0.06392455846071243,
0.012724139727652073,
0.06091506406664848,
0.19768638908863068,
0.05277741327881813,
-0.13794299960136414,
0.022614842280745506,
-0.015976455062627792,
0.04875566065311432,
-0.02762465365231037,
0.050664547830820084,
0.021967463195323944,
0.08733122050762177,
0.018847769126296043,
-0.06607185304164886,
0.06786906719207764,
-0.12505312263965607,
-0.09724634885787964,
-0.025988688692450523,
-0.0863901749253273,
0.041626978665590286,
0.08933504670858383,
0.06143996864557266,
-0.10433489084243774,
-0.07931610196828842,
0.09148655086755753,
0.07589348405599594,
-0.06914868950843811,
0.040003277361392975,
-0.06468020379543304,
0.043274618685245514,
-0.018098874017596245,
-0.03661308437585831,
-0.1757075935602188,
-0.025232965126633644,
-0.022513115778565407,
0.033382076770067215,
0.02993042767047882,
0.022261200472712517,
0.09150508046150208,
0.08865048736333847,
-0.07155928760766983,
-0.0242618340998888,
-0.06548820436000824,
0.0026501936372369528,
-0.1213735044002533,
-0.2283743917942047,
-0.0439150370657444,
-0.008127299137413502,
0.09022817760705948,
-0.21238380670547485,
0.024485860019922256,
0.022513877600431442,
0.08732793480157852,
0.02531016431748867,
-0.03158751502633095,
-0.053104955703020096,
0.07721326500177383,
-0.01104468572884798,
-0.06582532823085785,
0.06961941719055176,
-0.005414709914475679,
-0.06866198033094406,
-0.05504460260272026,
-0.11220917850732803,
0.16261166334152222,
0.13317357003688812,
-0.1483808159828186,
-0.09354624152183533,
-0.012097951956093311,
-0.06427488476037979,
-0.03289159759879112,
-0.044439736753702164,
0.039308272302150726,
0.1803646683692932,
0.0006982076447457075,
0.13294067978858948,
-0.060217585414648056,
-0.03427502140402794,
0.029750289395451546,
-0.027991799637675285,
0.028016000986099243,
0.1291053742170334,
0.12672372162342072,
-0.061867911368608475,
0.12405013293027878,
0.12256383150815964,
-0.08111409842967987,
0.1482008993625641,
-0.03361791372299194,
-0.0803707093000412,
-0.018443340435624123,
-0.014288871549069881,
-0.007811937481164932,
0.17655158042907715,
-0.15019312500953674,
-0.017194382846355438,
-0.004443110898137093,
0.013975610956549644,
0.01497561763972044,
-0.25116488337516785,
-0.056271594017744064,
0.039784085005521774,
-0.04558330774307251,
-0.009484867565333843,
-0.024876579642295837,
-0.003371994476765394,
0.10455812513828278,
-0.00788955856114626,
-0.07449851930141449,
0.0010893235448747873,
-0.0073651401326060295,
-0.04909661412239075,
0.20769210159778595,
-0.05765533447265625,
-0.11976925283670425,
-0.08961967378854752,
-0.07694155722856522,
-0.03600287809967995,
0.0028745715972036123,
0.058862555772066116,
-0.10816103965044022,
-0.018454590812325478,
-0.05915059149265289,
0.020157339051365852,
0.007346513215452433,
0.03648880124092102,
-0.000805462128482759,
-0.008327646180987358,
0.05636249855160713,
-0.09701560437679291,
-0.010243967175483704,
-0.06555932760238647,
-0.05390266701579094,
0.055303264409303665,
0.05935758352279663,
0.14800769090652466,
0.13557137548923492,
-0.025858767330646515,
0.020304255187511444,
-0.03295598179101944,
0.25886616110801697,
-0.09532655775547028,
-0.02710954286158085,
0.11757361888885498,
-0.011744926683604717,
0.05646217241883278,
0.10706541687250137,
0.08301784843206406,
-0.10911455005407333,
-0.0020018748473376036,
0.06447924673557281,
-0.05241645872592926,
-0.1547957807779312,
-0.015385543927550316,
-0.057469721883535385,
-0.02983432449400425,
0.07727380841970444,
0.02727537229657173,
-0.00297696515917778,
0.05576903373003006,
0.0487079992890358,
0.042529668658971786,
-0.02381223253905773,
0.05036962777376175,
0.08799774199724197,
0.03212311863899231,
0.10913706570863724,
-0.044574204832315445,
-0.06651919335126877,
0.03205360472202301,
0.0030840642284601927,
0.2461978644132614,
-0.015672996640205383,
0.0983850359916687,
0.07424043118953705,
0.16082549095153809,
-0.012169976718723774,
0.047682467848062515,
-0.015183772891759872,
-0.06776874512434006,
-0.01887223683297634,
-0.04426401108503342,
-0.01708643138408661,
0.010107086971402168,
-0.052579861134290695,
0.03890502080321312,
-0.12657855451107025,
0.010336573235690594,
0.06798089295625687,
0.24976937472820282,
0.028126928955316544,
-0.3172444999217987,
-0.06546884030103683,
-0.005950235761702061,
-0.01106688566505909,
-0.009057823568582535,
0.006409906316548586,
0.15235763788223267,
-0.08189140260219574,
0.05576051026582718,
-0.08572468906641006,
0.08636712282896042,
-0.03750228136777878,
0.050741225481033325,
0.07725784927606583,
0.07333455234766006,
-0.004218323156237602,
0.055426910519599915,
-0.28554096817970276,
0.3030182421207428,
0.0015020552091300488,
0.08440767228603363,
-0.06432841718196869,
-0.03191444277763367,
0.03285723924636841,
0.08214209973812103,
0.08709926903247833,
-0.015192214399576187,
-0.021564137190580368,
-0.2152254432439804,
-0.021985042840242386,
0.03075752593576908,
0.1289549320936203,
-0.01781647652387619,
0.10425067692995071,
-0.009896540082991123,
-0.005252031609416008,
0.07402264326810837,
-0.00008255021384684369,
-0.03374115005135536,
-0.09011442214250565,
-0.02642729878425598,
-0.024617644026875496,
-0.04940875992178917,
-0.05911754071712494,
-0.10650285333395004,
-0.11455155164003372,
0.11226416379213333,
0.01849014312028885,
-0.014466832391917706,
-0.12075082212686539,
0.0994294136762619,
0.07886676490306854,
-0.07615210115909576,
0.04168633744120598,
0.03144078329205513,
0.057211484760046005,
0.031465284526348114,
-0.05761649087071419,
0.11721483618021011,
-0.05978243425488472,
-0.15926826000213623,
-0.05653788149356842,
0.09058526903390884,
0.051228027790784836,
0.057427797466516495,
-0.024632278829813004,
0.015576664358377457,
-0.017753923311829567,
-0.09243330359458923,
0.05409664660692215,
-0.04359959065914154,
0.06251207739114761,
0.009358733892440796,
-0.019946042448282242,
0.05175432190299034,
-0.05612831190228462,
-0.012572364881634712,
0.1474774181842804,
0.28656551241874695,
-0.08987992256879807,
0.013361443765461445,
0.015774693340063095,
-0.06567376106977463,
-0.19027814269065857,
0.07880068570375443,
0.0592377744615078,
-0.00010994865442626178,
0.08638576418161392,
-0.16753387451171875,
0.09864048659801483,
0.10382230579853058,
-0.000029601102141896263,
0.11598095297813416,
-0.36847761273384094,
-0.12830577790737152,
0.08047837018966675,
0.19119085371494293,
0.07811441272497177,
-0.15562085807323456,
0.0009542981279082596,
-0.0016975983744487166,
-0.15010304749011993,
0.09188086539506912,
-0.07547491043806076,
0.13615989685058594,
-0.018812861293554306,
0.08753051608800888,
0.016159681603312492,
-0.06127746403217316,
0.12246260046958923,
-0.003324004588648677,
0.1403193324804306,
-0.06856578588485718,
-0.04055264964699745,
0.053805239498615265,
-0.03764966130256653,
-0.012845387682318687,
-0.046697091311216354,
0.02666163630783558,
-0.0606122687458992,
-0.012645470909774303,
-0.10449526458978653,
0.01335451751947403,
-0.038993388414382935,
-0.06767627596855164,
-0.04538749158382416,
0.04343273490667343,
0.045279618352651596,
-0.003775491379201412,
0.15323926508426666,
-0.01086647529155016,
0.11361533403396606,
0.04926855117082596,
0.05752478912472725,
-0.06266096234321594,
-0.10742387920618057,
-0.01799890026450157,
0.008621232584118843,
0.04793907701969147,
-0.134986013174057,
0.015914082527160645,
0.15277710556983948,
0.05040394514799118,
0.12263007462024689,
0.08637941628694534,
-0.03162634000182152,
0.03221898153424263,
0.06903273612260818,
-0.1561650037765503,
-0.1143009141087532,
0.0020892496686428785,
-0.06717460602521896,
-0.07233736664056778,
0.052758075296878815,
0.0781174898147583,
-0.07572031766176224,
0.012066111899912357,
-0.006846295669674873,
0.006371372379362583,
-0.06799457222223282,
0.2046886682510376,
0.05591658502817154,
0.04178239405155182,
-0.10358007997274399,
0.07348375767469406,
0.0193396657705307,
-0.08695991337299347,
-0.0017643030732870102,
0.09107360988855362,
-0.06895945221185684,
-0.02495991438627243,
0.08141166716814041,
0.1924275904893875,
-0.07836577296257019,
-0.022205056622624397,
-0.1504652202129364,
-0.10686353594064713,
0.0698811486363411,
0.1860353946685791,
0.10026531666517258,
-0.006363802123814821,
-0.052319567650556564,
0.04756729677319527,
-0.1175348162651062,
0.07739392668008804,
0.025189166888594627,
0.08100908994674683,
-0.14918823540210724,
0.18353702127933502,
0.011878879740834236,
0.05554385855793953,
-0.02611459232866764,
0.032064203172922134,
-0.1190553829073906,
0.0407894141972065,
-0.11536791175603867,
-0.0356350839138031,
-0.01580701768398285,
0.0044175987131893635,
-0.012738974764943123,
-0.06231805682182312,
-0.06299253553152084,
0.005278932861983776,
-0.1274370700120926,
-0.022549381479620934,
0.04508330300450325,
0.022677045315504074,
-0.1268141269683838,
-0.03937719389796257,
0.027523791417479515,
-0.06361120939254761,
0.05599133297801018,
0.03592256084084511,
0.013751386664807796,
0.06504359096288681,
-0.17301997542381287,
-0.021260187029838562,
0.07010190933942795,
-0.007519071456044912,
0.063301682472229,
-0.03695668280124664,
-0.02605494111776352,
-0.030021706596016884,
0.08762124180793762,
0.01238918025046587,
0.06232844665646553,
-0.13699761033058167,
0.0064633688889443874,
-0.03242215886712074,
-0.09313912689685822,
-0.05863233655691147,
0.05265410244464874,
0.061613235622644424,
0.03642856702208519,
0.16245050728321075,
-0.08279261738061905,
0.04562783241271973,
-0.21917563676834106,
-0.016244666650891304,
0.0020072467159479856,
-0.10660851001739502,
-0.0820775255560875,
-0.0725497156381607,
0.08238223940134048,
-0.07543973624706268,
0.1104784607887268,
0.0371805764734745,
0.06536506861448288,
0.031283941119909286,
-0.03303464129567146,
-0.00439279293641448,
0.03424404188990593,
0.21143515408039093,
0.011951717548072338,
-0.03286031261086464,
0.08853647857904434,
0.07899757474660873,
0.10025423765182495,
0.13628844916820526,
0.22805629670619965,
0.15500251948833466,
-0.02587813511490822,
0.08921726047992706,
0.05112987756729126,
-0.06523573398590088,
-0.17293788492679596,
0.036659400910139084,
-0.05183366313576698,
0.09812317043542862,
-0.06105085462331772,
0.20321357250213623,
0.0862533375620842,
-0.18307773768901825,
0.06642911583185196,
-0.0446329228579998,
-0.10134194046258926,
-0.08134974539279938,
-0.03837316483259201,
-0.07005322724580765,
-0.148284912109375,
0.02581493742763996,
-0.1023474782705307,
0.04352521523833275,
0.1505003571510315,
0.01043553464114666,
-0.012038595974445343,
0.2132015973329544,
0.03352287784218788,
0.03600631654262543,
0.057635724544525146,
0.014849923551082611,
-0.028636125847697258,
-0.09322601556777954,
-0.0602775402367115,
0.017697708681225777,
-0.029522612690925598,
0.018102657049894333,
-0.06850984692573547,
-0.07702819257974625,
0.027126405388116837,
0.004883017390966415,
-0.09389432519674301,
0.023360121995210648,
0.02039204351603985,
0.09017205983400345,
0.027402715757489204,
0.00582334166392684,
0.016931945458054543,
-0.028438054025173187,
0.2468305230140686,
-0.0925527960062027,
-0.08303133398294449,
-0.08242250233888626,
0.2182857096195221,
0.031008996069431305,
0.0010687571484595537,
0.008656688965857029,
-0.08140299469232559,
0.009109694510698318,
0.2299584448337555,
0.17251057922840118,
-0.13191713392734528,
-0.010438171215355396,
0.00030778057407587767,
0.0011968482285737991,
-0.029137061908841133,
0.11790623515844345,
0.12049435824155807,
0.05312984064221382,
-0.11516574770212173,
-0.05246111378073692,
-0.05267689377069473,
-0.018375830724835396,
-0.02728189155459404,
0.050193775445222855,
0.06649964302778244,
0.022625859826803207,
-0.07021383196115494,
0.0754256322979927,
-0.05916743353009224,
-0.1432315558195114,
0.1066882386803627,
-0.22740834951400757,
-0.15608389675617218,
-0.0071709337644279,
0.12180298566818237,
0.003935003653168678,
0.059912532567977905,
-0.04155425354838371,
0.0010170249734073877,
0.05134904012084007,
-0.005171232856810093,
-0.07877297699451447,
-0.10369795560836792,
0.08410995453596115,
-0.1155267208814621,
0.2175501435995102,
-0.059185776859521866,
0.033598095178604126,
0.11292296648025513,
0.06378189474344254,
-0.05049547180533409,
0.05649588257074356,
0.04140991345047951,
-0.12338270246982574,
-0.004858641419559717,
0.12386944890022278,
-0.039172764867544174,
0.05419846251606941,
0.03249385952949524,
-0.13288629055023193,
0.032313015311956406,
-0.054545581340789795,
-0.04130522534251213,
-0.027001667767763138,
-0.05199307203292847,
-0.06356500834226608,
0.11501448601484299,
0.20837725698947906,
-0.008065695874392986,
0.024855956435203552,
-0.0874844640493393,
0.01648087054491043,
0.06626511365175247,
0.049007367342710495,
-0.07855073362588882,
-0.2160595953464508,
0.007378190290182829,
0.07167459279298782,
-0.041663751006126404,
-0.20452256500720978,
-0.11221330612897873,
0.03727894276380539,
-0.053923118859529495,
-0.07129895687103271,
0.09038521349430084,
0.09070087969303131,
0.05689048767089844,
-0.05490684136748314,
-0.10468753427267075,
-0.05812463536858559,
0.17075812816619873,
-0.14632177352905273,
-0.07751011103391647
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_3
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3958
- Mean Iou: 0.6134
- Mean Accuracy: 0.6480
- Overall Accuracy: 0.9627
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3026
- Accuracy Undropoff: 0.9933
- Iou Unlabeled: nan
- Iou Dropoff: 0.2645
- Iou Undropoff: 0.9622
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged learning-rate-schedule sketch follows this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
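
With `lr_scheduler_type: linear` and a warmup ratio of 0.05, the learning rate typically ramps up linearly over the first 5% of optimizer steps and then decays linearly to zero. The sketch below illustrates that schedule; the 360-step total is taken from the training-results table, and reading the listed settings this way is an assumption rather than a documented detail of this run.

```python
# Hedged sketch: linear warmup followed by linear decay, as implied by the listed settings.
def linear_warmup_decay_lr(step: int, base_lr: float = 2e-5,
                           total_steps: int = 360, warmup_ratio: float = 0.05) -> float:
    warmup_steps = int(total_steps * warmup_ratio)               # 18 steps with these values
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)             # linear ramp-up
    remaining = max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / remaining)  # linear decay to zero

for s in (0, 18, 180, 360):
    print(s, linear_warmup_decay_lr(s))
```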
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1015 | 3.33 | 10 | 1.0990 | 0.1184 | 0.4572 | 0.3294 | nan | 0.5975 | 0.3170 | 0.0 | 0.0427 | 0.3124 |
| 1.0478 | 6.67 | 20 | 1.0756 | 0.2121 | 0.7082 | 0.5654 | nan | 0.8648 | 0.5515 | 0.0 | 0.0879 | 0.5482 |
| 1.0451 | 10.0 | 30 | 1.0269 | 0.2846 | 0.8053 | 0.7334 | nan | 0.8842 | 0.7264 | 0.0 | 0.1313 | 0.7226 |
| 0.9095 | 13.33 | 40 | 0.9476 | 0.3360 | 0.7905 | 0.8411 | nan | 0.7349 | 0.8460 | 0.0 | 0.1723 | 0.8358 |
| 0.8091 | 16.67 | 50 | 0.8425 | 0.3858 | 0.7645 | 0.9167 | nan | 0.5975 | 0.9315 | 0.0 | 0.2429 | 0.9145 |
| 0.8094 | 20.0 | 60 | 0.7489 | 0.4090 | 0.7445 | 0.9417 | nan | 0.5281 | 0.9608 | 0.0 | 0.2866 | 0.9403 |
| 0.6945 | 23.33 | 70 | 0.7005 | 0.4148 | 0.7472 | 0.9453 | nan | 0.5298 | 0.9646 | 0.0 | 0.3004 | 0.9440 |
| 0.6337 | 26.67 | 80 | 0.6331 | 0.6267 | 0.7334 | 0.9499 | nan | 0.4958 | 0.9709 | nan | 0.3047 | 0.9488 |
| 0.603 | 30.0 | 90 | 0.5726 | 0.6222 | 0.6935 | 0.9559 | nan | 0.4057 | 0.9814 | nan | 0.2894 | 0.9551 |
| 0.5903 | 33.33 | 100 | 0.5841 | 0.6248 | 0.7151 | 0.9526 | nan | 0.4546 | 0.9757 | nan | 0.2980 | 0.9516 |
| 0.5514 | 36.67 | 110 | 0.5157 | 0.6227 | 0.6818 | 0.9585 | nan | 0.3781 | 0.9854 | nan | 0.2875 | 0.9578 |
| 0.6464 | 40.0 | 120 | 0.5141 | 0.6240 | 0.6889 | 0.9575 | nan | 0.3941 | 0.9836 | nan | 0.2912 | 0.9568 |
| 0.5198 | 43.33 | 130 | 0.4890 | 0.4141 | 0.6762 | 0.9591 | nan | 0.3657 | 0.9866 | 0.0 | 0.2838 | 0.9585 |
| 0.5077 | 46.67 | 140 | 0.4855 | 0.4118 | 0.6719 | 0.9588 | nan | 0.3572 | 0.9866 | 0.0 | 0.2773 | 0.9581 |
| 0.4817 | 50.0 | 150 | 0.4710 | 0.6182 | 0.6733 | 0.9587 | nan | 0.3602 | 0.9864 | nan | 0.2784 | 0.9580 |
| 0.4713 | 53.33 | 160 | 0.4669 | 0.6196 | 0.6683 | 0.9603 | nan | 0.3479 | 0.9887 | nan | 0.2795 | 0.9597 |
| 0.4516 | 56.67 | 170 | 0.4486 | 0.4107 | 0.6586 | 0.9612 | nan | 0.3265 | 0.9906 | 0.0 | 0.2715 | 0.9606 |
| 0.4059 | 60.0 | 180 | 0.4361 | 0.6136 | 0.6548 | 0.9612 | nan | 0.3187 | 0.9909 | nan | 0.2665 | 0.9606 |
| 0.4142 | 63.33 | 190 | 0.4267 | 0.6115 | 0.6503 | 0.9615 | nan | 0.3089 | 0.9917 | nan | 0.2621 | 0.9610 |
| 0.4393 | 66.67 | 200 | 0.4188 | 0.6035 | 0.6354 | 0.9623 | nan | 0.2768 | 0.9940 | nan | 0.2452 | 0.9618 |
| 0.4071 | 70.0 | 210 | 0.4224 | 0.6137 | 0.6528 | 0.9617 | nan | 0.3138 | 0.9917 | nan | 0.2663 | 0.9612 |
| 0.4009 | 73.33 | 220 | 0.4205 | 0.6136 | 0.6540 | 0.9614 | nan | 0.3167 | 0.9912 | nan | 0.2664 | 0.9608 |
| 0.4043 | 76.67 | 230 | 0.4148 | 0.6132 | 0.6514 | 0.9619 | nan | 0.3108 | 0.9920 | nan | 0.2651 | 0.9613 |
| 0.6302 | 80.0 | 240 | 0.4116 | 0.6133 | 0.6513 | 0.9619 | nan | 0.3105 | 0.9921 | nan | 0.2653 | 0.9614 |
| 0.3859 | 83.33 | 250 | 0.4113 | 0.6141 | 0.6543 | 0.9615 | nan | 0.3174 | 0.9913 | nan | 0.2673 | 0.9609 |
| 0.3791 | 86.67 | 260 | 0.4033 | 0.6042 | 0.6361 | 0.9623 | nan | 0.2782 | 0.9940 | nan | 0.2465 | 0.9619 |
| 0.5716 | 90.0 | 270 | 0.4088 | 0.6168 | 0.6575 | 0.9617 | nan | 0.3237 | 0.9913 | nan | 0.2724 | 0.9612 |
| 0.3803 | 93.33 | 280 | 0.4024 | 0.6171 | 0.6565 | 0.9621 | nan | 0.3211 | 0.9918 | nan | 0.2727 | 0.9615 |
| 0.371 | 96.67 | 290 | 0.3979 | 0.6166 | 0.6539 | 0.9625 | nan | 0.3154 | 0.9925 | nan | 0.2713 | 0.9620 |
| 0.3656 | 100.0 | 300 | 0.3992 | 0.6204 | 0.6615 | 0.9621 | nan | 0.3316 | 0.9913 | nan | 0.2793 | 0.9615 |
| 0.3674 | 103.33 | 310 | 0.3930 | 0.6110 | 0.6433 | 0.9630 | nan | 0.2925 | 0.9941 | nan | 0.2594 | 0.9625 |
| 0.378 | 106.67 | 320 | 0.3925 | 0.6124 | 0.6459 | 0.9629 | nan | 0.2981 | 0.9937 | nan | 0.2623 | 0.9624 |
| 0.5766 | 110.0 | 330 | 0.3965 | 0.6192 | 0.6594 | 0.9621 | nan | 0.3272 | 0.9916 | nan | 0.2768 | 0.9616 |
| 0.3513 | 113.33 | 340 | 0.3927 | 0.6161 | 0.6523 | 0.9627 | nan | 0.3118 | 0.9928 | nan | 0.2701 | 0.9622 |
| 0.3731 | 116.67 | 350 | 0.3975 | 0.6200 | 0.6613 | 0.9620 | nan | 0.3315 | 0.9912 | nan | 0.2785 | 0.9614 |
| 0.3489 | 120.0 | 360 | 0.3958 | 0.6134 | 0.6480 | 0.9627 | nan | 0.3026 | 0.9933 | nan | 0.2645 | 0.9622 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_3", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_3 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:25+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_3
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3958
* Mean Iou: 0.6134
* Mean Accuracy: 0.6480
* Overall Accuracy: 0.9627
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3026
* Accuracy Undropoff: 0.9933
* Iou Unlabeled: nan
* Iou Dropoff: 0.2645
* Iou Undropoff: 0.9622
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10591059178113937,
0.036406707018613815,
-0.002041781088337302,
0.11596930027008057,
0.1714659482240677,
0.028318241238594055,
0.11641968786716461,
0.11560021340847015,
-0.10877452790737152,
0.030912652611732483,
0.10430896282196045,
0.14163099229335785,
0.01629962958395481,
0.09532219916582108,
-0.020085278898477554,
-0.30744001269340515,
-0.02539004385471344,
0.03143095597624779,
-0.08681149780750275,
0.12519687414169312,
0.06513086706399918,
-0.16188034415245056,
0.09207220375537872,
-0.0032083075493574142,
-0.22071543335914612,
0.01589730754494667,
-0.004114775452762842,
-0.03122878633439541,
0.16029687225818634,
0.02357633225619793,
0.11608259379863739,
0.010143724270164967,
0.11489138752222061,
-0.20352774858474731,
0.017524294555187225,
0.05539601296186447,
-0.0043795364908874035,
0.06756952404975891,
0.06578890234231949,
0.0016288068145513535,
0.15191982686519623,
-0.1059940904378891,
0.06725753843784332,
0.0020551886409521103,
-0.14449739456176758,
-0.21125079691410065,
-0.07391216605901718,
0.022725099697709084,
0.07813199609518051,
0.09479884803295135,
-0.004944729618728161,
0.11241678893566132,
-0.09166383743286133,
0.1124383807182312,
0.26984864473342896,
-0.24355480074882507,
-0.0860433354973793,
0.037472907453775406,
0.0015846790047362447,
0.06687662750482559,
-0.13214673101902008,
0.008929550647735596,
0.03309425339102745,
0.04682445526123047,
0.11712583154439926,
-0.03376404941082001,
-0.09935519844293594,
0.02651998959481716,
-0.13818451762199402,
-0.03384209796786308,
0.05395243316888809,
0.052767518907785416,
-0.021198933944106102,
-0.03161989524960518,
-0.06856082379817963,
-0.18060626089572906,
-0.06563921272754669,
0.012051342986524105,
0.06658738106489182,
-0.05934448167681694,
-0.11524756997823715,
-0.01712081767618656,
-0.11102376133203506,
-0.08600760251283646,
-0.04974017292261124,
0.12833237648010254,
0.03393315151333809,
0.019822686910629272,
-0.03449401631951332,
0.12721192836761475,
-0.027143806219100952,
-0.13947705924510956,
0.016818301752209663,
0.02969534695148468,
-0.04308154061436653,
-0.03135136887431145,
-0.049116622656583786,
-0.06412902474403381,
-0.012727650813758373,
0.10901389271020889,
-0.060329437255859375,
0.06700942665338516,
0.03596173971891403,
0.0513516440987587,
-0.11418701708316803,
0.18956543505191803,
-0.06669562309980392,
-0.008179159834980965,
-0.0371287576854229,
0.057560257613658905,
0.00442540692165494,
-0.022692175582051277,
-0.10595186054706573,
0.003773114178329706,
0.07032424956560135,
-0.00843335036188364,
-0.08732932060956955,
0.06882526725530624,
-0.039810728281736374,
-0.011808953247964382,
-0.0013848122907802463,
-0.07570318877696991,
0.04621141403913498,
-0.00037272917688824236,
-0.083360955119133,
-0.029091104865074158,
0.05189743638038635,
0.015402769669890404,
0.014119260013103485,
0.16494019329547882,
-0.08581560105085373,
0.06393270194530487,
-0.11169346421957016,
-0.0995871052145958,
0.0009529309463687241,
-0.08649130165576935,
0.03577325493097305,
-0.07892858982086182,
-0.14939658343791962,
-0.008248391561210155,
0.07200909405946732,
-0.04051196202635765,
0.0025248501915484667,
-0.05283036455512047,
-0.09021950513124466,
0.002668651519343257,
-0.008688265457749367,
0.16225291788578033,
-0.06490723043680191,
0.12246591597795486,
0.03718114271759987,
0.07248258590698242,
-0.06544356048107147,
0.03855926916003227,
-0.08517038077116013,
0.02032848261296749,
-0.22071704268455505,
0.04320244863629341,
-0.04951300844550133,
0.06566619873046875,
-0.060653574764728546,
-0.12212702631950378,
0.0072395578026771545,
0.0020949707832187414,
0.09128567576408386,
0.10670078545808792,
-0.22462882101535797,
-0.0764412209391594,
0.1502048522233963,
-0.07392831146717072,
-0.09839023649692535,
0.11358591169118881,
-0.06351952999830246,
0.013043041341006756,
0.0610421858727932,
0.19783717393875122,
0.05328095331788063,
-0.1377512514591217,
0.023082606494426727,
-0.015367588959634304,
0.049398116767406464,
-0.027165455743670464,
0.052065908908843994,
0.02162730135023594,
0.08811955153942108,
0.018704550340771675,
-0.06750258058309555,
0.0677628293633461,
-0.12442849576473236,
-0.0973825752735138,
-0.02574753575026989,
-0.08652006089687347,
0.04356151819229126,
0.08868462592363358,
0.06171506643295288,
-0.10411621630191803,
-0.07881054282188416,
0.09323766827583313,
0.07615721970796585,
-0.06921514123678207,
0.040464919060468674,
-0.0656096488237381,
0.04356173798441887,
-0.01758229359984398,
-0.036990679800510406,
-0.174820214509964,
-0.024669533595442772,
-0.022326266393065453,
0.033140379935503006,
0.029954442754387856,
0.022955268621444702,
0.09186587482690811,
0.08934277296066284,
-0.07147775590419769,
-0.02469678223133087,
-0.06682298332452774,
0.0025075539015233517,
-0.12117345631122589,
-0.22780826687812805,
-0.04391269013285637,
-0.007835295982658863,
0.09110807627439499,
-0.21234571933746338,
0.024492016062140465,
0.02387630194425583,
0.08892504125833511,
0.025946561247110367,
-0.03145389258861542,
-0.053651172667741776,
0.07619374990463257,
-0.010972054675221443,
-0.06552385538816452,
0.06957029551267624,
-0.005011508706957102,
-0.06881695985794067,
-0.05457460507750511,
-0.11221367865800858,
0.1631614714860916,
0.13315021991729736,
-0.1470557600259781,
-0.09176328033208847,
-0.012604077346622944,
-0.0641164630651474,
-0.03307487815618515,
-0.04428306594491005,
0.038533080369234085,
0.17923392355442047,
0.0007990925805643201,
0.13288265466690063,
-0.060702238231897354,
-0.03450474143028259,
0.028916707262396812,
-0.028426647186279297,
0.027541212737560272,
0.12790487706661224,
0.12613020837306976,
-0.06276159733533859,
0.12355757504701614,
0.12424010783433914,
-0.08129942417144775,
0.14877618849277496,
-0.032862648367881775,
-0.08071060478687286,
-0.018813837319612503,
-0.0140802301466465,
-0.007860261015594006,
0.17658783495426178,
-0.14840620756149292,
-0.016807327046990395,
-0.004256419837474823,
0.013818185776472092,
0.014899618923664093,
-0.2512436509132385,
-0.05613352730870247,
0.03986582159996033,
-0.04548269510269165,
-0.009494428522884846,
-0.024595370516180992,
-0.0038176560774445534,
0.10482000559568405,
-0.007377767004072666,
-0.07433445006608963,
0.0012401690473780036,
-0.007497822400182486,
-0.049883194267749786,
0.2067444920539856,
-0.05796069651842117,
-0.12030388414859772,
-0.08982103317975998,
-0.07567491382360458,
-0.03568755090236664,
0.0035153538919985294,
0.05847974866628647,
-0.10734867304563522,
-0.018122749403119087,
-0.05908204987645149,
0.019605427980422974,
0.007381477393209934,
0.03708811476826668,
-0.0007159020169638097,
-0.0075367651879787445,
0.05688917264342308,
-0.09645732492208481,
-0.009906535036861897,
-0.06527683883905411,
-0.053648751229047775,
0.05491875112056732,
0.059240926057100296,
0.14802519977092743,
0.1349872648715973,
-0.02579585649073124,
0.01997743546962738,
-0.032591044902801514,
0.25795280933380127,
-0.09483391046524048,
-0.027106285095214844,
0.11867467314004898,
-0.013237598352134228,
0.05646909028291702,
0.1070772260427475,
0.08230385184288025,
-0.10863186419010162,
-0.0021534692496061325,
0.06373824924230576,
-0.05230473354458809,
-0.15508690476417542,
-0.015050246380269527,
-0.057802021503448486,
-0.02989283762872219,
0.07719714194536209,
0.027087148278951645,
-0.0038367193192243576,
0.05564122274518013,
0.04802238196134567,
0.04267096146941185,
-0.023051314055919647,
0.05042608082294464,
0.08746105432510376,
0.03245827928185463,
0.1090887263417244,
-0.04443301260471344,
-0.06665770709514618,
0.032430410385131836,
0.0034270044416189194,
0.24427002668380737,
-0.01566135697066784,
0.0969657152891159,
0.07355113327503204,
0.16159577667713165,
-0.012215204536914825,
0.04744432121515274,
-0.014521806500852108,
-0.0670519545674324,
-0.01921711303293705,
-0.04441175237298012,
-0.01764151081442833,
0.011651534587144852,
-0.05094140022993088,
0.038839858025312424,
-0.12640246748924255,
0.01144858356565237,
0.06839627027511597,
0.2500508725643158,
0.029192989692091942,
-0.3174043297767639,
-0.06564193964004517,
-0.004925642162561417,
-0.011179664172232151,
-0.008617421612143517,
0.0061102197505533695,
0.1544247567653656,
-0.08261898159980774,
0.0563044436275959,
-0.08604864776134491,
0.08553512394428253,
-0.036846838891506195,
0.05086645483970642,
0.07695168256759644,
0.07221728563308716,
-0.0032013095915317535,
0.05564851686358452,
-0.2847530245780945,
0.30179959535598755,
0.0014939256943762302,
0.08370096236467361,
-0.06397277116775513,
-0.031934574246406555,
0.0330696664750576,
0.08131279051303864,
0.08739541471004486,
-0.01500504370778799,
-0.022958464920520782,
-0.21385733783245087,
-0.022049281746149063,
0.030934831127524376,
0.12974496185779572,
-0.018540095537900925,
0.10405240207910538,
-0.010432885028421879,
-0.005348693113774061,
0.07394920289516449,
-0.0008029399323277175,
-0.03410647064447403,
-0.09015288203954697,
-0.026033926755189896,
-0.02589907869696617,
-0.049240998923778534,
-0.059444017708301544,
-0.10635793209075928,
-0.112869031727314,
0.11381607502698898,
0.016569698229432106,
-0.014265110716223717,
-0.1209530159831047,
0.09916826337575912,
0.07826872169971466,
-0.07615713775157928,
0.04145707190036774,
0.032202497124671936,
0.05965138599276543,
0.03166025131940842,
-0.05790642276406288,
0.11721611022949219,
-0.05998785048723221,
-0.16017937660217285,
-0.056180190294981,
0.09191469103097916,
0.05136469379067421,
0.05751107633113861,
-0.023572638630867004,
0.015342534519731998,
-0.016443448141217232,
-0.0919826477766037,
0.05438494682312012,
-0.04167114198207855,
0.062491871416568756,
0.009315837174654007,
-0.01950502209365368,
0.052434101700782776,
-0.05579143017530441,
-0.012027722783386707,
0.1481226235628128,
0.2866939604282379,
-0.0898970365524292,
0.014696088619530201,
0.014996358193457127,
-0.0657133013010025,
-0.1908927857875824,
0.07903271913528442,
0.059024542570114136,
0.0003236344491597265,
0.08764078468084335,
-0.16696979105472565,
0.09721887856721878,
0.10313831269741058,
-0.00018088877550326288,
0.11356144398450851,
-0.3685600757598877,
-0.1283271759748459,
0.07952839881181717,
0.19072848558425903,
0.07684752345085144,
-0.15424127876758575,
0.000239272034377791,
-0.0026929150335490704,
-0.1495710015296936,
0.09243884682655334,
-0.07779091596603394,
0.134884774684906,
-0.01935102604329586,
0.0875285193324089,
0.01665026694536209,
-0.0608099028468132,
0.12345021218061447,
-0.00443003186956048,
0.1392066925764084,
-0.06876465678215027,
-0.04000568017363548,
0.053715817630290985,
-0.03778083249926567,
-0.013550779782235622,
-0.04685073345899582,
0.02681855857372284,
-0.060428131371736526,
-0.012398848310112953,
-0.10420776903629303,
0.013147079385817051,
-0.03861471638083458,
-0.06746852397918701,
-0.045386649668216705,
0.043276671320199966,
0.04539574310183525,
-0.0031612433958798647,
0.15219777822494507,
-0.010296742431819439,
0.11267101764678955,
0.05012974143028259,
0.059254422783851624,
-0.06078752502799034,
-0.10647626966238022,
-0.01841866783797741,
0.008730622008442879,
0.04805048182606697,
-0.13453665375709534,
0.016422748565673828,
0.1524914801120758,
0.050543252378702164,
0.12226856499910355,
0.08626432716846466,
-0.03241485357284546,
0.0326315201818943,
0.06950843334197998,
-0.15724165737628937,
-0.1142343059182167,
0.0016000923933461308,
-0.06595996022224426,
-0.07277495414018631,
0.05207930505275726,
0.07878997176885605,
-0.07562381774187088,
0.011810862459242344,
-0.006615807767957449,
0.006621858105063438,
-0.06877296417951584,
0.204387828707695,
0.05550229921936989,
0.04115636274218559,
-0.10286250710487366,
0.07344283163547516,
0.018016142770648003,
-0.08592265099287033,
-0.001509714056737721,
0.09055889397859573,
-0.06869915872812271,
-0.02459602802991867,
0.08121182024478912,
0.19019687175750732,
-0.07842886447906494,
-0.023111004382371902,
-0.1500396728515625,
-0.10630752891302109,
0.06907854229211807,
0.18417076766490936,
0.10038203746080399,
-0.006512294057756662,
-0.05241300165653229,
0.04744500666856766,
-0.1165413036942482,
0.07704318314790726,
0.02548481896519661,
0.08105357736349106,
-0.14998933672904968,
0.18408077955245972,
0.011139607056975365,
0.05580996349453926,
-0.026211272925138474,
0.03276738151907921,
-0.11876556277275085,
0.04053877294063568,
-0.11572045087814331,
-0.03540515899658203,
-0.015525621362030506,
0.0042603593319654465,
-0.012991220690310001,
-0.06306900829076767,
-0.06357935816049576,
0.004990594461560249,
-0.12769334018230438,
-0.022463973611593246,
0.04468223452568054,
0.0224167387932539,
-0.12709692120552063,
-0.039472274482250214,
0.028656914830207825,
-0.06417880207300186,
0.05620552971959114,
0.036458492279052734,
0.014389386400580406,
0.06632015854120255,
-0.17220503091812134,
-0.022627338767051697,
0.07081697136163712,
-0.006781075149774551,
0.06253356486558914,
-0.03669579699635506,
-0.02614782750606537,
-0.02933921478688717,
0.08741414546966553,
0.012411856092512608,
0.06344663351774216,
-0.13639067113399506,
0.005803209729492664,
-0.03352953493595123,
-0.09334559738636017,
-0.05869465693831444,
0.052820511162281036,
0.061405763030052185,
0.037500809878110886,
0.16263720393180847,
-0.0828312337398529,
0.04512277618050575,
-0.21882659196853638,
-0.016123522073030472,
0.0019690608605742455,
-0.10597095638513565,
-0.08253838866949081,
-0.07247085124254227,
0.08213773369789124,
-0.07532702386379242,
0.10943756252527237,
0.03624960035085678,
0.06528622657060623,
0.031573258340358734,
-0.0334358774125576,
-0.003328854450955987,
0.03439907357096672,
0.21135494112968445,
0.011939233168959618,
-0.03267694637179375,
0.0883578434586525,
0.07814532518386841,
0.10031159967184067,
0.1352544128894806,
0.22853423655033112,
0.1549084633588791,
-0.02621779963374138,
0.08982422947883606,
0.052107393741607666,
-0.06428554654121399,
-0.17406204342842102,
0.03619799762964249,
-0.05177341774106026,
0.09734605252742767,
-0.061150599271059036,
0.20368969440460205,
0.08749991655349731,
-0.1844257265329361,
0.0654185488820076,
-0.04458949714899063,
-0.10134459286928177,
-0.0817386656999588,
-0.038064587861299515,
-0.07051079720258713,
-0.14805366098880768,
0.025765161961317062,
-0.10210359841585159,
0.04328684136271477,
0.14954687654972076,
0.009621428325772285,
-0.012140410020947456,
0.21240055561065674,
0.0323977954685688,
0.03548222407698631,
0.05726966634392738,
0.01483248732984066,
-0.02861449122428894,
-0.0914607122540474,
-0.060975588858127594,
0.018010711297392845,
-0.028469564393162727,
0.018588028848171234,
-0.06901375949382782,
-0.07651720941066742,
0.026613211259245872,
0.00447197025641799,
-0.09357792139053345,
0.023254575207829475,
0.020589983090758324,
0.09027770161628723,
0.02563934214413166,
0.006544429808855057,
0.016347095370292664,
-0.028143852949142456,
0.24694928526878357,
-0.09354884177446365,
-0.08238384127616882,
-0.08273855596780777,
0.21722160279750824,
0.03111916407942772,
0.001720828702673316,
0.009150993078947067,
-0.08181167393922806,
0.008721832185983658,
0.22852452099323273,
0.17236484587192535,
-0.13280828297138214,
-0.010138200595974922,
-0.00008552828512620181,
0.0014475751668214798,
-0.029257183894515038,
0.1178811565041542,
0.12015199661254883,
0.05330536887049675,
-0.11473056674003601,
-0.05287366360425949,
-0.05217143893241882,
-0.01830500364303589,
-0.026758121326565742,
0.05119778960943222,
0.0652909055352211,
0.022434907034039497,
-0.07022004574537277,
0.07567879557609558,
-0.058515407145023346,
-0.14318467676639557,
0.10563810169696808,
-0.2276010364294052,
-0.15647411346435547,
-0.006186052691191435,
0.12145841866731644,
0.003695683553814888,
0.0602986142039299,
-0.04246499389410019,
0.0008722473867237568,
0.05150967091321945,
-0.004567287862300873,
-0.079880490899086,
-0.10231195390224457,
0.08472739160060883,
-0.11505527794361115,
0.21724817156791687,
-0.05875490605831146,
0.035077739506959915,
0.1125645637512207,
0.06407725065946579,
-0.049887288361787796,
0.055466264486312866,
0.041938669979572296,
-0.12384229898452759,
-0.004967229440808296,
0.12469584494829178,
-0.039480820298194885,
0.05403485894203186,
0.0337943434715271,
-0.13283227384090424,
0.031666163355112076,
-0.054909028112888336,
-0.041300851851701736,
-0.027253326028585434,
-0.05152781680226326,
-0.06417661905288696,
0.1148044764995575,
0.2075551152229309,
-0.008365480229258537,
0.024621890857815742,
-0.08690034598112106,
0.01690840907394886,
0.06606423854827881,
0.04895693063735962,
-0.07765200734138489,
-0.21532177925109863,
0.006690898910164833,
0.0703587755560875,
-0.04111383110284805,
-0.2040543109178543,
-0.1128106415271759,
0.036610137671232224,
-0.0538211353123188,
-0.07152272015810013,
0.09118351340293884,
0.09030050039291382,
0.055649980902671814,
-0.05476338416337967,
-0.10570818930864334,
-0.05848672240972519,
0.1701793223619461,
-0.14630326628684998,
-0.07738175243139267
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_4
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3032
- Mean Iou: 0.6301
- Mean Accuracy: 0.6710
- Overall Accuracy: 0.9634
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3502
- Accuracy Undropoff: 0.9918
- Iou Unlabeled: nan
- Iou Dropoff: 0.2973
- Iou Undropoff: 0.9628
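
For orientation, a minimal inference sketch is given below. It assumes the checkpoint id matches this card and uses the standard SegFormer image processor; the input image path is a placeholder and the label mapping (dropoff vs. undropoff) is inferred from the metric names above.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumption: the model id matches this card; swap in a local path if needed.
checkpoint = "sam1120/dropoff-utcustom-train-SF-RGB-b0_4"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # hypothetical input frame
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```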
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0311 | 3.33 | 10 | 1.0742 | 0.2063 | 0.6373 | 0.5492 | nan | 0.7339 | 0.5406 | 0.0 | 0.0848 | 0.5342 |
| 0.9741 | 6.67 | 20 | 1.0151 | 0.3072 | 0.8067 | 0.7686 | nan | 0.8485 | 0.7649 | 0.0 | 0.1619 | 0.7596 |
| 0.9441 | 10.0 | 30 | 0.9345 | 0.3432 | 0.8327 | 0.8408 | nan | 0.8239 | 0.8416 | 0.0 | 0.1947 | 0.8348 |
| 0.8222 | 13.33 | 40 | 0.8358 | 0.3643 | 0.8236 | 0.8773 | nan | 0.7646 | 0.8825 | 0.0 | 0.2199 | 0.8731 |
| 0.7243 | 16.67 | 50 | 0.7135 | 0.3924 | 0.7838 | 0.9194 | nan | 0.6350 | 0.9325 | 0.0 | 0.2603 | 0.9170 |
| 0.7213 | 20.0 | 60 | 0.6358 | 0.4054 | 0.7528 | 0.9374 | nan | 0.5502 | 0.9554 | 0.0 | 0.2805 | 0.9359 |
| 0.5836 | 23.33 | 70 | 0.5604 | 0.4211 | 0.7412 | 0.9505 | nan | 0.5115 | 0.9708 | 0.0 | 0.3139 | 0.9493 |
| 0.5285 | 26.67 | 80 | 0.5227 | 0.4281 | 0.7570 | 0.9519 | nan | 0.5432 | 0.9708 | 0.0 | 0.3335 | 0.9507 |
| 0.4955 | 30.0 | 90 | 0.4478 | 0.4191 | 0.6945 | 0.9581 | nan | 0.4052 | 0.9837 | 0.0 | 0.2999 | 0.9573 |
| 0.4646 | 33.33 | 100 | 0.4537 | 0.4215 | 0.6998 | 0.9584 | nan | 0.4161 | 0.9835 | 0.0 | 0.3069 | 0.9576 |
| 0.4356 | 36.67 | 110 | 0.4454 | 0.4224 | 0.7105 | 0.9569 | nan | 0.4402 | 0.9808 | 0.0 | 0.3112 | 0.9560 |
| 0.4829 | 40.0 | 120 | 0.4099 | 0.4196 | 0.6901 | 0.9593 | nan | 0.3947 | 0.9854 | 0.0 | 0.3002 | 0.9585 |
| 0.4051 | 43.33 | 130 | 0.3911 | 0.6267 | 0.6784 | 0.9607 | nan | 0.3687 | 0.9881 | nan | 0.2933 | 0.9600 |
| 0.3916 | 46.67 | 140 | 0.3841 | 0.4183 | 0.6897 | 0.9586 | nan | 0.3946 | 0.9847 | 0.0 | 0.2969 | 0.9579 |
| 0.3713 | 50.0 | 150 | 0.3788 | 0.4248 | 0.7001 | 0.9600 | nan | 0.4149 | 0.9853 | 0.0 | 0.3150 | 0.9593 |
| 0.359 | 53.33 | 160 | 0.3719 | 0.6254 | 0.6761 | 0.9607 | nan | 0.3639 | 0.9883 | nan | 0.2908 | 0.9601 |
| 0.3459 | 56.67 | 170 | 0.3610 | 0.6245 | 0.6774 | 0.9601 | nan | 0.3673 | 0.9876 | nan | 0.2895 | 0.9594 |
| 0.3099 | 60.0 | 180 | 0.3455 | 0.6246 | 0.6687 | 0.9620 | nan | 0.3468 | 0.9905 | nan | 0.2879 | 0.9614 |
| 0.3124 | 63.33 | 190 | 0.3436 | 0.6277 | 0.6763 | 0.9615 | nan | 0.3634 | 0.9892 | nan | 0.2946 | 0.9608 |
| 0.3283 | 66.67 | 200 | 0.3344 | 0.6237 | 0.6607 | 0.9634 | nan | 0.3286 | 0.9928 | nan | 0.2845 | 0.9629 |
| 0.2974 | 70.0 | 210 | 0.3412 | 0.6312 | 0.6817 | 0.9616 | nan | 0.3746 | 0.9888 | nan | 0.3014 | 0.9609 |
| 0.3003 | 73.33 | 220 | 0.3322 | 0.6320 | 0.6877 | 0.9607 | nan | 0.3881 | 0.9872 | nan | 0.3041 | 0.9600 |
| 0.2968 | 76.67 | 230 | 0.3289 | 0.6344 | 0.6807 | 0.9628 | nan | 0.3712 | 0.9902 | nan | 0.3066 | 0.9622 |
| 0.4415 | 80.0 | 240 | 0.3333 | 0.6320 | 0.6800 | 0.9622 | nan | 0.3705 | 0.9896 | nan | 0.3024 | 0.9615 |
| 0.2836 | 83.33 | 250 | 0.3271 | 0.6287 | 0.6757 | 0.9619 | nan | 0.3617 | 0.9897 | nan | 0.2960 | 0.9613 |
| 0.2762 | 86.67 | 260 | 0.3203 | 0.6263 | 0.6673 | 0.9629 | nan | 0.3429 | 0.9916 | nan | 0.2903 | 0.9623 |
| 0.3901 | 90.0 | 270 | 0.3186 | 0.6290 | 0.6787 | 0.9614 | nan | 0.3685 | 0.9889 | nan | 0.2971 | 0.9608 |
| 0.2755 | 93.33 | 280 | 0.3086 | 0.6283 | 0.6693 | 0.9631 | nan | 0.3468 | 0.9917 | nan | 0.2940 | 0.9625 |
| 0.2652 | 96.67 | 290 | 0.3099 | 0.6302 | 0.6779 | 0.9620 | nan | 0.3661 | 0.9896 | nan | 0.2991 | 0.9614 |
| 0.2627 | 100.0 | 300 | 0.3056 | 0.6294 | 0.6728 | 0.9627 | nan | 0.3548 | 0.9909 | nan | 0.2966 | 0.9622 |
| 0.2647 | 103.33 | 310 | 0.3036 | 0.6292 | 0.6689 | 0.9635 | nan | 0.3458 | 0.9921 | nan | 0.2954 | 0.9629 |
| 0.2697 | 106.67 | 320 | 0.3043 | 0.6298 | 0.6713 | 0.9632 | nan | 0.3510 | 0.9916 | nan | 0.2970 | 0.9626 |
| 0.3878 | 110.0 | 330 | 0.3037 | 0.6297 | 0.6740 | 0.9626 | nan | 0.3573 | 0.9907 | nan | 0.2973 | 0.9620 |
| 0.2521 | 113.33 | 340 | 0.3013 | 0.6300 | 0.6714 | 0.9633 | nan | 0.3513 | 0.9916 | nan | 0.2974 | 0.9627 |
| 0.2663 | 116.67 | 350 | 0.3060 | 0.6298 | 0.6766 | 0.9621 | nan | 0.3634 | 0.9899 | nan | 0.2981 | 0.9615 |
| 0.2507 | 120.0 | 360 | 0.3032 | 0.6301 | 0.6710 | 0.9634 | nan | 0.3502 | 0.9918 | nan | 0.2973 | 0.9628 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_4", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_4 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:27+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_4
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3032
* Mean Iou: 0.6301
* Mean Accuracy: 0.6710
* Overall Accuracy: 0.9634
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3502
* Accuracy Undropoff: 0.9918
* Iou Unlabeled: nan
* Iou Dropoff: 0.2973
* Iou Undropoff: 0.9628
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10568203032016754,
0.034310873597860336,
-0.0020052888430655003,
0.11610046774148941,
0.17228668928146362,
0.028363551944494247,
0.11617675423622131,
0.11523397266864777,
-0.11099763214588165,
0.03049360029399395,
0.10458611696958542,
0.14283451437950134,
0.01627075858414173,
0.09541396796703339,
-0.01991966739296913,
-0.3067193627357483,
-0.025057120248675346,
0.03138473629951477,
-0.08510126173496246,
0.1253262758255005,
0.06550346314907074,
-0.1619582623243332,
0.09239703416824341,
-0.003596579423174262,
-0.21964815258979797,
0.015999557450413704,
-0.005329017993062735,
-0.03146868944168091,
0.15970826148986816,
0.02341117337346077,
0.11567100137472153,
0.009936717338860035,
0.11415968835353851,
-0.20344634354114532,
0.017679238691926003,
0.055597614496946335,
-0.004258429165929556,
0.06707211583852768,
0.06586253643035889,
0.0014985718298703432,
0.15110868215560913,
-0.10579033195972443,
0.06835782527923584,
0.0022776732221245766,
-0.14473670721054077,
-0.2100660800933838,
-0.07446174323558807,
0.022574272006750107,
0.07840698212385178,
0.09581136703491211,
-0.004728428088128567,
0.1130012720823288,
-0.09175640344619751,
0.1131497174501419,
0.2711831331253052,
-0.2422180026769638,
-0.0866069495677948,
0.03972857445478439,
0.002479933900758624,
0.06742198020219803,
-0.13131248950958252,
0.009007683023810387,
0.032906271517276764,
0.0477973110973835,
0.11796174943447113,
-0.03357322886586189,
-0.09860288351774216,
0.02659316547214985,
-0.13903304934501648,
-0.033298641443252563,
0.05523455888032913,
0.05271794646978378,
-0.02079673297703266,
-0.033362798392772675,
-0.0671728253364563,
-0.18194013833999634,
-0.06600288301706314,
0.01221657358109951,
0.06640360504388809,
-0.060101468116045,
-0.11538924276828766,
-0.016313115134835243,
-0.11030862480401993,
-0.08534715324640274,
-0.049041688442230225,
0.12745332717895508,
0.033965498208999634,
0.019251281395554543,
-0.03417006507515907,
0.12710566818714142,
-0.02634599432349205,
-0.1389843374490738,
0.016804032027721405,
0.02873995341360569,
-0.04225322604179382,
-0.03238746523857117,
-0.04932522401213646,
-0.06459038704633713,
-0.012606538832187653,
0.109328493475914,
-0.060348931699991226,
0.06769931316375732,
0.036480240523815155,
0.050864461809396744,
-0.11365550011396408,
0.18960613012313843,
-0.06753421574831009,
-0.008469657972455025,
-0.03723716363310814,
0.05861883610486984,
0.004194814711809158,
-0.02243168279528618,
-0.10548785328865051,
0.0049391635693609715,
0.07021109014749527,
-0.007454104721546173,
-0.08733302354812622,
0.06983554363250732,
-0.03953494131565094,
-0.012364836409687996,
0.0004808762460015714,
-0.07560261338949203,
0.045141857117414474,
-0.0008761744247749448,
-0.083174929022789,
-0.028285665437579155,
0.051485173404216766,
0.01520170271396637,
0.014083442278206348,
0.16533713042736053,
-0.08578535169363022,
0.06360255926847458,
-0.11210166662931442,
-0.09957769513130188,
0.00037678383523598313,
-0.08606544882059097,
0.035591594874858856,
-0.07912884652614594,
-0.1499486118555069,
-0.008778676390647888,
0.07213620096445084,
-0.04063490405678749,
0.003026821417734027,
-0.05252555385231972,
-0.0903196781873703,
0.0024763643741607666,
-0.008908649906516075,
0.16290920972824097,
-0.0651860460639,
0.12217126041650772,
0.03796898201107979,
0.07190880924463272,
-0.06495226919651031,
0.03906301409006119,
-0.08485566824674606,
0.02004924975335598,
-0.22168710827827454,
0.043134234845638275,
-0.05033760517835617,
0.06720215827226639,
-0.06040476635098457,
-0.12282755225896835,
0.007988572120666504,
0.0022983329836279154,
0.09129443764686584,
0.10614853352308273,
-0.22375546395778656,
-0.07584104686975479,
0.14787186682224274,
-0.07313519716262817,
-0.09966561943292618,
0.11313436180353165,
-0.06325682252645493,
0.011858033947646618,
0.06089091673493385,
0.19724565744400024,
0.052754130214452744,
-0.13853949308395386,
0.02326320856809616,
-0.015579626895487309,
0.0495004840195179,
-0.027750838547945023,
0.051581937819719315,
0.02157057262957096,
0.08754453808069229,
0.019136443734169006,
-0.06767737865447998,
0.06801313161849976,
-0.1243351399898529,
-0.09735546261072159,
-0.025388270616531372,
-0.08679729700088501,
0.04176992550492287,
0.08891592174768448,
0.061011720448732376,
-0.10475653409957886,
-0.0782274454832077,
0.09197782725095749,
0.07598263025283813,
-0.06920557469129562,
0.040173232555389404,
-0.0652095377445221,
0.044158581644296646,
-0.017552589997649193,
-0.03718860074877739,
-0.174536794424057,
-0.02554505690932274,
-0.022517019882798195,
0.03456355258822441,
0.029413586482405663,
0.02168627828359604,
0.09174060821533203,
0.08913848549127579,
-0.07175818085670471,
-0.024819908663630486,
-0.06567797064781189,
0.002686374355107546,
-0.12057187408208847,
-0.22752492129802704,
-0.04412039741873741,
-0.007968965917825699,
0.08866548538208008,
-0.2109946310520172,
0.02440466918051243,
0.02240639366209507,
0.08802163600921631,
0.025336291640996933,
-0.03154107555747032,
-0.05266375467181206,
0.07719859480857849,
-0.01098770834505558,
-0.0658455342054367,
0.07010015100240707,
-0.005256453063338995,
-0.06792617589235306,
-0.05507153645157814,
-0.11310558766126633,
0.16384564340114594,
0.13299763202667236,
-0.1474342942237854,
-0.09256324917078018,
-0.01349611859768629,
-0.06420727074146271,
-0.03276904299855232,
-0.04335766285657883,
0.03838436305522919,
0.17910966277122498,
0.0004090390575584024,
0.13274246454238892,
-0.06086982786655426,
-0.03500383347272873,
0.028675096109509468,
-0.027712905779480934,
0.026376595720648766,
0.127682164311409,
0.12660253047943115,
-0.06166522204875946,
0.12367600202560425,
0.12418309599161148,
-0.08067891746759415,
0.1488448977470398,
-0.03283599391579628,
-0.0800417810678482,
-0.017992252483963966,
-0.014102495275437832,
-0.007623208686709404,
0.1762656271457672,
-0.14839740097522736,
-0.016887729987502098,
-0.004641381558030844,
0.014189463108778,
0.015359614044427872,
-0.24998725950717926,
-0.05676104873418808,
0.039820894598960876,
-0.045143842697143555,
-0.007877825759351254,
-0.023854508996009827,
-0.004334282595664263,
0.10426000505685806,
-0.006898446008563042,
-0.07499462366104126,
0.0014738719910383224,
-0.007587512955069542,
-0.049762170761823654,
0.2071406990289688,
-0.05837604030966759,
-0.12102153897285461,
-0.08949463069438934,
-0.07839836925268173,
-0.037190522998571396,
0.002755607943981886,
0.05840720236301422,
-0.10830553621053696,
-0.01873757690191269,
-0.059783291071653366,
0.019399868324398994,
0.006756237708032131,
0.03639897704124451,
-0.0002487284364178777,
-0.008273708634078503,
0.056638363748788834,
-0.09683648496866226,
-0.010092961601912975,
-0.06549257040023804,
-0.05354372039437294,
0.05472349748015404,
0.05884351581335068,
0.14876136183738708,
0.1346484124660492,
-0.024849001318216324,
0.020248960703611374,
-0.033107731491327286,
0.2594600319862366,
-0.09442488104104996,
-0.027865400537848473,
0.11897386610507965,
-0.01188881229609251,
0.056266117841005325,
0.10684240609407425,
0.08272673189640045,
-0.10943450033664703,
-0.0014664501650258899,
0.06393436342477798,
-0.052600644528865814,
-0.15494459867477417,
-0.015240147709846497,
-0.0580441951751709,
-0.029980240389704704,
0.07653060555458069,
0.027349097654223442,
-0.0027386874426156282,
0.055826976895332336,
0.048215556889772415,
0.0418405756354332,
-0.0229563657194376,
0.05065071955323219,
0.08783916383981705,
0.032351333647966385,
0.10961855947971344,
-0.04406305402517319,
-0.06647936999797821,
0.03213553875684738,
0.003989717457443476,
0.243775874376297,
-0.015748774632811546,
0.09612256288528442,
0.07445143908262253,
0.16092947125434875,
-0.012775292620062828,
0.048031628131866455,
-0.01546204462647438,
-0.06777671724557877,
-0.01905171200633049,
-0.044707585126161575,
-0.01688350923359394,
0.009901583194732666,
-0.05110535770654678,
0.03950786590576172,
-0.12625817954540253,
0.010610254481434822,
0.0685291662812233,
0.25067001581192017,
0.027425672858953476,
-0.3183855414390564,
-0.06481905281543732,
-0.005749670788645744,
-0.01072122436016798,
-0.008127817884087563,
0.006525708362460136,
0.15171122550964355,
-0.08263228833675385,
0.05701193958520889,
-0.08584713190793991,
0.08545848727226257,
-0.03671751916408539,
0.05084918439388275,
0.07855608314275742,
0.07366984337568283,
-0.003862930228933692,
0.05514295771718025,
-0.28678765892982483,
0.30101078748703003,
0.0014081423869356513,
0.08434140682220459,
-0.06383278220891953,
-0.031907349824905396,
0.0332881435751915,
0.08187414705753326,
0.08728303760290146,
-0.015654537826776505,
-0.02024191804230213,
-0.21498098969459534,
-0.02206515520811081,
0.0307297445833683,
0.12941695749759674,
-0.017684750258922577,
0.10396631062030792,
-0.010228889063000679,
-0.005236118566244841,
0.07403544336557388,
0.000029118979000486434,
-0.03504656255245209,
-0.09080992639064789,
-0.025998810306191444,
-0.025739219039678574,
-0.05002981051802635,
-0.05900079756975174,
-0.10674583911895752,
-0.11585163325071335,
0.11343960464000702,
0.01860172115266323,
-0.01447988860309124,
-0.12113673985004425,
0.09700119495391846,
0.07841750234365463,
-0.07535451650619507,
0.041327331215143204,
0.03151172772049904,
0.05825412645936012,
0.03162510320544243,
-0.05837370082736015,
0.1170579344034195,
-0.06111983209848404,
-0.15937337279319763,
-0.05613251402974129,
0.09158256649971008,
0.051379863172769547,
0.05715322121977806,
-0.024234652519226074,
0.015338847413659096,
-0.01662769541144371,
-0.09233999252319336,
0.05401405692100525,
-0.04180057719349861,
0.06286649405956268,
0.009956397116184235,
-0.019821783527731895,
0.05227202549576759,
-0.056310418993234634,
-0.01219782792031765,
0.14759410917758942,
0.2859400510787964,
-0.08906932175159454,
0.014190033078193665,
0.015104818157851696,
-0.06532087177038193,
-0.1903555691242218,
0.07927858829498291,
0.058426305651664734,
-0.00036695797462016344,
0.08805356174707413,
-0.16665244102478027,
0.09746026247739792,
0.10335470736026764,
0.00010310610377928242,
0.11318225413560867,
-0.3682861626148224,
-0.12823761999607086,
0.07982020080089569,
0.19185775518417358,
0.07535739243030548,
-0.15506905317306519,
0.001190862967632711,
-0.002242852933704853,
-0.14933304488658905,
0.09208282828330994,
-0.07634279131889343,
0.13554194569587708,
-0.019897058606147766,
0.08622775971889496,
0.01612669788300991,
-0.06148972734808922,
0.12291643023490906,
-0.004599647130817175,
0.1398308277130127,
-0.06937699019908905,
-0.03888072073459625,
0.05356992781162262,
-0.037859052419662476,
-0.012833229266107082,
-0.047298576682806015,
0.02697151154279709,
-0.06029711663722992,
-0.012618509121239185,
-0.10466840863227844,
0.012930364347994328,
-0.0388936810195446,
-0.06769615411758423,
-0.04555997997522354,
0.04402066767215729,
0.04539979249238968,
-0.00353108998388052,
0.15231339633464813,
-0.010838954709470272,
0.1136644184589386,
0.048744384199380875,
0.06046447157859802,
-0.06204691156744957,
-0.10752249509096146,
-0.01850900426506996,
0.00833151489496231,
0.04758154973387718,
-0.13260920345783234,
0.015375776216387749,
0.15228407084941864,
0.04943443462252617,
0.12221904844045639,
0.08648478984832764,
-0.03246122971177101,
0.03208085149526596,
0.06972602009773254,
-0.15693266689777374,
-0.11227903515100479,
0.0013418900780379772,
-0.06658440828323364,
-0.07313991338014603,
0.052583206444978714,
0.07773560285568237,
-0.07507479190826416,
0.01205501053482294,
-0.0066259209997951984,
0.006797967478632927,
-0.06821276992559433,
0.20487575232982635,
0.05566054582595825,
0.041133441030979156,
-0.10329809039831161,
0.07347003370523453,
0.017363468185067177,
-0.08510752022266388,
-0.0014960605185478926,
0.09201300144195557,
-0.06920777261257172,
-0.02483738213777542,
0.08131858706474304,
0.19032466411590576,
-0.07863075286149979,
-0.02270977571606636,
-0.15031938254833221,
-0.10656477510929108,
0.0695284977555275,
0.1841556876897812,
0.1004321277141571,
-0.007318377494812012,
-0.05236385762691498,
0.047397125512361526,
-0.11662990599870682,
0.07693072408437729,
0.023970171809196472,
0.08136561512947083,
-0.14991366863250732,
0.18174730241298676,
0.011687264777719975,
0.055347736924886703,
-0.026145366951823235,
0.03271676227450371,
-0.1190951019525528,
0.040742430835962296,
-0.11505230516195297,
-0.036935433745384216,
-0.01618773490190506,
0.004669161979109049,
-0.013356480747461319,
-0.062012411653995514,
-0.06344255059957504,
0.005434367805719376,
-0.1271473914384842,
-0.02260776422917843,
0.0452495701611042,
0.02267582342028618,
-0.12671732902526855,
-0.0390266589820385,
0.027462095022201538,
-0.0639568567276001,
0.05608043447136879,
0.035452939569950104,
0.013840378262102604,
0.06580358743667603,
-0.1730583906173706,
-0.02282106690108776,
0.0703514814376831,
-0.007203318178653717,
0.06232687085866928,
-0.0366305448114872,
-0.026281604543328285,
-0.02910427749156952,
0.08718588948249817,
0.01298639364540577,
0.06237444281578064,
-0.13668407499790192,
0.006213736720383167,
-0.03299032524228096,
-0.09234659373760223,
-0.05839794874191284,
0.053035344928503036,
0.06233586370944977,
0.03737616166472435,
0.16254039108753204,
-0.08333157747983932,
0.044984035193920135,
-0.21841612458229065,
-0.0164665337651968,
0.0018833264475688338,
-0.10650001466274261,
-0.08360176533460617,
-0.07185343652963638,
0.0824684351682663,
-0.07519587874412537,
0.11156070232391357,
0.03669772669672966,
0.06490853428840637,
0.031698670238256454,
-0.03261697664856911,
-0.0020481105893850327,
0.03388416767120361,
0.210927352309227,
0.011350899003446102,
-0.032606806606054306,
0.08931900560855865,
0.07884462922811508,
0.1002785712480545,
0.13491471111774445,
0.22851504385471344,
0.15499670803546906,
-0.026361871510744095,
0.08939691632986069,
0.05229821056127548,
-0.06479450315237045,
-0.17212559282779694,
0.03481404483318329,
-0.05090286582708359,
0.09775900095701218,
-0.061160873621702194,
0.20184721052646637,
0.08728577196598053,
-0.18398678302764893,
0.06550253927707672,
-0.0451340451836586,
-0.10112285614013672,
-0.08130311965942383,
-0.03782977536320686,
-0.06964492052793503,
-0.1480770856142044,
0.025917263701558113,
-0.10287830978631973,
0.042506083846092224,
0.15076063573360443,
0.010468785651028156,
-0.012213587760925293,
0.2135465294122696,
0.033625487238168716,
0.035995982587337494,
0.057313088327646255,
0.014271584339439869,
-0.028803816065192223,
-0.09212145209312439,
-0.06047889217734337,
0.017424648627638817,
-0.029513197019696236,
0.018649844452738762,
-0.06916222721338272,
-0.07753527164459229,
0.027022233232855797,
0.0048261722549796104,
-0.09392049908638,
0.02354523167014122,
0.02049095183610916,
0.09011025726795197,
0.02770555019378662,
0.006251844577491283,
0.01624777726829052,
-0.028084497898817062,
0.24562418460845947,
-0.09299247711896896,
-0.08212646096944809,
-0.08260203152894974,
0.21761731803417206,
0.031158512458205223,
0.0013461606577038765,
0.008507671765983105,
-0.08096644282341003,
0.008408990688621998,
0.22978870570659637,
0.1737034022808075,
-0.13272763788700104,
-0.010249785147607327,
-0.00010265110176987946,
0.0016836047871038318,
-0.02940971404314041,
0.11851155757904053,
0.12115024030208588,
0.052382346242666245,
-0.11461067944765091,
-0.052260637283325195,
-0.052412550896406174,
-0.018258582800626755,
-0.026400191709399223,
0.04908032342791557,
0.06722354888916016,
0.022531850263476372,
-0.06953738629817963,
0.0760277658700943,
-0.05932505056262016,
-0.14101992547512054,
0.1048654168844223,
-0.22703136503696442,
-0.15631406009197235,
-0.00711339246481657,
0.12067399173974991,
0.003429584437981248,
0.06026557832956314,
-0.041714973747730255,
0.0010934170568361878,
0.05072740465402603,
-0.004878721199929714,
-0.079104945063591,
-0.10343799740076065,
0.08524784445762634,
-0.11356052756309509,
0.21715715527534485,
-0.05884687602519989,
0.03447098657488823,
0.11297964304685593,
0.0634879395365715,
-0.04997463896870613,
0.05654465779662132,
0.04185411334037781,
-0.12334147840738297,
-0.0044664726592600346,
0.12353415787220001,
-0.03904144465923309,
0.05490783974528313,
0.03297015652060509,
-0.13199153542518616,
0.032621726393699646,
-0.05498421564698219,
-0.0407620370388031,
-0.027256276458501816,
-0.050684232264757156,
-0.0640455111861229,
0.11558375507593155,
0.20812341570854187,
-0.008313491009175777,
0.024143340066075325,
-0.0869632437825203,
0.017221445217728615,
0.06573953479528427,
0.04842563346028328,
-0.07829015702009201,
-0.2159627079963684,
0.006738430354744196,
0.06964126974344254,
-0.04179685190320015,
-0.2051277607679367,
-0.11257582902908325,
0.03696665167808533,
-0.05386553704738617,
-0.07216659933328629,
0.09070315957069397,
0.08999166637659073,
0.05633538216352463,
-0.054740212857723236,
-0.10217511653900146,
-0.059030719101428986,
0.17010535299777985,
-0.14732299745082855,
-0.07690544426441193
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_5
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2543
- Mean Iou: 0.6541
- Mean Accuracy: 0.6937
- Overall Accuracy: 0.9665
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3944
- Accuracy Undropoff: 0.9930
- Iou Unlabeled: nan
- Iou Dropoff: 0.3424
- Iou Undropoff: 0.9659
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
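
The IoU and accuracy columns in the results table below can, in principle, be reproduced with the `mean_iou` metric from the `evaluate` library. The following sketch assumes three classes with the unlabeled id ignored; the exact evaluation script for this card is not published, so the label layout is an assumption.

```python
import numpy as np
import evaluate

# Hypothetical label layout inferred from the metric names in this card.
ID2LABEL = {0: "unlabeled", 1: "dropoff", 2: "undropoff"}

metric = evaluate.load("mean_iou")

def compute_metrics(pred_masks, gt_masks):
    """pred_masks / gt_masks: lists of (H, W) integer class-id arrays."""
    results = metric.compute(
        predictions=[np.asarray(p) for p in pred_masks],
        references=[np.asarray(g) for g in gt_masks],
        num_labels=len(ID2LABEL),
        ignore_index=0,        # assumption: class 0 ("unlabeled") is ignored
        reduce_labels=False,
    )
    # mean_iou returns overall scores plus per-category arrays.
    return {
        "mean_iou": results["mean_iou"],
        "mean_accuracy": results["mean_accuracy"],
        "overall_accuracy": results["overall_accuracy"],
        "iou_dropoff": results["per_category_iou"][1],
        "iou_undropoff": results["per_category_iou"][2],
    }
```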
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.2123 | 3.33 | 10 | 1.1206 | 0.0793 | 0.1898 | 0.1888 | nan | 0.1908 | 0.1887 | 0.0 | 0.0494 | 0.1886 |
| 1.0927 | 6.67 | 20 | 1.0985 | 0.2196 | 0.5875 | 0.5351 | nan | 0.6450 | 0.5300 | 0.0 | 0.1290 | 0.5298 |
| 1.0578 | 10.0 | 30 | 0.9786 | 0.3662 | 0.7562 | 0.8622 | nan | 0.6400 | 0.8725 | 0.0 | 0.2367 | 0.8621 |
| 0.788 | 13.33 | 40 | 0.7940 | 0.4289 | 0.7505 | 0.9456 | nan | 0.5365 | 0.9646 | 0.0 | 0.3398 | 0.9468 |
| 0.6353 | 16.67 | 50 | 0.6206 | 0.4182 | 0.6840 | 0.9583 | nan | 0.3830 | 0.9850 | 0.0 | 0.2966 | 0.9581 |
| 0.6944 | 20.0 | 60 | 0.5213 | 0.4211 | 0.6766 | 0.9623 | nan | 0.3631 | 0.9901 | 0.0 | 0.3014 | 0.9620 |
| 0.5046 | 23.33 | 70 | 0.4765 | 0.4239 | 0.6796 | 0.9634 | nan | 0.3683 | 0.9910 | 0.0 | 0.3090 | 0.9628 |
| 0.4684 | 26.67 | 80 | 0.4643 | 0.3982 | 0.6347 | 0.9598 | nan | 0.2779 | 0.9914 | 0.0 | 0.2352 | 0.9593 |
| 0.4401 | 30.0 | 90 | 0.4483 | 0.4110 | 0.6507 | 0.9632 | nan | 0.3077 | 0.9936 | 0.0 | 0.2703 | 0.9627 |
| 0.4268 | 33.33 | 100 | 0.4366 | 0.6489 | 0.7001 | 0.9638 | nan | 0.4108 | 0.9895 | nan | 0.3347 | 0.9632 |
| 0.3939 | 36.67 | 110 | 0.4027 | 0.4272 | 0.6798 | 0.9650 | nan | 0.3670 | 0.9927 | 0.0 | 0.3171 | 0.9644 |
| 0.4472 | 40.0 | 120 | 0.4159 | 0.6428 | 0.6896 | 0.9638 | nan | 0.3887 | 0.9905 | nan | 0.3225 | 0.9632 |
| 0.3618 | 43.33 | 130 | 0.3765 | 0.6325 | 0.6671 | 0.9650 | nan | 0.3402 | 0.9939 | nan | 0.3006 | 0.9644 |
| 0.3456 | 46.67 | 140 | 0.3671 | 0.6395 | 0.6816 | 0.9643 | nan | 0.3715 | 0.9917 | nan | 0.3153 | 0.9637 |
| 0.3352 | 50.0 | 150 | 0.3572 | 0.6431 | 0.6839 | 0.9650 | nan | 0.3755 | 0.9923 | nan | 0.3218 | 0.9644 |
| 0.3143 | 53.33 | 160 | 0.3451 | 0.6351 | 0.6702 | 0.9651 | nan | 0.3467 | 0.9938 | nan | 0.3056 | 0.9646 |
| 0.3009 | 56.67 | 170 | 0.3357 | 0.6449 | 0.6941 | 0.9636 | nan | 0.3984 | 0.9898 | nan | 0.3267 | 0.9630 |
| 0.2765 | 60.0 | 180 | 0.3188 | 0.6458 | 0.6934 | 0.9641 | nan | 0.3965 | 0.9903 | nan | 0.3282 | 0.9634 |
| 0.2703 | 63.33 | 190 | 0.3179 | 0.6385 | 0.6732 | 0.9656 | nan | 0.3525 | 0.9940 | nan | 0.3119 | 0.9650 |
| 0.2746 | 66.67 | 200 | 0.3067 | 0.6385 | 0.6702 | 0.9662 | nan | 0.3456 | 0.9949 | nan | 0.3113 | 0.9656 |
| 0.2516 | 70.0 | 210 | 0.2992 | 0.6569 | 0.6968 | 0.9667 | nan | 0.4008 | 0.9929 | nan | 0.3477 | 0.9661 |
| 0.2503 | 73.33 | 220 | 0.2999 | 0.6671 | 0.7198 | 0.9659 | nan | 0.4497 | 0.9899 | nan | 0.3689 | 0.9652 |
| 0.2443 | 76.67 | 230 | 0.2816 | 0.6439 | 0.6750 | 0.9668 | nan | 0.3547 | 0.9952 | nan | 0.3215 | 0.9663 |
| 0.3757 | 80.0 | 240 | 0.2907 | 0.6593 | 0.7063 | 0.9659 | nan | 0.4215 | 0.9911 | nan | 0.3535 | 0.9652 |
| 0.2306 | 83.33 | 250 | 0.2767 | 0.6439 | 0.6807 | 0.9658 | nan | 0.3680 | 0.9935 | nan | 0.3226 | 0.9652 |
| 0.2216 | 86.67 | 260 | 0.2792 | 0.6583 | 0.7018 | 0.9663 | nan | 0.4115 | 0.9920 | nan | 0.3509 | 0.9657 |
| 0.3202 | 90.0 | 270 | 0.2681 | 0.6425 | 0.6789 | 0.9657 | nan | 0.3642 | 0.9936 | nan | 0.3199 | 0.9652 |
| 0.2174 | 93.33 | 280 | 0.2633 | 0.6467 | 0.6860 | 0.9657 | nan | 0.3791 | 0.9928 | nan | 0.3284 | 0.9651 |
| 0.2086 | 96.67 | 290 | 0.2658 | 0.6476 | 0.6900 | 0.9652 | nan | 0.3880 | 0.9920 | nan | 0.3306 | 0.9646 |
| 0.2042 | 100.0 | 300 | 0.2651 | 0.6486 | 0.6898 | 0.9655 | nan | 0.3873 | 0.9923 | nan | 0.3322 | 0.9649 |
| 0.2071 | 103.33 | 310 | 0.2597 | 0.6445 | 0.6792 | 0.9662 | nan | 0.3643 | 0.9941 | nan | 0.3233 | 0.9657 |
| 0.2097 | 106.67 | 320 | 0.2596 | 0.6615 | 0.7062 | 0.9665 | nan | 0.4206 | 0.9918 | nan | 0.3571 | 0.9658 |
| 0.3118 | 110.0 | 330 | 0.2557 | 0.6516 | 0.6928 | 0.9659 | nan | 0.3931 | 0.9924 | nan | 0.3380 | 0.9653 |
| 0.1956 | 113.33 | 340 | 0.2517 | 0.6494 | 0.6865 | 0.9664 | nan | 0.3794 | 0.9936 | nan | 0.3331 | 0.9658 |
| 0.201 | 116.67 | 350 | 0.2570 | 0.6573 | 0.7032 | 0.9658 | nan | 0.4151 | 0.9913 | nan | 0.3494 | 0.9651 |
| 0.1952 | 120.0 | 360 | 0.2543 | 0.6541 | 0.6937 | 0.9665 | nan | 0.3944 | 0.9930 | nan | 0.3424 | 0.9659 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_5", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_5 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:28+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_5
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2543
* Mean Iou: 0.6541
* Mean Accuracy: 0.6937
* Overall Accuracy: 0.9665
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3944
* Accuracy Undropoff: 0.9930
* Iou Unlabeled: nan
* Iou Dropoff: 0.3424
* Iou Undropoff: 0.9659
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10649233311414719,
0.03541902080178261,
-0.002008110284805298,
0.11587068438529968,
0.17249611020088196,
0.02851674146950245,
0.11544682830572128,
0.11457392573356628,
-0.10972921550273895,
0.031442414969205856,
0.10455724596977234,
0.1422630250453949,
0.01651071570813656,
0.09646012634038925,
-0.020534632727503777,
-0.30654653906822205,
-0.025566929951310158,
0.031897399574518204,
-0.085641048848629,
0.12503349781036377,
0.0653185248374939,
-0.1630534678697586,
0.09219326823949814,
-0.003469406859949231,
-0.22116395831108093,
0.016427848488092422,
-0.004900987725704908,
-0.031342100352048874,
0.1595771759748459,
0.023765306919813156,
0.11590364575386047,
0.009380001574754715,
0.11358951032161713,
-0.20307078957557678,
0.017782283946871758,
0.05622512102127075,
-0.0043406435288488865,
0.06756623834371567,
0.06620211154222488,
0.0016262467252090573,
0.15138760209083557,
-0.10710099339485168,
0.06757436692714691,
0.00219079852104187,
-0.14455656707286835,
-0.21195274591445923,
-0.07354521006345749,
0.022106915712356567,
0.0785149559378624,
0.09556274861097336,
-0.005519969388842583,
0.11230214685201645,
-0.09189144521951675,
0.11340049654245377,
0.2702406644821167,
-0.24366013705730438,
-0.08655690401792526,
0.03894307091832161,
0.002010022522881627,
0.06756281107664108,
-0.13231836259365082,
0.008765911683440208,
0.03365390747785568,
0.046433039009571075,
0.11845893412828445,
-0.03324741870164871,
-0.09865421056747437,
0.02689301408827305,
-0.13883599638938904,
-0.03352012485265732,
0.05586788058280945,
0.053482215851545334,
-0.020159399136900902,
-0.03282769396901131,
-0.06759633123874664,
-0.18260620534420013,
-0.06595168262720108,
0.011947227641940117,
0.06682480126619339,
-0.060095276683568954,
-0.11392708122730255,
-0.0157875157892704,
-0.11059167236089706,
-0.08550330996513367,
-0.04916469380259514,
0.12676334381103516,
0.03432220220565796,
0.01899923011660576,
-0.03564015403389931,
0.12657660245895386,
-0.028372230008244514,
-0.1392209231853485,
0.01717406138777733,
0.028990289196372032,
-0.043101582676172256,
-0.03224371373653412,
-0.04957793653011322,
-0.06523070484399796,
-0.01213329192250967,
0.10851341485977173,
-0.059460222721099854,
0.06760929524898529,
0.03599978983402252,
0.05042739585042,
-0.11337345093488693,
0.18941862881183624,
-0.06706184148788452,
-0.009321819990873337,
-0.03720396012067795,
0.05813972279429436,
0.0031261146068573,
-0.022109270095825195,
-0.10478325188159943,
0.003962255083024502,
0.0705074816942215,
-0.008451290428638458,
-0.0875946432352066,
0.06957915425300598,
-0.039526890963315964,
-0.0115347383543849,
-0.0019373028771951795,
-0.07579585909843445,
0.045837875455617905,
-0.0002824731345754117,
-0.08342599123716354,
-0.028811132535338402,
0.05099841579794884,
0.014591277576982975,
0.01335973385721445,
0.16602031886577606,
-0.08670076727867126,
0.06328269839286804,
-0.11252683401107788,
-0.1004817932844162,
0.00011045288556488231,
-0.08778220415115356,
0.0363481231033802,
-0.0785403698682785,
-0.14912283420562744,
-0.009210201911628246,
0.07156404107809067,
-0.04026268422603607,
0.002490660175681114,
-0.053450807929039,
-0.09049218147993088,
0.0022155873011797667,
-0.008602695539593697,
0.16406433284282684,
-0.06495238840579987,
0.12248161435127258,
0.03826385736465454,
0.07250885665416718,
-0.0641457810997963,
0.039288248866796494,
-0.08494531363248825,
0.01955431140959263,
-0.2205471694469452,
0.04310363903641701,
-0.050358060747385025,
0.06745324283838272,
-0.061215128749608994,
-0.12230591475963593,
0.007535097189247608,
0.0020529634784907103,
0.09138645976781845,
0.10658048838376999,
-0.22340381145477295,
-0.07632577419281006,
0.15001466870307922,
-0.07304205000400543,
-0.09915706515312195,
0.11305509507656097,
-0.06369689851999283,
0.012146028690040112,
0.06172249838709831,
0.19822841882705688,
0.05336225405335426,
-0.13852377235889435,
0.022840937599539757,
-0.015900328755378723,
0.049486055970191956,
-0.027449751272797585,
0.05092339962720871,
0.022035008296370506,
0.08634582906961441,
0.01932832971215248,
-0.06749307364225388,
0.06813844293355942,
-0.12453159689903259,
-0.0973159596323967,
-0.026223300024867058,
-0.08707922697067261,
0.0425996370613575,
0.08921016752719879,
0.06201290711760521,
-0.1042734906077385,
-0.07783126085996628,
0.09162496030330658,
0.07593740522861481,
-0.06878883391618729,
0.03940066322684288,
-0.06503570079803467,
0.04400739446282387,
-0.018228968605399132,
-0.03635057434439659,
-0.17525562644004822,
-0.024230515584349632,
-0.022172527387738228,
0.032932575792074203,
0.029138708487153053,
0.022093607112765312,
0.0918286070227623,
0.08869968354701996,
-0.07115217298269272,
-0.024756498634815216,
-0.06573759019374847,
0.002546109724789858,
-0.12175681442022324,
-0.22863373160362244,
-0.04354200139641762,
-0.007867901585996151,
0.08889780193567276,
-0.21311669051647186,
0.024510059505701065,
0.022799061611294746,
0.08842430263757706,
0.02574062906205654,
-0.03138883039355278,
-0.05283556878566742,
0.07688137143850327,
-0.010428636334836483,
-0.06497251987457275,
0.06946990638971329,
-0.0056456285528838634,
-0.06745541095733643,
-0.05654985457658768,
-0.11247335374355316,
0.16469940543174744,
0.13317930698394775,
-0.14714597165584564,
-0.09214778244495392,
-0.013312920928001404,
-0.063844233751297,
-0.033247288316488266,
-0.04398446902632713,
0.039430055767297745,
0.1780266910791397,
-0.00010981995001202449,
0.13309699296951294,
-0.060851842164993286,
-0.03459689021110535,
0.029519541189074516,
-0.027206161990761757,
0.027374764904379845,
0.12912613153457642,
0.12653884291648865,
-0.062028925865888596,
0.12387561798095703,
0.12418481707572937,
-0.08114030212163925,
0.1494152545928955,
-0.033264271914958954,
-0.0810229554772377,
-0.018308553844690323,
-0.013985042460262775,
-0.008388935588300228,
0.17675906419754028,
-0.15065397322177887,
-0.017390701919794083,
-0.00424934271723032,
0.013155661523342133,
0.014830991625785828,
-0.24995087087154388,
-0.0558709055185318,
0.039869796484708786,
-0.04461740329861641,
-0.009433131664991379,
-0.02365971729159355,
-0.004057092592120171,
0.10431108623743057,
-0.007682909723371267,
-0.07424376159906387,
0.0007452150457538664,
-0.0073659587651491165,
-0.049111928790807724,
0.2073720246553421,
-0.057999152690172195,
-0.1201261654496193,
-0.08830592781305313,
-0.07677625864744186,
-0.03632787987589836,
0.002308193128556013,
0.0591593012213707,
-0.1085822731256485,
-0.018768738955259323,
-0.05880347266793251,
0.019458457827568054,
0.006906068418174982,
0.037035223096609116,
-0.0010389474919065833,
-0.008258688263595104,
0.05622628331184387,
-0.09682733565568924,
-0.009275443851947784,
-0.06575994193553925,
-0.05308527871966362,
0.054321642965078354,
0.05974826216697693,
0.1487133949995041,
0.13398493826389313,
-0.02516005001962185,
0.02025916799902916,
-0.033261314034461975,
0.2603031396865845,
-0.09539790451526642,
-0.028128890320658684,
0.11783468723297119,
-0.01214310247451067,
0.055856361985206604,
0.10701325535774231,
0.08245833963155746,
-0.10904908925294876,
-0.0019291220232844353,
0.06410636752843857,
-0.05143284797668457,
-0.15570802986621857,
-0.015027196146547794,
-0.058327436447143555,
-0.029958199709653854,
0.077450692653656,
0.02701939083635807,
-0.001855272683314979,
0.05557077005505562,
0.048519883304834366,
0.04288593307137489,
-0.02351878397166729,
0.05035394802689552,
0.08817767351865768,
0.03225383907556534,
0.10927974432706833,
-0.04484448581933975,
-0.06649953126907349,
0.03129003196954727,
0.003169270697981119,
0.2457313984632492,
-0.015184384770691395,
0.09709212183952332,
0.07371558248996735,
0.16226935386657715,
-0.012239483185112476,
0.048111625015735626,
-0.015381221659481525,
-0.06757867336273193,
-0.018642326816916466,
-0.044339071959257126,
-0.016227567568421364,
0.010756530798971653,
-0.051272425800561905,
0.039304398000240326,
-0.1268744319677353,
0.009760280139744282,
0.06842625141143799,
0.2505629360675812,
0.027572879567742348,
-0.3173650801181793,
-0.06491094827651978,
-0.006207557395100594,
-0.011117108166217804,
-0.008987529203295708,
0.006322643253952265,
0.15259242057800293,
-0.0820283591747284,
0.0560346357524395,
-0.08593686670064926,
0.08597972989082336,
-0.03631705045700073,
0.05112973973155022,
0.07715161889791489,
0.0740961879491806,
-0.004238784778863192,
0.055085957050323486,
-0.28788360953330994,
0.30207714438438416,
0.001856112270615995,
0.0847378820180893,
-0.06413950026035309,
-0.032106366008520126,
0.03255250304937363,
0.08181426674127579,
0.0877198874950409,
-0.015706293284893036,
-0.02197972871363163,
-0.21383488178253174,
-0.021518880501389503,
0.03125238046050072,
0.1295892745256424,
-0.018455512821674347,
0.10351228713989258,
-0.009902393445372581,
-0.00588485412299633,
0.0742294192314148,
-0.0014182537561282516,
-0.03541947901248932,
-0.0902821347117424,
-0.025833291932940483,
-0.024863464757800102,
-0.04891938716173172,
-0.05869458615779877,
-0.10644339770078659,
-0.11488485336303711,
0.11102840304374695,
0.015258767642080784,
-0.013820716179907322,
-0.12087500095367432,
0.09911880642175674,
0.07827001810073853,
-0.075639508664608,
0.042421214282512665,
0.031842127442359924,
0.05792653188109398,
0.0320403166115284,
-0.05805044621229172,
0.11795914173126221,
-0.060278356075286865,
-0.15919680893421173,
-0.05573449283838272,
0.09106617420911789,
0.05156053602695465,
0.056767117232084274,
-0.02431696094572544,
0.015305006876587868,
-0.017362257465720177,
-0.09200239181518555,
0.05444863811135292,
-0.04200691729784012,
0.06232433393597603,
0.00978512316942215,
-0.019913334399461746,
0.049309540539979935,
-0.056224022060632706,
-0.012404541485011578,
0.14732937514781952,
0.2859904170036316,
-0.08951869606971741,
0.01367634255439043,
0.01556307915598154,
-0.06635761260986328,
-0.1894458681344986,
0.08042430877685547,
0.05870437249541283,
0.0004593605117406696,
0.08709000796079636,
-0.1676849126815796,
0.09808558970689774,
0.10309728235006332,
0.0002890884061343968,
0.1149357259273529,
-0.368745893239975,
-0.12818658351898193,
0.0806475505232811,
0.19140946865081787,
0.07766779512166977,
-0.15497079491615295,
0.0007545585976913571,
-0.0016887550009414554,
-0.1488846242427826,
0.091683529317379,
-0.07718772441148758,
0.13508258759975433,
-0.019662674516439438,
0.08572512865066528,
0.01652027852833271,
-0.061377961188554764,
0.12247602641582489,
-0.004532550927251577,
0.14073969423770905,
-0.06842464208602905,
-0.03942161798477173,
0.053188927471637726,
-0.03752102702856064,
-0.01285445224493742,
-0.04775197058916092,
0.0267304927110672,
-0.06012487784028053,
-0.012296222150325775,
-0.10488420724868774,
0.013460368849337101,
-0.038980379700660706,
-0.06695487350225449,
-0.04539008066058159,
0.043324876576662064,
0.04547496885061264,
-0.003100699046626687,
0.15248781442642212,
-0.010343621484935284,
0.11362266540527344,
0.04826241731643677,
0.059205636382102966,
-0.060791101306676865,
-0.10804982483386993,
-0.017937231808900833,
0.008780719712376595,
0.04870529845356941,
-0.13476704061031342,
0.015425100922584534,
0.15282666683197021,
0.05006728321313858,
0.12231310456991196,
0.08622467517852783,
-0.03185645863413811,
0.031450528651475906,
0.06939312070608139,
-0.15781576931476593,
-0.1146925538778305,
0.0013899669284000993,
-0.06701695919036865,
-0.07267320156097412,
0.052834637463092804,
0.07759876549243927,
-0.07596214860677719,
0.011788897216320038,
-0.007180044427514076,
0.0064629726111888885,
-0.0678086206316948,
0.20557783544063568,
0.05521059408783913,
0.041427500545978546,
-0.10367696732282639,
0.07370376586914062,
0.018692485988140106,
-0.08784474432468414,
-0.0012133740819990635,
0.0907752588391304,
-0.06913053244352341,
-0.025034740567207336,
0.08155308663845062,
0.19176793098449707,
-0.0766492486000061,
-0.0228650514036417,
-0.14995284378528595,
-0.10607686638832092,
0.06983733922243118,
0.18615254759788513,
0.10015472769737244,
-0.007434334140270948,
-0.051916833966970444,
0.0478600412607193,
-0.11728440970182419,
0.07808661460876465,
0.025074606761336327,
0.08142903447151184,
-0.14993512630462646,
0.18285270035266876,
0.011471637524664402,
0.05453678220510483,
-0.025952065363526344,
0.032693106681108475,
-0.11964283883571625,
0.04019041731953621,
-0.11360186338424683,
-0.03626396507024765,
-0.015379550866782665,
0.004765302408486605,
-0.013333975337445736,
-0.062404002994298935,
-0.06320908665657043,
0.005723265465348959,
-0.12754005193710327,
-0.02226276695728302,
0.04579556733369827,
0.023444000631570816,
-0.12672723829746246,
-0.03899329900741577,
0.027743156999349594,
-0.06368311494588852,
0.055810023099184036,
0.03595525026321411,
0.013894312083721161,
0.06595242023468018,
-0.17330007255077362,
-0.021810321137309074,
0.070060595870018,
-0.007457200437784195,
0.06273166090250015,
-0.03653194010257721,
-0.02568737417459488,
-0.02941904403269291,
0.08768700808286667,
0.012275002896785736,
0.06199142336845398,
-0.13691334426403046,
0.0052282540127635,
-0.032837286591529846,
-0.09186450392007828,
-0.05883370339870453,
0.052420951426029205,
0.06178171560168266,
0.03659208491444588,
0.16250573098659515,
-0.08272150158882141,
0.044772256165742874,
-0.2185482531785965,
-0.016092568635940552,
0.0017781276255846024,
-0.1076044961810112,
-0.08283082395792007,
-0.07246813923120499,
0.08282878249883652,
-0.07605460286140442,
0.1108672097325325,
0.03712504357099533,
0.06563857197761536,
0.03164868801832199,
-0.03307740390300751,
-0.004263680428266525,
0.03464328125119209,
0.2116430103778839,
0.011542430147528648,
-0.03280042111873627,
0.08914298564195633,
0.07961855828762054,
0.10031109303236008,
0.13723793625831604,
0.22827985882759094,
0.1556602418422699,
-0.02679344266653061,
0.08985250443220139,
0.051500122994184494,
-0.06530258804559708,
-0.1727873831987381,
0.03688646852970123,
-0.05145660415291786,
0.09927286952733994,
-0.061768654733896255,
0.20273716747760773,
0.08735809475183487,
-0.18363389372825623,
0.0660974308848381,
-0.04502302035689354,
-0.10125258564949036,
-0.08116330206394196,
-0.03795985132455826,
-0.07032150030136108,
-0.14877022802829742,
0.025798222050070763,
-0.1027073934674263,
0.04333026334643364,
0.14922219514846802,
0.010018367320299149,
-0.012742683291435242,
0.21410177648067474,
0.03483295440673828,
0.036013517528772354,
0.05892600119113922,
0.014170661568641663,
-0.029570305719971657,
-0.09281884878873825,
-0.060259848833084106,
0.017781438305974007,
-0.030329212546348572,
0.018029576167464256,
-0.06978259235620499,
-0.077887162566185,
0.027117611840367317,
0.005052410531789064,
-0.09354209899902344,
0.023705361410975456,
0.020700691267848015,
0.08957210183143616,
0.026962894946336746,
0.00625406252220273,
0.01595999114215374,
-0.028466112911701202,
0.24721521139144897,
-0.09312139451503754,
-0.08100591599941254,
-0.08266246318817139,
0.2153213918209076,
0.03227752074599266,
0.0012870734790340066,
0.008854847401380539,
-0.08128305524587631,
0.009572489187121391,
0.22953803837299347,
0.1718563735485077,
-0.1308571994304657,
-0.010271341539919376,
-0.00013335027324501425,
0.0014996831305325031,
-0.029510721564292908,
0.11788038164377213,
0.12110237777233124,
0.052471183240413666,
-0.11455448716878891,
-0.05134640634059906,
-0.05223130062222481,
-0.0185924731194973,
-0.026401309296488762,
0.049513909965753555,
0.06708652526140213,
0.02181844599545002,
-0.0690699890255928,
0.07623188942670822,
-0.05878157168626785,
-0.1415841430425644,
0.1051250696182251,
-0.22685500979423523,
-0.15652738511562347,
-0.0076664225198328495,
0.1219167709350586,
0.004140776582062244,
0.05953097715973854,
-0.0418783500790596,
0.0007616900838911533,
0.05116748437285423,
-0.004854890983551741,
-0.07880108803510666,
-0.10559022426605225,
0.08460206538438797,
-0.1158004179596901,
0.2171562761068344,
-0.05846324563026428,
0.033730924129486084,
0.11284830421209335,
0.06351487338542938,
-0.05023018643260002,
0.05544724315404892,
0.04167674481868744,
-0.12405151128768921,
-0.004302829038351774,
0.12474225461483002,
-0.039072390645742416,
0.055527038872241974,
0.032598868012428284,
-0.13145694136619568,
0.03233962506055832,
-0.05618817359209061,
-0.040532518178224564,
-0.027373507618904114,
-0.05006802827119827,
-0.06371132284402847,
0.11478167772293091,
0.20842719078063965,
-0.00825675018131733,
0.02441330999135971,
-0.08749818056821823,
0.017284953966736794,
0.06594919413328171,
0.04984837397933006,
-0.07829046994447708,
-0.21566852927207947,
0.007306637242436409,
0.07119728624820709,
-0.04119804874062538,
-0.20360709726810455,
-0.11198395490646362,
0.036078546196222305,
-0.05377151817083359,
-0.0720614343881607,
0.09022299200296402,
0.092189721763134,
0.056102801114320755,
-0.05502096191048622,
-0.10624532401561737,
-0.05849136412143707,
0.1705469787120819,
-0.1468687802553177,
-0.07732440531253815
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_6
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (see the note after this list on how the two means relate to the per-class values):
- Loss: 0.1833
- Mean Iou: 0.6595
- Mean Accuracy: 0.7018
- Overall Accuracy: 0.9666
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4113
- Accuracy Undropoff: 0.9924
- Iou Unlabeled: nan
- Iou Dropoff: 0.3531
- Iou Undropoff: 0.9660
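
The two mean metrics above appear to be simple averages over the two scored classes (dropoff and undropoff), with the unlabeled class excluded as `nan`; this is inferred from the numbers rather than stated anywhere in the card. A quick check:

```python
# Values copied from the evaluation summary above; the averaging rule is
# inferred from the numbers, not documented in the card.
iou_dropoff, iou_undropoff = 0.3531, 0.9660
acc_dropoff, acc_undropoff = 0.4113, 0.9924

mean_iou = (iou_dropoff + iou_undropoff) / 2       # 0.65955 ~ reported 0.6595
mean_accuracy = (acc_dropoff + acc_undropoff) / 2  # 0.70185 ~ reported 0.7018
```

Overall Accuracy, by contrast, is computed over all pixels, which is why it tracks the dominant undropoff class so closely.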
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
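
The fine-tuning data is the `sam1120/dropoff-utcustom-TRAIN` dataset referenced at the top of this card. Assuming it is a regular Hub dataset, it could be inspected as sketched below; the available splits and feature columns are not documented here and should be verified.

```python
from datasets import load_dataset

# Repo id taken from this card; splits and column names are assumptions to check.
ds = load_dataset("sam1120/dropoff-utcustom-TRAIN")
print(ds)
```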
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 7e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
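
The training script itself is not included in the card. As a rough, hedged sketch, the settings above map onto the standard `TrainingArguments` interface roughly as follows; the output directory is a placeholder, and the Trainer's default optimizer may differ in detail from the plain Adam listed above.

```python
from transformers import TrainingArguments

# Sketch only: argument names follow the standard Trainer API; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b0_6",
    learning_rate=7e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```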
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1234 | 3.33 | 10 | 1.0973 | 0.1779 | 0.5629 | 0.3723 | nan | 0.7720 | 0.3538 | 0.0 | 0.1801 | 0.3536 |
| 0.975 | 6.67 | 20 | 1.0260 | 0.3499 | 0.8180 | 0.8109 | nan | 0.8259 | 0.8102 | 0.0 | 0.2428 | 0.8069 |
| 0.9464 | 10.0 | 30 | 0.8130 | 0.4297 | 0.7456 | 0.9502 | nan | 0.5212 | 0.9700 | 0.0 | 0.3385 | 0.9507 |
| 0.6167 | 13.33 | 40 | 0.6001 | 0.4451 | 0.7438 | 0.9617 | nan | 0.5048 | 0.9829 | 0.0 | 0.3743 | 0.9610 |
| 0.4818 | 16.67 | 50 | 0.4629 | 0.4491 | 0.7237 | 0.9666 | nan | 0.4573 | 0.9902 | 0.0 | 0.3815 | 0.9659 |
| 0.4733 | 20.0 | 60 | 0.4379 | 0.4335 | 0.7067 | 0.9630 | nan | 0.4256 | 0.9879 | 0.0 | 0.3383 | 0.9623 |
| 0.3843 | 23.33 | 70 | 0.4073 | 0.4310 | 0.6872 | 0.9652 | nan | 0.3821 | 0.9922 | 0.0 | 0.3283 | 0.9646 |
| 0.3579 | 26.67 | 80 | 0.3731 | 0.4354 | 0.6999 | 0.9651 | nan | 0.4090 | 0.9908 | 0.0 | 0.3418 | 0.9644 |
| 0.3212 | 30.0 | 90 | 0.3655 | 0.6589 | 0.7129 | 0.9647 | nan | 0.4366 | 0.9892 | nan | 0.3538 | 0.9640 |
| 0.3088 | 33.33 | 100 | 0.3306 | 0.6310 | 0.6689 | 0.9641 | nan | 0.3451 | 0.9928 | nan | 0.2985 | 0.9635 |
| 0.2825 | 36.67 | 110 | 0.3253 | 0.6633 | 0.7103 | 0.9663 | nan | 0.4293 | 0.9912 | nan | 0.3609 | 0.9657 |
| 0.3029 | 40.0 | 120 | 0.3130 | 0.6556 | 0.7079 | 0.9645 | nan | 0.4264 | 0.9895 | nan | 0.3474 | 0.9638 |
| 0.252 | 43.33 | 130 | 0.2898 | 0.6703 | 0.7310 | 0.9652 | nan | 0.4740 | 0.9880 | nan | 0.3762 | 0.9645 |
| 0.2395 | 46.67 | 140 | 0.2843 | 0.6587 | 0.7088 | 0.9653 | nan | 0.4275 | 0.9902 | nan | 0.3527 | 0.9646 |
| 0.2308 | 50.0 | 150 | 0.2744 | 0.6481 | 0.6870 | 0.9659 | nan | 0.3811 | 0.9930 | nan | 0.3309 | 0.9653 |
| 0.2125 | 53.33 | 160 | 0.2579 | 0.6555 | 0.7028 | 0.9653 | nan | 0.4147 | 0.9909 | nan | 0.3464 | 0.9647 |
| 0.1953 | 56.67 | 170 | 0.2551 | 0.6549 | 0.7054 | 0.9647 | nan | 0.4209 | 0.9899 | nan | 0.3458 | 0.9641 |
| 0.1743 | 60.0 | 180 | 0.2377 | 0.6393 | 0.6768 | 0.9651 | nan | 0.3605 | 0.9931 | nan | 0.3140 | 0.9646 |
| 0.17 | 63.33 | 190 | 0.2342 | 0.6564 | 0.7002 | 0.9660 | nan | 0.4086 | 0.9918 | nan | 0.3474 | 0.9654 |
| 0.173 | 66.67 | 200 | 0.2296 | 0.6629 | 0.7095 | 0.9664 | nan | 0.4277 | 0.9913 | nan | 0.3602 | 0.9657 |
| 0.1487 | 70.0 | 210 | 0.2152 | 0.6525 | 0.6861 | 0.9673 | nan | 0.3777 | 0.9946 | nan | 0.3383 | 0.9667 |
| 0.1501 | 73.33 | 220 | 0.2179 | 0.6593 | 0.7019 | 0.9665 | nan | 0.4116 | 0.9923 | nan | 0.3527 | 0.9659 |
| 0.1419 | 76.67 | 230 | 0.2055 | 0.6605 | 0.7057 | 0.9663 | nan | 0.4199 | 0.9916 | nan | 0.3553 | 0.9656 |
| 0.2049 | 80.0 | 240 | 0.2060 | 0.6563 | 0.7004 | 0.9659 | nan | 0.4091 | 0.9917 | nan | 0.3472 | 0.9653 |
| 0.1339 | 83.33 | 250 | 0.2006 | 0.6514 | 0.6921 | 0.9660 | nan | 0.3916 | 0.9926 | nan | 0.3375 | 0.9654 |
| 0.1262 | 86.67 | 260 | 0.1963 | 0.6559 | 0.7033 | 0.9654 | nan | 0.4158 | 0.9908 | nan | 0.3470 | 0.9647 |
| 0.179 | 90.0 | 270 | 0.1907 | 0.6549 | 0.6976 | 0.9660 | nan | 0.4032 | 0.9921 | nan | 0.3445 | 0.9654 |
| 0.1216 | 93.33 | 280 | 0.1901 | 0.6561 | 0.6994 | 0.9661 | nan | 0.4068 | 0.9920 | nan | 0.3468 | 0.9655 |
| 0.1144 | 96.67 | 290 | 0.1917 | 0.6565 | 0.7017 | 0.9658 | nan | 0.4119 | 0.9915 | nan | 0.3478 | 0.9652 |
| 0.1095 | 100.0 | 300 | 0.1900 | 0.6621 | 0.7108 | 0.9659 | nan | 0.4309 | 0.9907 | nan | 0.3590 | 0.9653 |
| 0.1144 | 103.33 | 310 | 0.1848 | 0.6595 | 0.6994 | 0.9670 | nan | 0.4058 | 0.9930 | nan | 0.3526 | 0.9664 |
| 0.1144 | 106.67 | 320 | 0.1849 | 0.6585 | 0.7011 | 0.9665 | nan | 0.4100 | 0.9922 | nan | 0.3512 | 0.9658 |
| 0.1574 | 110.0 | 330 | 0.1852 | 0.6592 | 0.7025 | 0.9664 | nan | 0.4128 | 0.9921 | nan | 0.3526 | 0.9658 |
| 0.1085 | 113.33 | 340 | 0.1819 | 0.6595 | 0.7016 | 0.9667 | nan | 0.4108 | 0.9924 | nan | 0.3530 | 0.9660 |
| 0.1099 | 116.67 | 350 | 0.1856 | 0.6602 | 0.7057 | 0.9662 | nan | 0.4198 | 0.9915 | nan | 0.3548 | 0.9656 |
| 0.1048 | 120.0 | 360 | 0.1833 | 0.6595 | 0.7018 | 0.9666 | nan | 0.4113 | 0.9924 | nan | 0.3531 | 0.9660 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_6", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_6 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:29+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_6
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1833
* Mean Iou: 0.6595
* Mean Accuracy: 0.7018
* Overall Accuracy: 0.9666
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.4113
* Accuracy Undropoff: 0.9924
* Iou Unlabeled: nan
* Iou Dropoff: 0.3531
* Iou Undropoff: 0.9660
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10668216645717621,
0.03452228009700775,
-0.001955996034666896,
0.11583433300256729,
0.17118558287620544,
0.028295427560806274,
0.1152672991156578,
0.1157500222325325,
-0.10835208743810654,
0.030481625348329544,
0.10422509908676147,
0.14300428330898285,
0.01632128469645977,
0.09492900222539902,
-0.020201684907078743,
-0.30762186646461487,
-0.025416793301701546,
0.032419878989458084,
-0.08495583385229111,
0.1249079778790474,
0.06513309478759766,
-0.1630345731973648,
0.0914556011557579,
-0.0034688466694206,
-0.2195017784833908,
0.01592184044420719,
-0.004851622506976128,
-0.03168648108839989,
0.16014404594898224,
0.023912712931632996,
0.11566649377346039,
0.00990807730704546,
0.11406362056732178,
-0.20257675647735596,
0.01772981323301792,
0.055963896214962006,
-0.004478502087295055,
0.0677139163017273,
0.0661034882068634,
0.002566681709140539,
0.15228892862796783,
-0.10584034025669098,
0.06737075001001358,
0.002023936016485095,
-0.14480088651180267,
-0.21224600076675415,
-0.07455022633075714,
0.02200588583946228,
0.07823474705219269,
0.09618998318910599,
-0.005171929951757193,
0.11348935961723328,
-0.09242809563875198,
0.11379505693912506,
0.2691955268383026,
-0.24320051074028015,
-0.08647218346595764,
0.039555616676807404,
0.0024925812613219023,
0.06708166003227234,
-0.13343501091003418,
0.008916690945625305,
0.032949063926935196,
0.04611779376864433,
0.1170024573802948,
-0.0330200120806694,
-0.09815603494644165,
0.026776010170578957,
-0.1384267956018448,
-0.033329449594020844,
0.054709844291210175,
0.05374779924750328,
-0.019952058792114258,
-0.033103495836257935,
-0.0683293342590332,
-0.1822303831577301,
-0.06617806106805801,
0.012883052229881287,
0.06644143909215927,
-0.0598788782954216,
-0.11478424817323685,
-0.016088983044028282,
-0.11010082811117172,
-0.08548025786876678,
-0.049414630979299545,
0.1266322284936905,
0.033910397440195084,
0.019352365285158157,
-0.0343242846429348,
0.12678514420986176,
-0.028107071295380592,
-0.13987018167972565,
0.016231566667556763,
0.029924705624580383,
-0.04298516735434532,
-0.03211383894085884,
-0.04920271039009094,
-0.06429684907197952,
-0.012736261822283268,
0.10928399860858917,
-0.06043541058897972,
0.06855661422014236,
0.03562033548951149,
0.050958555191755295,
-0.11495985835790634,
0.19116653501987457,
-0.06670752167701721,
-0.009020998142659664,
-0.03657402843236923,
0.058287110179662704,
0.0038287658244371414,
-0.022437362000346184,
-0.1049780398607254,
0.0038731968961656094,
0.07038122415542603,
-0.008917720057070255,
-0.08775182068347931,
0.06960032880306244,
-0.03893955796957016,
-0.011626145802438259,
0.002019679406657815,
-0.07545624673366547,
0.0458497516810894,
-0.000853220175486058,
-0.0840488150715828,
-0.02933313138782978,
0.0505036786198616,
0.015026706270873547,
0.014244863763451576,
0.1650863140821457,
-0.08640444278717041,
0.06335426867008209,
-0.11297277361154556,
-0.09956681728363037,
0.0006875023245811462,
-0.0864410474896431,
0.03629220649600029,
-0.07794832438230515,
-0.15083736181259155,
-0.009255703538656235,
0.07139384001493454,
-0.040475111454725266,
0.0031775925308465958,
-0.052167125046253204,
-0.09037140011787415,
0.0027780921664088964,
-0.008584096096456051,
0.16279540956020355,
-0.06506035476922989,
0.12273271381855011,
0.03824048116803169,
0.07206552475690842,
-0.0660700872540474,
0.03880484029650688,
-0.08548253774642944,
0.019758224487304688,
-0.2227982133626938,
0.04275309294462204,
-0.05086992308497429,
0.067560113966465,
-0.06008364260196686,
-0.1226377859711647,
0.0068888296373188496,
0.001753312535583973,
0.09199848026037216,
0.10535812377929688,
-0.22365745902061462,
-0.07566362619400024,
0.14855562150478363,
-0.07291552424430847,
-0.09930848330259323,
0.11356339603662491,
-0.06325499713420868,
0.011372840031981468,
0.060934897512197495,
0.19969454407691956,
0.053332164883613586,
-0.1365780234336853,
0.023041291162371635,
-0.01556539535522461,
0.049858953803777695,
-0.027456358075141907,
0.05094395950436592,
0.022096246480941772,
0.08760178089141846,
0.01927143521606922,
-0.06491079181432724,
0.06751275062561035,
-0.124356709420681,
-0.09698860347270966,
-0.025668298825621605,
-0.08645659685134888,
0.04229143261909485,
0.09025189280509949,
0.06171484664082527,
-0.10474032163619995,
-0.07818260788917542,
0.09154386818408966,
0.07635575532913208,
-0.06922173500061035,
0.039830777794122696,
-0.06529497355222702,
0.044472988694906235,
-0.016702212393283844,
-0.03679250180721283,
-0.17506472766399384,
-0.024978457018733025,
-0.021771812811493874,
0.034686796367168427,
0.029212797060608864,
0.021840563043951988,
0.0918044000864029,
0.08847380429506302,
-0.07230006903409958,
-0.024881964549422264,
-0.06549447774887085,
0.0022977262269705534,
-0.12209254503250122,
-0.22743944823741913,
-0.04419347271323204,
-0.007545523811131716,
0.09069961309432983,
-0.21444758772850037,
0.024340828880667686,
0.022555870935320854,
0.08911862969398499,
0.025102872401475906,
-0.031155860051512718,
-0.05332503095269203,
0.07699013501405716,
-0.010134764015674591,
-0.06550844758749008,
0.06988680362701416,
-0.0056714341044425964,
-0.06817638128995895,
-0.056383468210697174,
-0.11230379343032837,
0.16403992474079132,
0.1331801563501358,
-0.14674654603004456,
-0.0918625071644783,
-0.013985667377710342,
-0.0640120878815651,
-0.0333138071000576,
-0.04305218160152435,
0.03874405845999718,
0.1788344532251358,
-0.0002677168231457472,
0.13266460597515106,
-0.061546459794044495,
-0.0350022092461586,
0.02906111441552639,
-0.02722392976284027,
0.028192197903990746,
0.12883315980434418,
0.12493351101875305,
-0.0642624944448471,
0.12368166446685791,
0.12396900355815887,
-0.08083552122116089,
0.15006554126739502,
-0.033385615795850754,
-0.08015941828489304,
-0.017216235399246216,
-0.014320533722639084,
-0.008187088184058666,
0.17701038718223572,
-0.15018023550510406,
-0.016390357166528702,
-0.004461516160517931,
0.013593021780252457,
0.015381401404738426,
-0.25050392746925354,
-0.056337445974349976,
0.03881708160042763,
-0.044651735574007034,
-0.00926507730036974,
-0.023117568343877792,
-0.003986579366028309,
0.10449608415365219,
-0.007289177272468805,
-0.0753358006477356,
0.0013512653531506658,
-0.0073913466185331345,
-0.04895378276705742,
0.2074557989835739,
-0.05837719142436981,
-0.11970783025026321,
-0.0895998626947403,
-0.07626514136791229,
-0.03666219487786293,
0.0025776957627385855,
0.05860929936170578,
-0.10751813650131226,
-0.018485598266124725,
-0.059262312948703766,
0.017765309661626816,
0.007086309604346752,
0.03582917898893356,
-0.0010089564602822065,
-0.008640148676931858,
0.05668450519442558,
-0.0971149355173111,
-0.009685374796390533,
-0.06608551740646362,
-0.053390663117170334,
0.05445463955402374,
0.05849944055080414,
0.14778108894824982,
0.1347694993019104,
-0.02583364024758339,
0.019826622679829597,
-0.03356350585818291,
0.25934672355651855,
-0.09626150876283646,
-0.026832615956664085,
0.1185842901468277,
-0.01127053052186966,
0.05623108148574829,
0.1067696362733841,
0.0827367901802063,
-0.10919398069381714,
-0.0018074919935315847,
0.063935786485672,
-0.052085477858781815,
-0.1553507000207901,
-0.01570393145084381,
-0.058498919010162354,
-0.02952396869659424,
0.07645037770271301,
0.027644522488117218,
-0.0005509888869710267,
0.055298950523138046,
0.04738091304898262,
0.04222806170582771,
-0.022977497428655624,
0.050678979605436325,
0.08906951546669006,
0.03248363360762596,
0.11002167314291,
-0.04455726593732834,
-0.06593197584152222,
0.031019434332847595,
0.0027367109432816505,
0.2437392920255661,
-0.0158351082354784,
0.0965699777007103,
0.07310114800930023,
0.1629893034696579,
-0.011679907329380512,
0.048217449337244034,
-0.015909338369965553,
-0.06845609843730927,
-0.018871212378144264,
-0.04426874965429306,
-0.017078138887882233,
0.010171747766435146,
-0.051949501037597656,
0.03927450627088547,
-0.12558074295520782,
0.009040136821568012,
0.06802196800708771,
0.24910080432891846,
0.028212234377861023,
-0.3177371621131897,
-0.06550715863704681,
-0.006419237703084946,
-0.011601245030760765,
-0.009368089959025383,
0.006290399003773928,
0.15154822170734406,
-0.08207935094833374,
0.056600648909807205,
-0.0854916200041771,
0.08616773039102554,
-0.03596021234989166,
0.05108534172177315,
0.07746446132659912,
0.07374673336744308,
-0.003997485619038343,
0.05639810487627983,
-0.2854348123073578,
0.3021954894065857,
0.0018290458247065544,
0.0844842866063118,
-0.06451708823442459,
-0.03245364874601364,
0.03255452960729599,
0.08218684792518616,
0.0862865075469017,
-0.015558782033622265,
-0.020373491570353508,
-0.2145521491765976,
-0.02151002548635006,
0.030844321474432945,
0.12958677113056183,
-0.017364023253321648,
0.10302693396806717,
-0.009590191766619682,
-0.006017274688929319,
0.07404810935258865,
-0.00078987778397277,
-0.03486090898513794,
-0.0904926061630249,
-0.026160476729273796,
-0.024492409080266953,
-0.047999307513237,
-0.058972690254449844,
-0.10624150931835175,
-0.11552219837903976,
0.11203731596469879,
0.018825067207217216,
-0.01317568589001894,
-0.12102724611759186,
0.09884462505578995,
0.07937050610780716,
-0.07561321556568146,
0.04199516028165817,
0.03172530233860016,
0.05749772489070892,
0.032229747623205185,
-0.057488393038511276,
0.11801213026046753,
-0.05990540608763695,
-0.16055525839328766,
-0.055658452212810516,
0.09156057238578796,
0.05139395594596863,
0.057335928082466125,
-0.023601356893777847,
0.01574804075062275,
-0.01702164299786091,
-0.09211086481809616,
0.05349208042025566,
-0.04241287335753441,
0.06254646927118301,
0.009264731779694557,
-0.018936846405267715,
0.053364790976047516,
-0.05597256124019623,
-0.012699726037681103,
0.14638306200504303,
0.28513506054878235,
-0.08905017375946045,
0.01342394296079874,
0.015964359045028687,
-0.06538612395524979,
-0.19000448286533356,
0.07955155521631241,
0.05859869718551636,
-0.00005751884600613266,
0.08637966960668564,
-0.1663971096277237,
0.09852157533168793,
0.10411889851093292,
-0.000057413955801166594,
0.11399456858634949,
-0.3669925630092621,
-0.12837199866771698,
0.07960257679224014,
0.19012223184108734,
0.07559841871261597,
-0.15421649813652039,
0.0009688400314189494,
-0.002033200114965439,
-0.14825552701950073,
0.09130217880010605,
-0.0780077576637268,
0.1355607956647873,
-0.020395882427692413,
0.08504746109247208,
0.01619894802570343,
-0.061725325882434845,
0.12291354686021805,
-0.00456562265753746,
0.1403384506702423,
-0.06864003837108612,
-0.03862601891160011,
0.05441322922706604,
-0.03759187087416649,
-0.01317577250301838,
-0.04640738293528557,
0.02732822485268116,
-0.06128140911459923,
-0.012215357273817062,
-0.10521700978279114,
0.013581590726971626,
-0.039001017808914185,
-0.06668572127819061,
-0.046039871871471405,
0.04357583820819855,
0.04460809752345085,
-0.0033130552619695663,
0.15479449927806854,
-0.010639186948537827,
0.11508552730083466,
0.05015096440911293,
0.05995767191052437,
-0.06382566690444946,
-0.10589136928319931,
-0.01838368922472,
0.009097598493099213,
0.04811302199959755,
-0.13342462480068207,
0.015326726250350475,
0.1531793177127838,
0.05039472132921219,
0.12185995280742645,
0.0865125060081482,
-0.032498639076948166,
0.03175705671310425,
0.06932219117879868,
-0.1568727344274521,
-0.11310319602489471,
0.0011172330705448985,
-0.06492513418197632,
-0.07257074117660522,
0.05333353206515312,
0.07690292596817017,
-0.07568442821502686,
0.01218703668564558,
-0.006318105850368738,
0.0053465478122234344,
-0.06801267713308334,
0.20541970431804657,
0.05531667172908783,
0.041622113436460495,
-0.1036776453256607,
0.07279457151889801,
0.018411224707961082,
-0.08750921487808228,
-0.002308572642505169,
0.09208808839321136,
-0.06925085932016373,
-0.025191674008965492,
0.08091219514608383,
0.19003531336784363,
-0.07804623991250992,
-0.02309727482497692,
-0.14961956441402435,
-0.105857715010643,
0.06931725889444351,
0.186893031001091,
0.10050079226493835,
-0.00768991420045495,
-0.05220440402626991,
0.047207582741975784,
-0.11697019636631012,
0.0768924430012703,
0.024076569825410843,
0.08142545819282532,
-0.14971458911895752,
0.1811119168996811,
0.011487024836242199,
0.05469726026058197,
-0.02626340091228485,
0.03217598795890808,
-0.11936581134796143,
0.04047144949436188,
-0.11329479515552521,
-0.036771953105926514,
-0.015711015090346336,
0.004845113959163427,
-0.013893096707761288,
-0.06239005923271179,
-0.06276945769786835,
0.005082677584141493,
-0.12740683555603027,
-0.022364625707268715,
0.045550212264060974,
0.022588133811950684,
-0.12629267573356628,
-0.03886046260595322,
0.02745879255235195,
-0.06322000175714493,
0.05577126145362854,
0.035765569657087326,
0.014542865566909313,
0.06629293411970139,
-0.17224864661693573,
-0.02202991209924221,
0.06969162821769714,
-0.006774276029318571,
0.06285475939512253,
-0.035233449190855026,
-0.026241319254040718,
-0.0293502826243639,
0.08732601255178452,
0.01266384869813919,
0.061062224209308624,
-0.13707832992076874,
0.005039793439209461,
-0.03328782320022583,
-0.09327297657728195,
-0.05868352949619293,
0.05333911255002022,
0.061394982039928436,
0.03646638244390488,
0.16184620559215546,
-0.08299855887889862,
0.044320475310087204,
-0.21923847496509552,
-0.0163169763982296,
0.0020274538546800613,
-0.10760428756475449,
-0.08197945356369019,
-0.07211071252822876,
0.08307691663503647,
-0.07504528015851974,
0.11072805523872375,
0.036703549325466156,
0.06599593907594681,
0.031265392899513245,
-0.03308309614658356,
-0.0029485945124179125,
0.03387054428458214,
0.21230904757976532,
0.012027048505842686,
-0.032397668808698654,
0.09026296436786652,
0.07964222878217697,
0.09965947270393372,
0.1381053477525711,
0.2267380952835083,
0.1549333930015564,
-0.02653668262064457,
0.089117132127285,
0.05123714357614517,
-0.06471620500087738,
-0.17233321070671082,
0.03578842058777809,
-0.05129331722855568,
0.09767045825719833,
-0.06195820868015289,
0.20171283185482025,
0.0875607579946518,
-0.18375727534294128,
0.06577827036380768,
-0.04527178779244423,
-0.10189516097307205,
-0.08075357973575592,
-0.0383853018283844,
-0.07011755555868149,
-0.14938205480575562,
0.026120152324438095,
-0.10287898033857346,
0.04356784000992775,
0.15021295845508575,
0.010228297673165798,
-0.012655959464609623,
0.2140616625547409,
0.033855780959129333,
0.03627445548772812,
0.05768121778964996,
0.014813564717769623,
-0.028669752180576324,
-0.09245344251394272,
-0.06057759001851082,
0.017766166478395462,
-0.030833978205919266,
0.018237173557281494,
-0.06945143640041351,
-0.07720036059617996,
0.02684912085533142,
0.005890588741749525,
-0.09370170533657074,
0.024110222235322,
0.02115786261856556,
0.09061791002750397,
0.02681778185069561,
0.005865046754479408,
0.01642012968659401,
-0.028914494439959526,
0.2478397786617279,
-0.09309648722410202,
-0.08174651116132736,
-0.0815243124961853,
0.2189643383026123,
0.03225523233413696,
0.0009209680138155818,
0.008748773485422134,
-0.08171188831329346,
0.009967570193111897,
0.22951556742191315,
0.17165836691856384,
-0.1316705048084259,
-0.010039259679615498,
0.00017556168313603848,
0.0011848146095871925,
-0.030105551704764366,
0.11803902685642242,
0.1211584061384201,
0.05255655571818352,
-0.11442896723747253,
-0.052648089826107025,
-0.052683550864458084,
-0.019061902537941933,
-0.02703198418021202,
0.049321483820676804,
0.06710880994796753,
0.021521955728530884,
-0.06972035020589828,
0.07587417960166931,
-0.05898117646574974,
-0.1417282521724701,
0.10520590841770172,
-0.227431520819664,
-0.1562754362821579,
-0.0075513338670134544,
0.12149787694215775,
0.0036339014768600464,
0.05950748547911644,
-0.0416368767619133,
0.0016176262870430946,
0.049279309809207916,
-0.004903750494122505,
-0.07828832417726517,
-0.10440375655889511,
0.08430149406194687,
-0.11540862917900085,
0.2169649302959442,
-0.05928150936961174,
0.035032033920288086,
0.11344976723194122,
0.06377062946557999,
-0.04988444223999977,
0.056165602058172226,
0.04202229902148247,
-0.12509214878082275,
-0.004481189418584108,
0.12499622255563736,
-0.03820042684674263,
0.05529504641890526,
0.03262198343873024,
-0.13330787420272827,
0.03282380849123001,
-0.05601627379655838,
-0.04050467535853386,
-0.02716980315744877,
-0.050051067024469376,
-0.06377048790454865,
0.11575490981340408,
0.20849032700061798,
-0.008310714736580849,
0.02412879280745983,
-0.08693430572748184,
0.016328925266861916,
0.06596571952104568,
0.047013361006975174,
-0.07864978164434433,
-0.21639759838581085,
0.007063151802867651,
0.0700518786907196,
-0.0427006334066391,
-0.20340843498706818,
-0.11198334395885468,
0.03682626411318779,
-0.05475519597530365,
-0.07167350500822067,
0.09025808423757553,
0.090310238301754,
0.055267900228500366,
-0.054465629160404205,
-0.10541201382875443,
-0.05904494971036911,
0.1702825278043747,
-0.1475851684808731,
-0.07703559845685959
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGB-b0_7
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1457
- Mean Iou: 0.6795
- Mean Accuracy: 0.7207
- Overall Accuracy: 0.9691
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4481
- Accuracy Undropoff: 0.9932
- Iou Unlabeled: nan
- Iou Dropoff: 0.3907
- Iou Undropoff: 0.9684
## Model description
More information needed
## Intended uses & limitations
More information needed
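
No intended-use notes are provided, but as a hedged example, the checkpoint can presumably be loaded for dropoff segmentation like any other SegFormer model. The repository id below is taken from this card, the image path is a placeholder, and the image-processor config is assumed to be resolvable from the repo (fall back to `nvidia/mit-b0` preprocessing if it is not):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "sam1120/dropoff-utcustom-train-SF-RGB-b0_7"   # id taken from this card
processor = AutoImageProcessor.from_pretrained(repo_id)  # assumption: processor config lives in the repo
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example_frame.png").convert("RGB")   # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits                      # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]
```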
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a scheduler sketch follows the list):
- learning_rate: 9e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
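
To make the warmup ratio concrete: the results table below shows 360 optimizer steps over the 120 epochs, so a 0.05 warmup ratio corresponds to roughly 18 warmup steps. A minimal sketch of an equivalent optimizer/scheduler pair; the step count is inferred from the table, and a dummy parameter stands in for the model.

```python
import torch
from transformers import get_linear_schedule_with_warmup

total_steps = 360                        # inferred from the results table (epoch 120.0 -> step 360)
warmup_steps = int(0.05 * total_steps)   # warmup_ratio 0.05 -> 18 warmup steps

dummy_params = [torch.nn.Parameter(torch.zeros(1))]  # stand-in for model.parameters()
optimizer = torch.optim.Adam(dummy_params, lr=9e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)
```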
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1505 | 3.33 | 10 | 1.1103 | 0.1106 | 0.6036 | 0.2919 | nan | 0.9456 | 0.2616 | 0.0 | 0.0703 | 0.2616 |
| 0.9635 | 6.67 | 20 | 1.0114 | 0.3710 | 0.8470 | 0.8737 | nan | 0.8177 | 0.8763 | 0.0 | 0.2435 | 0.8694 |
| 0.9358 | 10.0 | 30 | 0.8242 | 0.4206 | 0.7727 | 0.9440 | nan | 0.5848 | 0.9606 | 0.0 | 0.3194 | 0.9425 |
| 0.579 | 13.33 | 40 | 0.5703 | 0.4525 | 0.7615 | 0.9633 | nan | 0.5402 | 0.9829 | 0.0 | 0.3951 | 0.9624 |
| 0.4411 | 16.67 | 50 | 0.4166 | 0.4529 | 0.7380 | 0.9667 | nan | 0.4872 | 0.9889 | 0.0 | 0.3928 | 0.9659 |
| 0.4311 | 20.0 | 60 | 0.3843 | 0.6678 | 0.7156 | 0.9667 | nan | 0.4400 | 0.9911 | nan | 0.3695 | 0.9661 |
| 0.3437 | 23.33 | 70 | 0.3590 | 0.4347 | 0.6956 | 0.9655 | nan | 0.3995 | 0.9918 | 0.0 | 0.3392 | 0.9649 |
| 0.3136 | 26.67 | 80 | 0.3198 | 0.6259 | 0.6622 | 0.9638 | nan | 0.3312 | 0.9931 | nan | 0.2885 | 0.9633 |
| 0.2682 | 30.0 | 90 | 0.2919 | 0.6187 | 0.6470 | 0.9648 | nan | 0.2984 | 0.9957 | nan | 0.2730 | 0.9643 |
| 0.2521 | 33.33 | 100 | 0.2957 | 0.6448 | 0.6845 | 0.9653 | nan | 0.3764 | 0.9926 | nan | 0.3248 | 0.9648 |
| 0.2287 | 36.67 | 110 | 0.2747 | 0.6800 | 0.7256 | 0.9685 | nan | 0.4591 | 0.9921 | nan | 0.3922 | 0.9678 |
| 0.2203 | 40.0 | 120 | 0.2537 | 0.7108 | 0.7687 | 0.9706 | nan | 0.5472 | 0.9902 | nan | 0.4517 | 0.9699 |
| 0.1964 | 43.33 | 130 | 0.2356 | 0.6689 | 0.7054 | 0.9686 | nan | 0.4167 | 0.9941 | nan | 0.3699 | 0.9680 |
| 0.1776 | 46.67 | 140 | 0.2205 | 0.6729 | 0.7137 | 0.9684 | nan | 0.4343 | 0.9931 | nan | 0.3780 | 0.9677 |
| 0.1675 | 50.0 | 150 | 0.2061 | 0.6809 | 0.7244 | 0.9689 | nan | 0.4562 | 0.9926 | nan | 0.3936 | 0.9682 |
| 0.148 | 53.33 | 160 | 0.1954 | 0.6924 | 0.7418 | 0.9694 | nan | 0.4920 | 0.9915 | nan | 0.4160 | 0.9687 |
| 0.1364 | 56.67 | 170 | 0.1915 | 0.6869 | 0.7415 | 0.9681 | nan | 0.4928 | 0.9902 | nan | 0.4064 | 0.9674 |
| 0.1171 | 60.0 | 180 | 0.1776 | 0.7206 | 0.7816 | 0.9714 | nan | 0.5734 | 0.9899 | nan | 0.4706 | 0.9707 |
| 0.1169 | 63.33 | 190 | 0.1754 | 0.6580 | 0.6853 | 0.9689 | nan | 0.3741 | 0.9965 | nan | 0.3476 | 0.9684 |
| 0.1178 | 66.67 | 200 | 0.1676 | 0.6783 | 0.7233 | 0.9684 | nan | 0.4545 | 0.9922 | nan | 0.3888 | 0.9677 |
| 0.1016 | 70.0 | 210 | 0.1670 | 0.6633 | 0.6985 | 0.9682 | nan | 0.4025 | 0.9944 | nan | 0.3590 | 0.9676 |
| 0.1025 | 73.33 | 220 | 0.1648 | 0.6789 | 0.7154 | 0.9696 | nan | 0.4366 | 0.9943 | nan | 0.3888 | 0.9690 |
| 0.0956 | 76.67 | 230 | 0.1607 | 0.6684 | 0.7103 | 0.9677 | nan | 0.4279 | 0.9927 | nan | 0.3697 | 0.9671 |
| 0.1443 | 80.0 | 240 | 0.1611 | 0.6747 | 0.7134 | 0.9688 | nan | 0.4332 | 0.9937 | nan | 0.3811 | 0.9682 |
| 0.0902 | 83.33 | 250 | 0.1600 | 0.6713 | 0.7060 | 0.9691 | nan | 0.4174 | 0.9946 | nan | 0.3740 | 0.9685 |
| 0.0846 | 86.67 | 260 | 0.1559 | 0.6772 | 0.7263 | 0.9677 | nan | 0.4613 | 0.9912 | nan | 0.3874 | 0.9670 |
| 0.1166 | 90.0 | 270 | 0.1587 | 0.6615 | 0.6984 | 0.9677 | nan | 0.4030 | 0.9939 | nan | 0.3559 | 0.9671 |
| 0.0825 | 93.33 | 280 | 0.1538 | 0.6684 | 0.7068 | 0.9682 | nan | 0.4199 | 0.9936 | nan | 0.3692 | 0.9676 |
| 0.0769 | 96.67 | 290 | 0.1527 | 0.6649 | 0.7033 | 0.9679 | nan | 0.4130 | 0.9936 | nan | 0.3626 | 0.9673 |
| 0.0722 | 100.0 | 300 | 0.1473 | 0.6832 | 0.7247 | 0.9694 | nan | 0.4563 | 0.9932 | nan | 0.3976 | 0.9688 |
| 0.0779 | 103.33 | 310 | 0.1465 | 0.6809 | 0.7200 | 0.9695 | nan | 0.4462 | 0.9937 | nan | 0.3930 | 0.9689 |
| 0.0771 | 106.67 | 320 | 0.1494 | 0.6673 | 0.7052 | 0.9682 | nan | 0.4167 | 0.9937 | nan | 0.3670 | 0.9676 |
| 0.1082 | 110.0 | 330 | 0.1479 | 0.6753 | 0.7182 | 0.9683 | nan | 0.4438 | 0.9926 | nan | 0.3830 | 0.9677 |
| 0.0726 | 113.33 | 340 | 0.1451 | 0.6765 | 0.7159 | 0.9689 | nan | 0.4384 | 0.9935 | nan | 0.3846 | 0.9683 |
| 0.0743 | 116.67 | 350 | 0.1469 | 0.6814 | 0.7249 | 0.9689 | nan | 0.4571 | 0.9927 | nan | 0.3946 | 0.9683 |
| 0.0703 | 120.0 | 360 | 0.1457 | 0.6795 | 0.7207 | 0.9691 | nan | 0.4481 | 0.9932 | nan | 0.3907 | 0.9684 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGB-b0_7", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGB-b0_7 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:10:30+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGB-b0\_7
===================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1457
* Mean Iou: 0.6795
* Mean Accuracy: 0.7207
* Overall Accuracy: 0.9691
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.4481
* Accuracy Undropoff: 0.9932
* Iou Unlabeled: nan
* Iou Dropoff: 0.3907
* Iou Undropoff: 0.9684
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 9e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10698888450860977,
0.036441706120967865,
-0.0019506275421008468,
0.11572462320327759,
0.17164912819862366,
0.027917172759771347,
0.11470325291156769,
0.11539258062839508,
-0.11006776988506317,
0.03151639550924301,
0.10494685918092728,
0.14315444231033325,
0.016575386747717857,
0.09572827816009521,
-0.01985798217356205,
-0.3064938485622406,
-0.02523018978536129,
0.03222807124257088,
-0.08448228985071182,
0.1251840442419052,
0.06494688242673874,
-0.16332796216011047,
0.0917927548289299,
-0.00341205857694149,
-0.22004684805870056,
0.015741029754281044,
-0.005488728638738394,
-0.031734712421894073,
0.1596153825521469,
0.02332714945077896,
0.11629735678434372,
0.009241495281457901,
0.11388376355171204,
-0.20122689008712769,
0.01775975152850151,
0.056348659098148346,
-0.004200944676995277,
0.06761504709720612,
0.0658675953745842,
0.0021639044862240553,
0.15146030485630035,
-0.10634762793779373,
0.06794709712266922,
0.0018994128331542015,
-0.144719660282135,
-0.2131481170654297,
-0.07463879138231277,
0.02310418151319027,
0.07857739925384521,
0.09572931379079819,
-0.005492504220455885,
0.11359455436468124,
-0.0917300209403038,
0.11322897672653198,
0.26852408051490784,
-0.24224360287189484,
-0.086004339158535,
0.03812379390001297,
0.002549873199313879,
0.06622302532196045,
-0.13225865364074707,
0.008114645257592201,
0.03274383023381233,
0.04607772454619408,
0.11748483031988144,
-0.03319721668958664,
-0.10077518224716187,
0.02642441913485527,
-0.13875606656074524,
-0.032808516174554825,
0.05579257011413574,
0.05341612547636032,
-0.020309101790189743,
-0.03234953433275223,
-0.06751273572444916,
-0.1815667450428009,
-0.06631273776292801,
0.013076373375952244,
0.06645462661981583,
-0.060221780091524124,
-0.11404638737440109,
-0.015684252604842186,
-0.11018043756484985,
-0.08551160991191864,
-0.04963507503271103,
0.1281404197216034,
0.03416020795702934,
0.01886441558599472,
-0.034426845610141754,
0.12663613259792328,
-0.027475666254758835,
-0.14027148485183716,
0.017749939113855362,
0.0298087690025568,
-0.04260074719786644,
-0.03184163197875023,
-0.049353525042533875,
-0.0654076412320137,
-0.013339268043637276,
0.10810881853103638,
-0.05842287465929985,
0.06796950101852417,
0.03592711687088013,
0.050935693085193634,
-0.11408856511116028,
0.19121836125850677,
-0.06768878549337387,
-0.008856520056724548,
-0.03634073585271835,
0.0590212382376194,
0.0043197814375162125,
-0.022377021610736847,
-0.10538116842508316,
0.004626469220966101,
0.06983321160078049,
-0.008485966362059116,
-0.08732792735099792,
0.07022283226251602,
-0.039138782769441605,
-0.011526009999215603,
0.0008069204050116241,
-0.0761040449142456,
0.04592046141624451,
-0.0004814189742319286,
-0.08394326269626617,
-0.028993353247642517,
0.05152413249015808,
0.014531095512211323,
0.013567515648901463,
0.1662231683731079,
-0.08668159693479538,
0.06310790777206421,
-0.11324604600667953,
-0.10056551545858383,
0.000822747009806335,
-0.0881146714091301,
0.036262188106775284,
-0.07815452665090561,
-0.15049895644187927,
-0.009466185234487057,
0.07138118892908096,
-0.039857588708400726,
0.003554475726559758,
-0.05298313871026039,
-0.09100713580846786,
0.0027641765773296356,
-0.008898764848709106,
0.16368739306926727,
-0.06541991978883743,
0.12287503480911255,
0.037284012883901596,
0.07241460680961609,
-0.06519117206335068,
0.03978648781776428,
-0.08533412218093872,
0.020032595843076706,
-0.22236330807209015,
0.04329092428088188,
-0.05069870501756668,
0.06835754960775375,
-0.06003706529736519,
-0.12234549224376678,
0.007273869123309851,
0.002296838676556945,
0.09198150783777237,
0.10613424330949783,
-0.2250698357820511,
-0.0757809653878212,
0.14862510561943054,
-0.07292257249355316,
-0.09920968860387802,
0.11283563822507858,
-0.06417480856180191,
0.01244199275970459,
0.06173001229763031,
0.1998792588710785,
0.05363358184695244,
-0.13725411891937256,
0.022026926279067993,
-0.015720393508672714,
0.0490301251411438,
-0.02824658900499344,
0.05086333304643631,
0.021759197115898132,
0.08817695081233978,
0.019061574712395668,
-0.06667432934045792,
0.06840512901544571,
-0.12369958311319351,
-0.09695179015398026,
-0.02545422874391079,
-0.08635058254003525,
0.042185764759778976,
0.0901612862944603,
0.061564717441797256,
-0.10510560870170593,
-0.0781584233045578,
0.09225747734308243,
0.07631921023130417,
-0.06905047595500946,
0.039653100073337555,
-0.0653006061911583,
0.04452607035636902,
-0.017119454219937325,
-0.03656432405114174,
-0.17498309910297394,
-0.024739904329180717,
-0.02208550088107586,
0.03294292837381363,
0.029752494767308235,
0.021195068955421448,
0.09135955572128296,
0.08895355463027954,
-0.07113959640264511,
-0.025295792147517204,
-0.06504576653242111,
0.002853347919881344,
-0.12191250920295715,
-0.22839201986789703,
-0.0436847098171711,
-0.00782459694892168,
0.08831162750720978,
-0.21212570369243622,
0.024374445900321007,
0.02305813878774643,
0.08880651742219925,
0.025462424382567406,
-0.0317736454308033,
-0.05223679915070534,
0.07698767632246017,
-0.010532958433032036,
-0.06572329998016357,
0.06974133849143982,
-0.005599552299827337,
-0.06890640407800674,
-0.05551714822649956,
-0.113435298204422,
0.16283336281776428,
0.13426204025745392,
-0.14808161556720734,
-0.0922188013792038,
-0.01170878391712904,
-0.0636860579252243,
-0.03333812206983566,
-0.042522139847278595,
0.03849917650222778,
0.17991451919078827,
-0.0001571235479786992,
0.13322539627552032,
-0.06151047721505165,
-0.035181816667318344,
0.028767555952072144,
-0.027116362005472183,
0.027623387053608894,
0.12915126979351044,
0.12556350231170654,
-0.06259054690599442,
0.12421479821205139,
0.12511518597602844,
-0.08064696937799454,
0.1492813676595688,
-0.03343291953206062,
-0.08083612471818924,
-0.01781415566802025,
-0.014845546334981918,
-0.008191768079996109,
0.1772596836090088,
-0.1512315720319748,
-0.01751023903489113,
-0.0043712519109249115,
0.0135234035551548,
0.01513171661645174,
-0.25078722834587097,
-0.055842284113168716,
0.03914861008524895,
-0.04416626691818237,
-0.00968187302350998,
-0.024392327293753624,
-0.004289896693080664,
0.10446952283382416,
-0.006976652424782515,
-0.07456402480602264,
0.000875753874424845,
-0.007641230244189501,
-0.04885067790746689,
0.2072855830192566,
-0.058623190969228745,
-0.12073752284049988,
-0.08990401029586792,
-0.07693970203399658,
-0.03617119416594505,
0.002907266141846776,
0.05847717821598053,
-0.10847745090723038,
-0.018738532438874245,
-0.0592842698097229,
0.018869055435061455,
0.007085567805916071,
0.03626856952905655,
-0.0011943228309974074,
-0.008214939385652542,
0.05619246885180473,
-0.09676984697580338,
-0.009580917656421661,
-0.0660177543759346,
-0.05168606713414192,
0.05342411622405052,
0.05996372550725937,
0.14880236983299255,
0.13485218584537506,
-0.025694115087389946,
0.0193569827824831,
-0.03275922313332558,
0.2582509517669678,
-0.09615176916122437,
-0.026911845430731773,
0.11865443736314774,
-0.013292305171489716,
0.05626576021313667,
0.10656722635030746,
0.0825405940413475,
-0.1091189831495285,
-0.0023946654982864857,
0.06353466957807541,
-0.051928386092185974,
-0.15508435666561127,
-0.015337692573666573,
-0.05829254165291786,
-0.030357353389263153,
0.07701724022626877,
0.027055509388446808,
-0.0028699629474431276,
0.05596603453159332,
0.04830314591526985,
0.042352162301540375,
-0.023839935660362244,
0.05068133771419525,
0.08856511861085892,
0.03192279487848282,
0.10945591330528259,
-0.04467906802892685,
-0.06641416251659393,
0.03166283294558525,
0.003316424321383238,
0.24425791203975677,
-0.01636841706931591,
0.09695488959550858,
0.07356058061122894,
0.16239604353904724,
-0.012307635508477688,
0.04825644567608833,
-0.016069311648607254,
-0.06790239363908768,
-0.019116470590233803,
-0.04434377700090408,
-0.017185142263770103,
0.01032891497015953,
-0.0520818717777729,
0.039383430033922195,
-0.12591715157032013,
0.008352558128535748,
0.06806930899620056,
0.24917757511138916,
0.028454294428229332,
-0.3183601200580597,
-0.06566125154495239,
-0.00574481999501586,
-0.011316265910863876,
-0.009446937590837479,
0.0064241583459079266,
0.15246117115020752,
-0.08145362883806229,
0.056983012706041336,
-0.08516839146614075,
0.08577614277601242,
-0.036592837423086166,
0.05070466548204422,
0.07712646573781967,
0.07374069094657898,
-0.00433371914550662,
0.05579186603426933,
-0.2846602499485016,
0.30174481868743896,
0.001998338382691145,
0.08480876684188843,
-0.06417321413755417,
-0.032005541026592255,
0.033527664840221405,
0.08169365674257278,
0.08693911135196686,
-0.015501614660024643,
-0.022096535190939903,
-0.2144138365983963,
-0.022179756313562393,
0.030647389590740204,
0.12894178926944733,
-0.01715848222374916,
0.10379287600517273,
-0.009384890086948872,
-0.00563735282048583,
0.0740499198436737,
-0.002242414280772209,
-0.03316393494606018,
-0.09016277641057968,
-0.026224007830023766,
-0.025045117363333702,
-0.04884226620197296,
-0.05888162553310394,
-0.10646912455558777,
-0.11533300578594208,
0.11155472695827484,
0.01783888414502144,
-0.014280887320637703,
-0.12032812088727951,
0.09823650121688843,
0.07956492900848389,
-0.07555615901947021,
0.041600994765758514,
0.031529396772384644,
0.05724094435572624,
0.03285609185695648,
-0.05764799565076828,
0.11806035786867142,
-0.060163844376802444,
-0.1597592532634735,
-0.05661061406135559,
0.09188058227300644,
0.05108117684721947,
0.05717123672366142,
-0.024467550218105316,
0.01599268428981304,
-0.017773564904928207,
-0.09199009090662003,
0.05417950451374054,
-0.043466195464134216,
0.0633617639541626,
0.010520675219595432,
-0.019872551783919334,
0.05010661110281944,
-0.056562747806310654,
-0.012174316681921482,
0.1471240520477295,
0.2851855754852295,
-0.08916649222373962,
0.013885416090488434,
0.01743941940367222,
-0.06596991419792175,
-0.1907978504896164,
0.08038318157196045,
0.05799282714724541,
0.00025811418890953064,
0.0860593393445015,
-0.16705526411533356,
0.09809858351945877,
0.1041085422039032,
0.00037729571340605617,
0.11559778451919556,
-0.3681841492652893,
-0.12862682342529297,
0.08100574463605881,
0.19125483930110931,
0.07578472793102264,
-0.15548807382583618,
0.0009023864404298365,
-0.0020743797067552805,
-0.1471850872039795,
0.09162960201501846,
-0.07676015794277191,
0.13519242405891418,
-0.019671108573675156,
0.08524524420499802,
0.016484010964632034,
-0.061802711337804794,
0.12263740599155426,
-0.004063030704855919,
0.14058692753314972,
-0.06928866356611252,
-0.039045244455337524,
0.054822683334350586,
-0.03756994754076004,
-0.012684161774814129,
-0.04664503037929535,
0.026881400495767593,
-0.061654843389987946,
-0.012196526862680912,
-0.10476307570934296,
0.012851151637732983,
-0.038527000695466995,
-0.06680891662836075,
-0.046027567237615585,
0.04359079897403717,
0.04535405710339546,
-0.0035038385540246964,
0.15192289650440216,
-0.010394894517958164,
0.11411907523870468,
0.0475783571600914,
0.059343598783016205,
-0.06250457465648651,
-0.10725396871566772,
-0.01751568540930748,
0.008826305158436298,
0.04820532351732254,
-0.13458441197872162,
0.014873156324028969,
0.15295100212097168,
0.050385601818561554,
0.1219601035118103,
0.08622747659683228,
-0.031963784247636795,
0.03228941559791565,
0.06924545764923096,
-0.15717479586601257,
-0.11271721124649048,
0.0020007179118692875,
-0.06729430705308914,
-0.07270099222660065,
0.052811939269304276,
0.07705207169055939,
-0.07545565068721771,
0.012211987748742104,
-0.00692431814968586,
0.006294754799455404,
-0.06750494986772537,
0.20529384911060333,
0.05613885074853897,
0.04132061451673508,
-0.10377327352762222,
0.07343608886003494,
0.01877872832119465,
-0.08791302889585495,
-0.0013444285141304135,
0.09182810038328171,
-0.0695977583527565,
-0.024941643700003624,
0.07999785989522934,
0.1917228251695633,
-0.07641182094812393,
-0.02346237748861313,
-0.14999887347221375,
-0.10627636313438416,
0.069828100502491,
0.1863066703081131,
0.10040981322526932,
-0.0070329331792891026,
-0.05238598957657814,
0.04732499271631241,
-0.11740197241306305,
0.07781419903039932,
0.024505890905857086,
0.08137445896863937,
-0.14945189654827118,
0.18323303759098053,
0.01213808823376894,
0.054703690111637115,
-0.02623630128800869,
0.03297341242432594,
-0.11911994963884354,
0.04033777117729187,
-0.11380002647638321,
-0.03652041405439377,
-0.016390196979045868,
0.005039701238274574,
-0.013495375402271748,
-0.06235634163022041,
-0.06290288269519806,
0.004962040577083826,
-0.12734273076057434,
-0.022335462272167206,
0.0457119382917881,
0.022709844633936882,
-0.12647494673728943,
-0.03931998834013939,
0.027785632759332657,
-0.06355828046798706,
0.05587029084563255,
0.03604619950056076,
0.014624064788222313,
0.065652035176754,
-0.17238566279411316,
-0.021873030811548233,
0.06996585428714752,
-0.006448161322623491,
0.06273359060287476,
-0.03562479466199875,
-0.026027249172329903,
-0.029558172449469566,
0.08710160851478577,
0.012362103909254074,
0.062322959303855896,
-0.13702401518821716,
0.00511175999417901,
-0.032495591789484024,
-0.09257379919290543,
-0.05878797546029091,
0.053459227085113525,
0.06229424849152565,
0.036778371781110764,
0.1623867005109787,
-0.0827142596244812,
0.04480697214603424,
-0.21858428418636322,
-0.01633734069764614,
0.0015065321931615472,
-0.10799645632505417,
-0.08222702145576477,
-0.07238233089447021,
0.08296071738004684,
-0.07516288012266159,
0.11126124858856201,
0.03665176406502724,
0.06524350494146347,
0.03163639456033707,
-0.03242642804980278,
-0.0034766565077006817,
0.03456224501132965,
0.21165549755096436,
0.011159288696944714,
-0.03308090940117836,
0.08908817917108536,
0.07952970266342163,
0.09996318072080612,
0.13528099656105042,
0.2267378866672516,
0.15591958165168762,
-0.025322124361991882,
0.08933405578136444,
0.05189678817987442,
-0.06466560065746307,
-0.17333287000656128,
0.036837704479694366,
-0.05204493924975395,
0.09864352643489838,
-0.06112434342503548,
0.2038503736257553,
0.08571939170360565,
-0.18332429230213165,
0.06640961766242981,
-0.04506092891097069,
-0.10130048543214798,
-0.07949268817901611,
-0.03734021633863449,
-0.06996020674705505,
-0.14770901203155518,
0.025733644142746925,
-0.10257547348737717,
0.04311133548617363,
0.149701327085495,
0.00991432461887598,
-0.013295114040374756,
0.2150111049413681,
0.0331418402493,
0.03557443991303444,
0.05836351588368416,
0.014840426854789257,
-0.02956581674516201,
-0.09253069758415222,
-0.06062045320868492,
0.018142715096473694,
-0.029495975002646446,
0.018125679343938828,
-0.06945392489433289,
-0.07720745354890823,
0.02666562609374523,
0.005343671888113022,
-0.09364587068557739,
0.023994632065296173,
0.02048140950500965,
0.09145079553127289,
0.026609839871525764,
0.0065123410895466805,
0.016702575609087944,
-0.028778837993741035,
0.24603474140167236,
-0.0928659662604332,
-0.08101007342338562,
-0.08173099905252457,
0.215449720621109,
0.03150666877627373,
0.0001713715319056064,
0.008758875541388988,
-0.08177723735570908,
0.009439172223210335,
0.22995677590370178,
0.17199040949344635,
-0.13198934495449066,
-0.010740760713815689,
0.00003829423076240346,
0.001458232756704092,
-0.030069036409258842,
0.11830835789442062,
0.1217745766043663,
0.05242851749062538,
-0.11425435543060303,
-0.05286705121397972,
-0.05305914208292961,
-0.018814079463481903,
-0.02694215252995491,
0.049020253121852875,
0.066792793571949,
0.02151275798678398,
-0.06964023411273956,
0.07614139467477798,
-0.05826883763074875,
-0.14441439509391785,
0.10612090677022934,
-0.22847653925418854,
-0.15661491453647614,
-0.007453833241015673,
0.12206707894802094,
0.004215598572045565,
0.05957123637199402,
-0.041368961334228516,
0.0014594764215871692,
0.049943048506975174,
-0.004811708349734545,
-0.07890451699495316,
-0.1045445054769516,
0.08426984399557114,
-0.11530204862356186,
0.2177894413471222,
-0.05934489518404007,
0.03386760130524635,
0.11317384988069534,
0.06365274637937546,
-0.05003955960273743,
0.05624428018927574,
0.042064208537340164,
-0.12377183884382248,
-0.004150159657001495,
0.12414009124040604,
-0.038546252995729446,
0.05464359000325203,
0.03285973146557808,
-0.13329584896564484,
0.03232704848051071,
-0.05659763514995575,
-0.04150042682886124,
-0.027407841756939888,
-0.05067397654056549,
-0.06398855149745941,
0.1155647560954094,
0.20873890817165375,
-0.008400616236031055,
0.02350081130862236,
-0.08703453093767166,
0.016182253137230873,
0.06582740694284439,
0.04808720946311951,
-0.07828748226165771,
-0.21570970118045807,
0.00692250719293952,
0.06968362629413605,
-0.04146188497543335,
-0.20452812314033508,
-0.11154643446207047,
0.03644070401787758,
-0.054411038756370544,
-0.07180064171552658,
0.09028228372335434,
0.08942431211471558,
0.05595926567912102,
-0.0551147423684597,
-0.10435299575328827,
-0.058357372879981995,
0.1700337678194046,
-0.14724817872047424,
-0.07737929373979568
] |
null | null | transformers | [AQLM](https://arxiv.org/abs/2401.06118) quantization of `152334H/miqu-1-70b-sf`.
For this quantization, we used 1 codebook of 16 bits.
Selected evaluation results for this and other models:
| Model | AQLM scheme | WikiText 2 PPL | Model size, Gb | Hub link |
|------------|-------------|----------------|----------------|--------------------------------------------------------------------------|
| Llama-2-7b | 1x16 | 6.31 | 2.4 | [Link](https://huggingface.co/BlackSamorez/Llama-2-7b-AQLM-2Bit-1x16-hf) |
| Llama-2-7b | 2x8 | 7.98 | 2.2 | [Link](https://huggingface.co/BlackSamorez/Llama-2-7b-AQLM-2Bit-2x8-hf) |
| Llama-2-7b | 8x8 | 7.83 | 2.2 | [Link](https://huggingface.co/BlackSamorez/Llama-2-7b-AQLM-2Bit-8x8-hf) |
| Llama-2-13b | 1x16 | 5.41 | 4.1 | [Link](https://huggingface.co/BlackSamorez/Llama-2-13b-AQLM-2Bit-1x16-hf)|
| Llama-2-70b| 1x16 | 3.96 | 18.8 | [Link](https://huggingface.co/BlackSamorez/Llama-2-70b-AQLM-2Bit-1x16-hf)|
| Llama-2-70b| 2x8 | 4.83 | 18.2 | [Link](https://huggingface.co/BlackSamorez/Llama-2-70b-AQLM-2Bit-2x8-hf) |
| Mixtral-8x7b| 1x16 | 4.37 | 12.6 | [Link](https://huggingface.co/BlackSamorez/Mixtral-8x7b-AQLM-2Bit-1x16-hf)|
| miqu-1-70b (THIS) | 1x16 | 4.01 | 18.8 | [Link](https://huggingface.co/AlexWortega/miqu-1-70b-AQLM-2Bit-1x16-hf)|
To learn more about inference with this model, as well as how to quantize models yourself, please refer to the [official GitHub repo](https://github.com/Vahe1994/AQLM). | {} | text-generation | AlexWortega/miqu-1-70b-AQLM-2Bit-1x16-hf | [
"transformers",
"safetensors",
"llama_aqlm",
"text-generation",
"conversational",
"custom_code",
"arxiv:2401.06118",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:12:30+00:00 | [
"2401.06118"
] | [] | TAGS
#transformers #safetensors #llama_aqlm #text-generation #conversational #custom_code #arxiv-2401.06118 #autotrain_compatible #endpoints_compatible #region-us
| AQLM quantization of '152334H/miqu-1-70b-sf'.
For this quantization, we used 1 codebook of 16 bits.
Selected evaluation results for this and other models:
To learn more about inference with this model, as well as how to quantize models yourself, please refer to the official GitHub repo.
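Before turning to that repo, here is a minimal inference sketch using the Transformers Auto classes. The repo id comes from this card; the `trust_remote_code=True` flag is inferred from the `custom_code` tag, and the extra `aqlm` dependency is an assumption — the official repository remains the authoritative reference.

```python
# Minimal sketch — assumes the AQLM inference kernels are installed
# (e.g. `pip install aqlm[gpu]`, package name assumed) alongside transformers and accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "AlexWortega/miqu-1-70b-AQLM-2Bit-1x16-hf"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    trust_remote_code=True,   # the repo ships custom modeling code (see the custom_code tag)
    torch_dtype="auto",
    device_map="auto",        # place the ~18.8 GB of weights on the available GPU(s)
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```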
| [] | [
"TAGS\n#transformers #safetensors #llama_aqlm #text-generation #conversational #custom_code #arxiv-2401.06118 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
59
] | [
"passage: TAGS\n#transformers #safetensors #llama_aqlm #text-generation #conversational #custom_code #arxiv-2401.06118 #autotrain_compatible #endpoints_compatible #region-us \n"
] | [
-0.05906282365322113,
0.04273756965994835,
-0.00525812990963459,
0.011420043185353279,
0.15116684138774872,
-0.036437056958675385,
0.11461302638053894,
0.09652071446180344,
-0.023082271218299866,
0.019977150484919548,
0.14484570920467377,
0.2079700231552124,
-0.009465474635362625,
0.1402147114276886,
-0.1320418119430542,
-0.10343624651432037,
0.06483081728219986,
-0.0048357401974499226,
0.020229339599609375,
0.07512501627206802,
0.10846107453107834,
-0.05378638580441475,
0.10744588077068329,
-0.05716225877404213,
-0.10899057984352112,
0.06428667902946472,
0.04760832339525223,
-0.12321723252534866,
0.11159472167491913,
0.08612744510173798,
0.12124495953321457,
0.0377340093255043,
-0.05196862295269966,
-0.21081964671611786,
0.031615082174539566,
0.003829390276223421,
-0.066251240670681,
0.005000561010092497,
0.03511067107319832,
-0.054833222180604935,
-0.01519916020333767,
0.033909693360328674,
-0.017949657514691353,
0.09109774976968765,
-0.15077176690101624,
0.020027974620461464,
-0.02045854739844799,
-0.03747427836060524,
0.13366904854774475,
0.11087748408317566,
-0.0016356837004423141,
0.13843128085136414,
-0.03468950092792511,
0.1456930935382843,
0.08416321128606796,
-0.34776195883750916,
0.00021820077381562442,
0.09319466352462769,
0.05484394356608391,
0.06754772365093231,
-0.03620319440960884,
0.05493868514895439,
0.081540547311306,
-0.025035511702299118,
-0.03571804612874985,
-0.0877171978354454,
-0.15380103886127472,
0.031885821372270584,
-0.06018602102994919,
-0.028880614787340164,
0.20488445460796356,
-0.023814136162400246,
0.049820899963378906,
-0.012739406898617744,
-0.1116562932729721,
-0.0765625610947609,
-0.035765908658504486,
0.0032354460563510656,
-0.037273772060871124,
0.07369885593652725,
0.010450874455273151,
0.010334983468055725,
-0.13572758436203003,
0.004812936764210463,
-0.18554019927978516,
0.14893557131290436,
-0.01612224616110325,
0.03470964357256889,
-0.19893833994865417,
0.010928516276180744,
-0.0007894673035480082,
-0.12521244585514069,
0.00015386607265099883,
-0.055127013474702835,
0.04048972576856613,
0.00825702864676714,
-0.034877583384513855,
-0.06323707103729248,
0.16197019815444946,
0.09494268149137497,
0.027341075241565704,
0.09254835546016693,
-0.068546362221241,
0.038522493094205856,
-0.0301238764077425,
0.07871474325656891,
0.04716368392109871,
-0.07921212911605835,
0.08701111376285553,
-0.04972977191209793,
0.055269114673137665,
-0.05159750580787659,
-0.156097874045372,
-0.006376094184815884,
0.047777608036994934,
0.14182741940021515,
0.002761627547442913,
0.0925854742527008,
-0.04174830764532089,
0.06332217156887054,
0.04897509142756462,
-0.11707847565412521,
-0.01113758236169815,
0.00915600173175335,
0.053838569670915604,
-0.010654906742274761,
0.016790026798844337,
0.03959008678793907,
-0.06793384253978729,
-0.013562981970608234,
-0.04910937696695328,
-0.03307507932186127,
-0.035488031804561615,
-0.05694248527288437,
0.030611254274845123,
-0.028486672788858414,
0.03208731487393379,
-0.2193002998828888,
-0.09712643176317215,
-0.01420616265386343,
-0.004749870393425226,
-0.001783448038622737,
0.010459106415510178,
-0.08541391789913177,
-0.049065206199884415,
0.05055404081940651,
-0.05830471217632294,
-0.09157784283161163,
-0.07447843998670578,
0.06196889653801918,
0.07658850401639938,
0.09205316752195358,
-0.11122143268585205,
0.03856567665934563,
-0.08341629803180695,
0.031380925327539444,
-0.0302981436252594,
0.037896301597356796,
-0.03731240704655647,
0.16485533118247986,
-0.015779074281454086,
0.04095866531133652,
-0.05763336271047592,
0.06061479449272156,
0.013482112437486649,
0.20071883499622345,
-0.12539567053318024,
-0.028929036110639572,
0.23603646457195282,
-0.09985316544771194,
-0.22977694869041443,
0.0837772861123085,
0.012563587166368961,
0.021105151623487473,
0.08099431544542313,
0.16309984028339386,
0.008999061770737171,
-0.06879623979330063,
0.03337609022855759,
0.11182138323783875,
-0.07631312310695648,
-0.12522153556346893,
-0.008973656222224236,
0.0020040501840412617,
-0.10962527245283127,
0.04925895854830742,
0.1257387101650238,
0.08054789155721664,
-0.0196223184466362,
-0.06481650471687317,
-0.05674877017736435,
-0.061208270490169525,
-0.006667611189186573,
-0.005071215331554413,
0.028303800150752068,
-0.10476147383451462,
0.005368765443563461,
-0.09229885041713715,
0.02177598513662815,
0.010117119178175926,
-0.001629395061172545,
-0.12615616619586945,
0.06273636221885681,
-0.06298714131116867,
0.044511087238788605,
-0.11544790863990784,
-0.1054149642586708,
-0.05410396680235863,
0.09503773599863052,
-0.018335284665226936,
0.05799293518066406,
0.08128198236227036,
-0.017344865947961807,
0.02077397145330906,
-0.03223429247736931,
0.20460398495197296,
0.053751103579998016,
-0.07231920212507248,
-0.053717371076345444,
0.10990317165851593,
-0.07354796677827835,
0.07583504915237427,
-0.09461726248264313,
0.007332189008593559,
0.042971670627593994,
0.109565369784832,
0.027634086087346077,
0.059691496193408966,
0.004022859502583742,
0.039123278111219406,
-0.08154965937137604,
0.0014336840249598026,
0.07006655633449554,
0.01835324801504612,
-0.07093144208192825,
0.17103268206119537,
-0.2528930604457855,
0.31325677037239075,
0.1716822236776352,
-0.1953577846288681,
0.0014383277157321572,
-0.034447383135557175,
0.01859235018491745,
0.013778177089989185,
0.07115259766578674,
-0.028795626014471054,
0.04000532254576683,
-0.005499076098203659,
0.2040863037109375,
-0.03863949328660965,
0.007330627180635929,
0.009607297368347645,
-0.11161871999502182,
-0.05215829610824585,
0.07925237715244293,
0.02618919499218464,
-0.2063535898923874,
0.17127661406993866,
0.21332389116287231,
0.02226816676557064,
0.14399421215057373,
-0.005554739385843277,
0.024133754894137383,
0.03397239372134209,
0.07888643443584442,
0.0007673444924876094,
-0.03801184520125389,
-0.15293601155281067,
-0.03950817137956619,
0.03563731163740158,
0.01093324925750494,
0.08184868842363358,
-0.11019871383905411,
-0.047768328338861465,
0.027641721069812775,
-0.007850011810660362,
0.0003344123251736164,
0.061323363333940506,
0.024994026869535446,
0.1313716322183609,
0.0017398245399817824,
-0.08336560428142548,
0.0853290855884552,
-0.021826960146427155,
-0.10202062875032425,
0.1900499165058136,
-0.15096724033355713,
-0.3137649893760681,
-0.16636979579925537,
-0.16850556433200836,
-0.05831856280565262,
0.05119295045733452,
0.1124192401766777,
-0.15664291381835938,
-0.06233055144548416,
-0.03750355914235115,
-0.03479362279176712,
-0.014745167456567287,
0.002968593267723918,
0.012837283313274384,
0.056137632578611374,
-0.011881391517817974,
-0.1079925000667572,
-0.04555946961045265,
0.015899905934929848,
-0.037397351115942,
0.10331989824771881,
-0.10337276011705399,
0.13987615704536438,
0.15543201565742493,
-0.003549169050529599,
0.03904172033071518,
-0.008669898845255375,
0.18858128786087036,
-0.052543140947818756,
-0.017505483701825142,
0.18692083656787872,
-0.05678710341453552,
0.06777415424585342,
0.2046440690755844,
-0.020803336054086685,
-0.12254216521978378,
0.048800621181726456,
-0.03368324413895607,
-0.0673016756772995,
-0.20164622366428375,
-0.11963974684476852,
-0.0944453775882721,
0.05889172479510307,
-0.01918864995241165,
0.0662062019109726,
0.08785303682088852,
0.06587254256010056,
0.015514591708779335,
-0.038579054176807404,
-0.00704103522002697,
0.08431684225797653,
0.2409450262784958,
-0.022653954103589058,
0.11798687279224396,
-0.09899255633354187,
-0.12161705642938614,
0.06369104981422424,
0.10707050561904907,
0.020324908196926117,
0.09141292423009872,
0.04550640657544136,
-0.0002151287771994248,
0.10635081678628922,
0.12390346825122833,
0.09470857679843903,
0.031720343977212906,
-0.07973907142877579,
0.007544690743088722,
-0.010305996984243393,
-0.10391717404127121,
0.0500989630818367,
-0.05479245260357857,
-0.11542896926403046,
-0.03241242840886116,
0.018076812848448753,
0.07558584958314896,
0.07183090597391129,
0.07049418985843658,
-0.24412967264652252,
-0.010770680382847786,
0.10538733750581741,
-0.01139821857213974,
-0.11800915747880936,
0.10945013910531998,
0.03812447935342789,
-0.04187505692243576,
0.07577063888311386,
-0.014681108295917511,
0.09852802753448486,
-0.023766525089740753,
0.08129242807626724,
-0.11823117733001709,
-0.06534233689308167,
0.0018562513869255781,
0.10595224797725677,
-0.33573296666145325,
0.21602270007133484,
0.03252657130360603,
0.007943883538246155,
-0.053524699062108994,
0.0016684032743796706,
0.0016248131869360805,
0.1349852979183197,
0.11008770763874054,
-0.017152981832623482,
-0.11646626889705658,
-0.17014652490615845,
-0.0029127479065209627,
0.046714894473552704,
0.13428084552288055,
0.08079992979764938,
0.018040964379906654,
-0.057178229093551636,
-0.029595430940389633,
-0.0009415230015292764,
-0.06024887412786484,
-0.023334559053182602,
-0.1794874370098114,
0.05193590745329857,
0.14878667891025543,
0.103556789457798,
-0.0024273220915347338,
0.04756992682814598,
-0.11165659874677658,
0.18278768658638,
-0.09086371213197708,
-0.04652469605207443,
-0.09790650010108948,
-0.14530619978904724,
0.014295185916125774,
-0.07134456187486649,
0.08457008004188538,
-0.08763685077428818,
0.029295891523361206,
-0.05712804198265076,
-0.1517844796180725,
0.1205335482954979,
-0.1373981088399887,
-0.0020255721174180508,
-0.04132111370563507,
0.12788549065589905,
-0.10824751108884811,
-0.04978431016206741,
0.0501752570271492,
0.02944687195122242,
-0.09568843990564346,
-0.08191202580928802,
0.0026008770801126957,
-0.00403242651373148,
0.017588941380381584,
0.0014048625016584992,
-0.07485397905111313,
-0.1157260611653328,
-0.023586511611938477,
-0.05452769994735718,
0.2684219479560852,
0.2513095736503601,
-0.056974224746227264,
0.07505875825881958,
0.16417217254638672,
-0.04623240604996681,
-0.33184728026390076,
-0.12943942844867706,
-0.14812049269676208,
-0.023469092324376106,
-0.033437736332416534,
-0.043956633657217026,
0.06797625869512558,
-0.023616861552000046,
-0.036931704729795456,
0.08450715243816376,
-0.18237225711345673,
-0.09420786798000336,
0.20048604905605316,
0.018632544204592705,
0.38567277789115906,
-0.18155160546302795,
-0.08749917894601822,
-0.11535006016492844,
-0.17473267018795013,
0.04858798906207085,
-0.060669396072626114,
0.07902075350284576,
0.016184760257601738,
0.08167549222707748,
0.047258611768484116,
-0.05602945014834404,
0.10133154690265656,
-0.04963092878460884,
0.03896282613277435,
-0.1455623060464859,
-0.07792162150144577,
-0.012498791329562664,
-0.0328717902302742,
0.046416714787483215,
-0.08767394721508026,
0.06056195870041847,
-0.10432467609643936,
-0.010476659052073956,
-0.029792513698339462,
0.037112586200237274,
0.031187348067760468,
-0.031124267727136612,
-0.01577853597700596,
-0.08725771307945251,
0.03902406245470047,
0.005964114796370268,
0.2246263176202774,
-0.09492563456296921,
0.19362716376781464,
0.24335625767707825,
0.15816174447536469,
-0.16583524644374847,
0.11143267899751663,
0.007725683506578207,
-0.07556909322738647,
0.05979378893971443,
-0.10558880865573883,
0.10081803053617477,
0.0828477293252945,
-0.05197663605213165,
0.06844048947095871,
0.06844554096460342,
0.039868611842393875,
0.012936766259372234,
0.15754541754722595,
-0.18897350132465363,
-0.052180372178554535,
-0.03624724596738815,
0.116967111825943,
0.05804084986448288,
0.11246689409017563,
0.1689101606607437,
0.024903880432248116,
0.010672728531062603,
-0.03186653181910515,
0.03792344406247139,
-0.032673221081495285,
0.06774122267961502,
0.07073532789945602,
0.03994001820683479,
-0.11450914293527603,
0.07250536233186722,
-0.019940542057156563,
-0.16910181939601898,
0.05310245603322983,
0.11319117993116379,
-0.13768118619918823,
-0.13003863394260406,
-0.00291051366366446,
0.18805889785289764,
-0.15978915989398956,
-0.09739230573177338,
-0.07442222535610199,
-0.10915293544530869,
0.022814569994807243,
0.25223642587661743,
0.03153800219297409,
0.07136696577072144,
0.02200785093009472,
-0.04571060836315155,
-0.032236892729997635,
0.02301628887653351,
-0.011838835664093494,
0.034006278961896896,
-0.13355697691440582,
0.0038526933640241623,
-0.056881390511989594,
0.11094710975885391,
-0.08187943696975708,
-0.03429374843835831,
-0.15674561262130737,
0.026242688298225403,
-0.1672482192516327,
-0.008164112456142902,
-0.08680502325296402,
-0.011989888735115528,
0.011642840690910816,
-0.04214783012866974,
-0.05918361246585846,
-0.0320088192820549,
-0.08286567032337189,
0.025364438071846962,
-0.03255832940340042,
0.005795052275061607,
-0.09059731662273407,
-0.02417842485010624,
0.05408785864710808,
-0.017380185425281525,
0.07519025355577469,
0.07479994744062424,
-0.09417926520109177,
0.06490849703550339,
-0.18917162716388702,
-0.08152163028717041,
0.12121769785881042,
0.006366397254168987,
0.04422569274902344,
0.071881964802742,
0.00435607647523284,
0.11429058760404587,
0.026277724653482437,
0.029667066410183907,
0.13769593834877014,
-0.08821991831064224,
0.039558686316013336,
-0.03808457776904106,
-0.12603451311588287,
-0.039988502860069275,
-0.057089224457740784,
0.11222441494464874,
-0.044515080749988556,
0.14308424293994904,
-0.10893595963716507,
0.051759783178567886,
-0.02994980290532112,
0.0065748123452067375,
0.010419792495667934,
-0.15189164876937866,
-0.08945341408252716,
-0.058782801032066345,
0.024054905399680138,
-0.0279177725315094,
0.19619742035865784,
-0.018575334921479225,
0.021978789940476418,
0.06394734233617783,
-0.010206203907728195,
0.0035464130342006683,
0.017293768003582954,
0.19673293828964233,
0.09075343608856201,
-0.03842473030090332,
-0.11510607600212097,
0.04559557884931564,
0.01595418155193329,
-0.03826826810836792,
0.05410047248005867,
0.07895082980394363,
-0.06383505463600159,
0.12934522330760956,
0.02173447236418724,
-0.02277875319123268,
-0.04992173984646797,
-0.14436791837215424,
-0.1409311145544052,
0.03574811667203903,
-0.032319575548172,
0.05273311585187912,
0.23006518185138702,
0.012829311192035675,
-0.013616317883133888,
-0.10512135922908783,
-0.03728990629315376,
-0.17243637144565582,
-0.04623197764158249,
-0.1313922107219696,
-0.13294431567192078,
0.011697816662490368,
-0.08151354640722275,
0.05089682713150978,
0.07711563259363174,
0.04214072600007057,
-0.0251234769821167,
0.17405615746974945,
0.022168243303894997,
-0.05802713334560394,
0.005713175982236862,
-0.024956438690423965,
0.06941377371549606,
0.02400238625705242,
-0.01768897846341133,
-0.06890757381916046,
-0.05711258575320244,
-0.02840578742325306,
0.07455312460660934,
0.001165107125416398,
0.06451069563627243,
-0.13481208682060242,
-0.08299284428358078,
-0.03790190815925598,
0.11136846989393234,
-0.03110347129404545,
0.11782195419073105,
0.019611017778515816,
0.0005228585796430707,
0.05763823166489601,
0.2005026936531067,
-0.07750990986824036,
-0.1074322760105133,
-0.06218305602669716,
0.18676669895648956,
0.022104719653725624,
0.16473761200904846,
-0.09005679935216904,
-0.027687592431902885,
-0.038515523076057434,
0.3457610607147217,
0.20253363251686096,
-0.0843525379896164,
0.03939531743526459,
-0.0622311569750309,
0.057894472032785416,
0.036838386207818985,
0.12657001614570618,
0.11620964109897614,
0.27972498536109924,
-0.055726464837789536,
-0.05778056010603905,
-0.012681953608989716,
-0.013822140172123909,
-0.12028056383132935,
0.10158060491085052,
-0.03507010638713837,
-0.02381291799247265,
-0.02600126340985298,
0.07993008196353912,
-0.1367957442998886,
0.11898022890090942,
-0.07818494737148285,
-0.1879565268754959,
-0.04054667428135872,
-0.02029983513057232,
0.1759994477033615,
-0.0012420964194461703,
0.06537927687168121,
-0.013890854083001614,
-0.07825098186731339,
0.04357919842004776,
-0.000345221342286095,
-0.16083763539791107,
-0.018957650288939476,
-0.003492837306112051,
-0.06755795329809189,
0.0051628598012030125,
0.00041833851719275117,
0.0234233271330595,
0.11580299586057663,
0.03638853505253792,
-0.04035482183098793,
0.1773960441350937,
0.0383695587515831,
-0.05081145092844963,
0.04988080635666847,
-0.012142210267484188,
-0.005528147332370281,
0.010596493259072304,
0.05216440558433533,
-0.15402629971504211,
0.06763948500156403,
0.018808912485837936,
-0.044817402958869934,
-0.028516916558146477,
0.03937491029500961,
-0.05450328812003136,
0.08531179279088974,
0.030165821313858032,
-0.02737126313149929,
0.025585204362869263,
-0.0013271815842017531,
0.020818611606955528,
-0.02866651676595211,
-0.11358828842639923,
-0.07140110433101654,
-0.15314289927482605,
-0.05130668729543686,
0.09074968099594116,
0.017066573724150658,
-0.1942884922027588,
0.005818162579089403,
-0.0976068377494812,
0.07514561712741852,
-0.13499271869659424,
0.08198357373476028,
0.17186503112316132,
0.022629128769040108,
-0.03618701919913292,
-0.11654914915561676,
0.07051108777523041,
0.12150949239730835,
-0.05275971442461014,
-0.13649293780326843
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ./XLM-Roberta-Enc-dec-GEC2
This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 5.0328
- Wer: 100.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 500
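These values map directly onto the Hugging Face `Seq2SeqTrainingArguments` API; a sketch is shown below. The numeric values are copied from the list above, while `output_dir`, the evaluation cadence, and `predict_with_generate` are illustrative assumptions not stated in this card.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: reproduces the hyperparameters listed above for Transformers 4.37.
training_args = Seq2SeqTrainingArguments(
    output_dir="./XLM-Roberta-Enc-dec-GEC2",  # assumed from the model name
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=8,            # 1 x 8 = total train batch size of 8
    seed=42,
    max_steps=500,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",              # assumed; the card reports one eval at step 500
    eval_steps=500,
    predict_with_generate=True,               # assumed; needed to compute WER on generated text
)
```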
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-----:|
| 2.1037 | 5.17 | 500 | 5.0328 | 100.0 |
### Framework versions
- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"language": ["ta"], "license": "mit", "tags": ["Tamil-ASR", "generated_from_trainer"], "metrics": ["wer"], "base_model": "FacebookAI/xlm-roberta-large", "model-index": [{"name": "./XLM-Roberta-Enc-dec-GEC2", "results": []}]} | text2text-generation | sujith013/XLM-Roberta-Enc-dec-GEC2 | [
"transformers",
"tensorboard",
"safetensors",
"encoder-decoder",
"text2text-generation",
"Tamil-ASR",
"generated_from_trainer",
"ta",
"base_model:FacebookAI/xlm-roberta-large",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:12:43+00:00 | [] | [
"ta"
] | TAGS
#transformers #tensorboard #safetensors #encoder-decoder #text2text-generation #Tamil-ASR #generated_from_trainer #ta #base_model-FacebookAI/xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us
| ./XLM-Roberta-Enc-dec-GEC2
==========================
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 5.0328
* Wer: 100.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 8
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* training\_steps: 500
### Training results
### Framework versions
* Transformers 4.37.0
* Pytorch 2.1.2
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 500",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #encoder-decoder #text2text-generation #Tamil-ASR #generated_from_trainer #ta #base_model-FacebookAI/xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 500",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
82,
125,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #encoder-decoder #text2text-generation #Tamil-ASR #generated_from_trainer #ta #base_model-FacebookAI/xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 500### Training results### Framework versions\n\n\n* Transformers 4.37.0\n* Pytorch 2.1.2\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.13147702813148499,
0.08663947135210037,
-0.003434482030570507,
0.08234082907438278,
0.14944057166576385,
-0.008585149422287941,
0.12750650942325592,
0.138187974691391,
-0.07543273270130157,
0.06487878412008286,
0.10824146866798401,
0.11734277009963989,
0.06533756107091904,
0.17143310606479645,
-0.032455235719680786,
-0.2944903075695038,
-0.0056865019723773,
0.041659221053123474,
-0.1335030049085617,
0.13329219818115234,
0.10669562965631485,
-0.1277318298816681,
0.07326584309339523,
0.012087054550647736,
-0.12097305059432983,
-0.008130885660648346,
0.012470457702875137,
-0.09207955747842789,
0.10242301225662231,
-0.010698102414608002,
0.11286197602748871,
0.05568176880478859,
0.07363298535346985,
-0.14878550171852112,
0.014659587293863297,
0.04686843976378441,
0.022849055007100105,
0.0902630090713501,
0.0911988615989685,
-0.01117783598601818,
0.10641416162252426,
-0.07978086918592453,
0.07391969859600067,
0.03251703456044197,
-0.11249146610498428,
-0.30863937735557556,
-0.09031512588262558,
0.11001672595739365,
0.1042388305068016,
0.07828188687562943,
-0.015172167681157589,
0.0870053842663765,
-0.058997467160224915,
0.09749823808670044,
0.23655477166175842,
-0.2918269634246826,
-0.05308595299720764,
0.004298657178878784,
0.058462414890527725,
0.08241738379001617,
-0.10586120933294296,
-0.04261436685919762,
0.03967595472931862,
0.02803822234272957,
0.16079314053058624,
-0.01357022300362587,
0.06600276380777359,
-0.00863022543489933,
-0.14191260933876038,
-0.047813404351472855,
0.12402553856372833,
0.01598350703716278,
-0.03544306755065918,
-0.08950118720531464,
-0.07269137352705002,
-0.19242185354232788,
-0.0316033810377121,
0.009183747693896294,
0.0013202652335166931,
-0.036729078739881516,
-0.058227553963661194,
-0.005481059197336435,
-0.10138098150491714,
-0.05512161925435066,
0.044566914439201355,
0.128187358379364,
0.059699006378650665,
0.005359599832445383,
-0.04004219174385071,
0.12493950128555298,
0.0813683569431305,
-0.17991112172603607,
0.030083632096648216,
0.02321586012840271,
-0.023390689864754677,
0.007488594856113195,
0.0014315404696390033,
-0.06792493909597397,
-0.009756379760801792,
0.10441286861896515,
-0.09932445734739304,
0.02410460263490677,
0.021581236273050308,
0.049018364399671555,
-0.10269872844219208,
0.20069539546966553,
-0.09161138534545898,
-0.008393790572881699,
0.004404515493661165,
0.15042631328105927,
0.07164430618286133,
-0.020592980086803436,
-0.08649110049009323,
0.024653121829032898,
0.12380426377058029,
0.05748306214809418,
-0.01124494057148695,
0.04270097240805626,
-0.06464334577322006,
-0.04045839607715607,
0.10621705651283264,
-0.11843521147966385,
0.014915019273757935,
0.011713999323546886,
-0.05728447437286377,
-0.02133248560130596,
0.012435847893357277,
-0.014104345813393593,
-0.01829463616013527,
0.10192044079303741,
-0.09102048724889755,
0.006525468081235886,
-0.09665966778993607,
-0.0909082293510437,
0.007625455502420664,
-0.0836545079946518,
-0.011310617439448833,
-0.08711673319339752,
-0.16880133748054504,
0.0016857271548360586,
0.02005581557750702,
-0.045418281108140945,
-0.0615672804415226,
-0.00343789579346776,
-0.10220152884721756,
0.02330189384520054,
-0.050710998475551605,
0.0762629434466362,
-0.0403628870844841,
0.14153440296649933,
0.06176108866930008,
0.08876190334558487,
0.06449099630117416,
0.055692408233881,
-0.09408795833587646,
0.05959462374448776,
-0.19956855475902557,
0.04897972568869591,
-0.0717310756444931,
0.04635358229279518,
-0.07728152722120285,
-0.12552915513515472,
0.024077879264950752,
-0.031080519780516624,
0.08030153810977936,
0.10432692617177963,
-0.13936063647270203,
-0.10755980014801025,
0.19670695066452026,
-0.1340765357017517,
-0.13240328431129456,
0.1320180594921112,
0.006865303497761488,
-0.007335386238992214,
0.027167005464434624,
0.1448444277048111,
0.0865740105509758,
-0.05726541578769684,
-0.041499003767967224,
-0.029077423736453056,
0.062003180384635925,
-0.015211875550448895,
0.10898708552122116,
0.0048858183436095715,
0.06238441541790962,
0.01862526312470436,
-0.07051508128643036,
0.07162507623434067,
-0.10241039842367172,
-0.08286088705062866,
-0.03817736729979515,
-0.08371500670909882,
0.10524740815162659,
0.08172623813152313,
0.05103885754942894,
-0.06881403177976608,
-0.10684586316347122,
0.007703697308897972,
0.10628914088010788,
-0.07730002701282501,
0.020411085337400436,
-0.09584319591522217,
0.14250726997852325,
-0.09385015815496445,
-0.007345428690314293,
-0.18319864571094513,
-0.027977701276540756,
0.04552377387881279,
-0.04966112598776817,
-0.03369439020752907,
-0.03370233625173569,
0.07209217548370361,
0.07547971606254578,
-0.0868333950638771,
-0.10370778292417526,
-0.06468657404184341,
-0.016311630606651306,
-0.08123819530010223,
-0.2009287327528,
-0.07668963074684143,
-0.03376861289143562,
0.13203728199005127,
-0.19304923713207245,
0.05478259548544884,
0.05750184878706932,
0.15415024757385254,
0.05426481366157532,
-0.025182630866765976,
0.013828609138727188,
0.07052109390497208,
-0.014588608406484127,
-0.08285056799650192,
0.05169256404042244,
0.012902932241559029,
-0.09608852118253708,
0.01608964055776596,
-0.1442473977804184,
0.18024912476539612,
0.11103615909814835,
-0.010038710199296474,
-0.060347963124513626,
0.021753666922450066,
-0.0822431668639183,
-0.03992684185504913,
-0.025694802403450012,
0.0162917859852314,
0.1062743067741394,
0.03244161233305931,
0.17433716356754303,
-0.09249000251293182,
-0.03381485491991043,
0.045160334557294846,
-0.0353977344930172,
0.016714593395590782,
0.1212640032172203,
0.024783484637737274,
-0.07343398034572601,
0.14625626802444458,
0.14466018974781036,
-0.05259896442294121,
0.16499416530132294,
-0.06991198658943176,
-0.09585133194923401,
-0.031225714832544327,
0.01767151989042759,
0.02614535018801689,
0.1223454624414444,
-0.11165142804384232,
-0.009891358204185963,
0.031322915107011795,
-0.009619099088013172,
0.002087511122226715,
-0.19677206873893738,
-0.01900627464056015,
0.015303893946111202,
-0.05338050052523613,
-0.0024825199507176876,
0.025399230420589447,
0.014479940757155418,
0.10244482010602951,
-0.007065406534820795,
-0.04744916781783104,
0.03734242171049118,
0.010981612838804722,
-0.08140640705823898,
0.2011564075946808,
-0.06814930588006973,
-0.23821832239627838,
-0.1509452760219574,
0.024164274334907532,
-0.04152991622686386,
-0.007524355314671993,
0.042096879333257675,
-0.11435535550117493,
-0.030829785391688347,
-0.05512365326285362,
0.04801248386502266,
0.037730488926172256,
0.03336820751428604,
0.001312950043939054,
0.006130521185696125,
0.09112962335348129,
-0.09938562661409378,
0.009136462584137917,
-0.01678718812763691,
-0.05229821801185608,
0.05509866401553154,
0.0764792338013649,
0.08129880577325821,
0.1501150131225586,
-0.025749273598194122,
0.012634210288524628,
-0.021393880248069763,
0.1312340646982193,
-0.10926321893930435,
0.006449774373322725,
0.18596895039081573,
-0.0012899629073217511,
0.06481306999921799,
0.14302602410316467,
0.03819400817155838,
-0.04807563126087189,
0.0020035833586007357,
0.012088018469512463,
-0.017838234081864357,
-0.2316252887248993,
-0.04416852071881294,
-0.04135163873434067,
0.054840102791786194,
0.0998653769493103,
0.037671077996492386,
0.024390581995248795,
0.059022970497608185,
-0.028173396363854408,
0.0077211689203977585,
-0.01067993976175785,
0.07130975276231766,
0.06703000515699387,
0.023493055254220963,
0.1445644348859787,
-0.04555702209472656,
-0.03307066485285759,
0.02620726265013218,
-0.005512477830052376,
0.21309641003608704,
-0.051434945315122604,
0.11464402079582214,
0.05870891734957695,
0.18510301411151886,
0.03384459763765335,
0.06459391117095947,
-0.012638086453080177,
-0.03443341702222824,
0.02194800227880478,
-0.042428888380527496,
-0.03754851222038269,
0.014135215431451797,
-0.015555491670966148,
0.07644528150558472,
-0.13091915845870972,
0.030743325129151344,
0.04216273874044418,
0.2965388596057892,
0.061579905450344086,
-0.3136052191257477,
-0.1309555023908615,
0.013206847012043,
-0.04551213979721069,
-0.04061910882592201,
0.010981002822518349,
0.12148530036211014,
-0.08932607620954514,
0.0591442845761776,
-0.11470310389995575,
0.05078859254717827,
-0.07754349708557129,
0.00317101227119565,
0.0633927583694458,
0.0679495632648468,
0.02276225946843624,
0.06550807505846024,
-0.24702411890029907,
0.30168667435646057,
0.010529298335313797,
0.062337908893823624,
-0.023749686777591705,
0.022296201437711716,
0.012450332753360271,
0.06554657220840454,
0.0692405253648758,
-0.014466805383563042,
-0.005743909627199173,
-0.2197127491235733,
-0.10100496560335159,
-0.005837338976562023,
0.11309593915939331,
-0.04724321886897087,
0.11738216131925583,
-0.041118357330560684,
-0.013720182701945305,
0.026490546762943268,
-0.10898566991090775,
-0.06391329318284988,
-0.07271245121955872,
0.059805728495121,
-0.013102995231747627,
0.0338311567902565,
-0.12709613144397736,
-0.13803869485855103,
-0.052967801690101624,
0.15764406323432922,
-0.09743925929069519,
-0.0815897136926651,
-0.11233928799629211,
0.11331496387720108,
0.1195015087723732,
-0.10150262713432312,
0.04530160501599312,
-0.002679098630324006,
0.11939145624637604,
0.01685265079140663,
-0.024158012121915817,
0.08301005512475967,
-0.06120501831173897,
-0.2632841169834137,
-0.014995058998465538,
0.15515677630901337,
0.012199077755212784,
0.04424398019909859,
-0.02792983502149582,
0.02623068168759346,
0.01205897331237793,
-0.07558801770210266,
0.04565495252609253,
0.023751910775899887,
0.03827129676938057,
0.038756586611270905,
-0.03426244854927063,
-0.033484961837530136,
-0.04662994667887688,
-0.05286184325814247,
0.11815354228019714,
0.2539704740047455,
-0.09097389131784439,
0.04846588149666786,
0.09244982898235321,
-0.042222946882247925,
-0.18657547235488892,
-0.011149333789944649,
0.0747387707233429,
0.026699163019657135,
0.026333609595894814,
-0.15154887735843658,
0.04792483523488045,
0.058327723294496536,
-0.047751788049936295,
0.03977792337536812,
-0.30466219782829285,
-0.12424850463867188,
0.08832721412181854,
0.12227976322174072,
0.025415990501642227,
-0.18483422696590424,
-0.06493137776851654,
0.01469864696264267,
-0.07581315189599991,
0.05022430419921875,
-0.05838819220662117,
0.10878852009773254,
-0.05843394249677658,
0.019178567454218864,
0.014651141129434109,
-0.08036697655916214,
0.12251352518796921,
-0.05102137476205826,
0.047427162528038025,
-0.029239455237984657,
0.006526322104036808,
0.12534202635288239,
-0.04502212628722191,
0.06422143429517746,
-0.040706101804971695,
0.04910191521048546,
-0.09013083577156067,
-0.019175736233592033,
-0.11343841254711151,
0.02811231091618538,
-0.035826899111270905,
-0.046641405671834946,
-0.02773544192314148,
0.0626528188586235,
0.02811216562986374,
-0.016095900908112526,
0.13870885968208313,
0.004323679953813553,
0.15443964302539825,
0.12794944643974304,
0.10481611639261246,
-0.06926753371953964,
-0.08305664360523224,
-0.055054936558008194,
-0.01688764989376068,
0.05084136128425598,
-0.14448443055152893,
0.03391602262854576,
0.11167708784341812,
0.042955245822668076,
0.13782735168933868,
0.07254686951637268,
-0.05445345118641853,
0.049244485795497894,
0.09581267088651657,
-0.12409702688455582,
-0.1445818841457367,
-0.030158836394548416,
0.01086029876023531,
-0.15083883702754974,
0.07757103443145752,
0.10903608798980713,
-0.0745680034160614,
-0.025555742904543877,
-0.006752276327461004,
0.035190094262361526,
-0.020852837711572647,
0.17992761731147766,
0.07612288743257523,
0.0895356759428978,
-0.09019691497087479,
0.09360390156507492,
0.06020371615886688,
-0.1631808876991272,
-0.0017373263835906982,
0.10845976322889328,
-0.1031411737203598,
-0.03825286403298378,
-0.01768725924193859,
0.0868908241391182,
-0.029179705306887627,
-0.03864917531609535,
-0.18355797231197357,
-0.11352081596851349,
0.05878930538892746,
0.11637209355831146,
0.06025131419301033,
0.02648492157459259,
-0.049782391637563705,
0.01733342930674553,
-0.12278880923986435,
0.14717429876327515,
0.09581725299358368,
0.08648859709501266,
-0.18599101901054382,
0.14033928513526917,
0.032481856644153595,
0.05159389600157738,
-0.019136907532811165,
0.019809089601039886,
-0.07032156735658646,
-0.00161671731621027,
-0.08002682030200958,
-0.045473288744688034,
-0.044953852891922,
-0.017099421471357346,
-0.03781390190124512,
-0.062114715576171875,
-0.03848004341125488,
0.026860356330871582,
-0.09132824838161469,
-0.034434787929058075,
0.004671653266996145,
0.04755762591958046,
-0.12678246200084686,
-0.04118433594703674,
0.03788919746875763,
-0.09005393087863922,
0.0992535874247551,
0.01887819543480873,
0.006806550547480583,
0.029336629435420036,
-0.03882509097456932,
0.0070305513218045235,
0.04940973222255707,
-0.0000915924392757006,
0.05208591744303703,
-0.12794815003871918,
0.012646113522350788,
-0.013038421981036663,
-0.002270851284265518,
0.02911487966775894,
0.07574018090963364,
-0.14003819227218628,
0.02766040526330471,
-0.04120277985930443,
-0.028435295447707176,
-0.08625897765159607,
0.08519398421049118,
0.05459553748369217,
0.025179952383041382,
0.1771390438079834,
-0.10018475353717804,
0.04540075734257698,
-0.2299441546201706,
0.004063391592353582,
-0.03330046683549881,
-0.10730709135532379,
-0.0772729143500328,
-0.001322057913057506,
0.07045017182826996,
-0.06437670439481735,
0.06919905543327332,
-0.014659518375992775,
0.07255805283784866,
0.04445381090044975,
-0.07022804766893387,
0.0580633319914341,
0.04100174829363823,
0.16506537795066833,
0.020752348005771637,
-0.029580913484096527,
0.05821740999817848,
0.031056134030222893,
0.045424479991197586,
0.10025366395711899,
0.13086654245853424,
0.1260402351617813,
-0.02328578568994999,
0.056883830577135086,
0.03738962113857269,
-0.07464078068733215,
-0.2314155399799347,
0.030791902914643288,
-0.08252882957458496,
0.10883085429668427,
-0.007194834761321545,
0.1787082701921463,
0.1942925602197647,
-0.15497560799121857,
0.04768002778291702,
-0.020574212074279785,
-0.08299266546964645,
-0.07556083053350449,
-0.05542619153857231,
-0.07379046827554703,
-0.17803232371807098,
0.008349452167749405,
-0.11058816313743591,
0.0040915245190262794,
0.07897987961769104,
0.004977465141564608,
-0.003982082940638065,
0.18771857023239136,
0.09894614666700363,
0.03197475150227547,
0.07076229900121689,
0.036868974566459656,
0.0010892626596614718,
-0.028896834701299667,
-0.05776640400290489,
-0.013461705297231674,
-0.023986218497157097,
0.02684313803911209,
-0.04306725412607193,
-0.10051441937685013,
0.047078099101781845,
0.011012655682861805,
-0.11721333116292953,
0.03299381583929062,
0.0126644903793931,
0.06042313203215599,
0.08443304151296616,
-0.0014200819423422217,
0.012056379579007626,
-0.03728938102722168,
0.22281259298324585,
-0.07783947139978409,
-0.037751417607069016,
-0.11797386407852173,
0.26952701807022095,
0.024801185354590416,
-0.03583468496799469,
0.048060934990644455,
-0.08459458500146866,
-0.02447541430592537,
0.17742151021957397,
0.1436864584684372,
0.004739831667393446,
-0.0049538398161530495,
0.00435443501919508,
-0.016069531440734863,
-0.0720064640045166,
0.09901607036590576,
0.1311061531305313,
0.10076098889112473,
-0.0917825922369957,
-0.028445161879062653,
-0.06634681671857834,
-0.039719972759485245,
-0.057130955159664154,
0.07332388311624527,
0.016763709485530853,
0.039306558668613434,
-0.059553295373916626,
0.0780177041888237,
-0.0017251832177862525,
-0.09621290862560272,
0.027223113924264908,
-0.208564892411232,
-0.15567906200885773,
-0.033051103353500366,
0.08062940090894699,
-0.013880363665521145,
0.06251759827136993,
0.01786303147673607,
0.0015179013134911656,
0.08025594800710678,
0.013283192180097103,
-0.04680636152625084,
-0.06017318740487099,
0.09660458564758301,
-0.13844946026802063,
0.1821601688861847,
-0.05671003833413124,
0.024271728470921516,
0.13136425614356995,
0.03436596691608429,
-0.11302396655082703,
0.030809570103883743,
0.06215471401810646,
-0.04305147007107735,
0.009904912672936916,
0.1930973380804062,
-0.00414170091971755,
0.0625092163681984,
0.03620089218020439,
-0.11710979789495468,
-0.0014218945289030671,
-0.11931642144918442,
-0.04632306471467018,
-0.05099937319755554,
0.004960320889949799,
-0.010384016670286655,
0.1193259134888649,
0.20395071804523468,
-0.05708000063896179,
-0.018178638070821762,
-0.07456304877996445,
0.005746922455728054,
0.058754391968250275,
0.06130759418010712,
-0.00028500688495114446,
-0.2514480948448181,
0.022078799083828926,
0.034493789076805115,
0.010162580758333206,
-0.24921615421772003,
-0.08643898367881775,
0.04512389376759529,
-0.0643182322382927,
-0.12254319339990616,
0.09370589256286621,
0.024666907265782356,
0.059431903064250946,
-0.03433184325695038,
-0.057116229087114334,
-0.07685577869415283,
0.16184775531291962,
-0.20532383024692535,
-0.07510306686162949
] |
null | null | null | # MentaLLaMA-chat-7b-GGUF-q8
This model is a GGUF version of the MentaLLaMA model found [here](https://huggingface.co/klyang/MentaLLaMA-chat-7B).
The process for converting the model has been documented [here](https://www.substratus.ai/blog/converting-hf-model-gguf-model/).
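For reference, GGUF files such as this one are typically loaded with llama.cpp or its Python bindings. A minimal sketch with `llama-cpp-python` follows; the `.gguf` file name and the prompt are assumptions — check the repository's file listing for the actual name.

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# File name is assumed — use the actual .gguf file shipped in this repository.
llm = Llama(model_path="MentaLLaMA-chat-7b-q8_0.gguf", n_ctx=2048)

output = llm("What are common signs of burnout?", max_tokens=128)
print(output["choices"][0]["text"])
```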
| {"license": "mit"} | null | WesselvanGils/MentaLLaMA-chat-7b-GGUF-q8 | [
"gguf",
"license:mit",
"region:us"
] | 2024-02-12T12:13:33+00:00 | [] | [] | TAGS
#gguf #license-mit #region-us
| # MentaLLaMA-chat-7b-GGUF-q8
This model is a GGUF version of the MentaLLaMA model found here
The process for converting the model has been documented here
| [
"# MentaLLaMA-chat-7b-GGUF-q8\n\nThis model is a GGUF version of the MentaLLama model found here\nThe process for converting the model has been documented here"
] | [
"TAGS\n#gguf #license-mit #region-us \n",
"# MentaLLaMA-chat-7b-GGUF-q8\n\nThis model is a GGUF version of the MentaLLama model found here\nThe process for converting the model has been documented here"
] | [
14,
45
] | [
"passage: TAGS\n#gguf #license-mit #region-us \n# MentaLLaMA-chat-7b-GGUF-q8\n\nThis model is a GGUF version of the MentaLLama model found here\nThe process for converting the model has been documented here"
] | [
-0.0359986312687397,
0.10290276259183884,
0.0018634957959875464,
0.009009966626763344,
0.05683543160557747,
0.06802480667829514,
0.21322967112064362,
-0.0032346276566386223,
0.09964434802532196,
-0.06725970655679703,
0.10222966969013214,
0.0236556027084589,
0.13207678496837616,
0.0336231030523777,
0.13379602134227753,
-0.23100195825099945,
0.00022979103960096836,
-0.03676655888557434,
-0.11004092544317245,
0.07669475674629211,
0.0545501783490181,
-0.0010722544975578785,
0.11155464500188828,
0.014347356744110584,
-0.014500663615763187,
-0.009033666923642159,
-0.020261159166693687,
0.03632929176092148,
0.053360216319561005,
0.11626695841550827,
-0.06646392494440079,
0.092412568628788,
0.01914397068321705,
-0.11888951808214188,
0.020626969635486603,
-0.12304108589887619,
-0.07882147282361984,
0.047858867794275284,
-0.02005019225180149,
0.17374815046787262,
0.1679314821958542,
0.14470580220222473,
-0.06031360104680061,
0.04405366629362106,
-0.10701243579387665,
0.012837606482207775,
-0.15246278047561646,
-0.001898768823593855,
0.10425598174333572,
0.07752519845962524,
0.0069698309525847435,
0.12671752274036407,
-0.12552715837955475,
-0.006801180075854063,
0.02819213829934597,
-0.30122002959251404,
0.03772132471203804,
0.2768930196762085,
0.08381752669811249,
0.07673292607069016,
-0.05707542970776558,
0.08012273162603378,
0.005150061100721359,
-0.047169581055641174,
-0.09139110147953033,
-0.09593768417835236,
0.14899972081184387,
-0.0002522804425098002,
-0.08887237310409546,
-0.005711430683732033,
0.19662947952747345,
0.10188475251197815,
-0.03449808806180954,
-0.0319952517747879,
-0.027615154162049294,
0.038728971034288406,
-0.03672165796160698,
-0.01114328857511282,
-0.014223986305296421,
0.06878245621919632,
0.023218613117933273,
-0.1645844429731369,
-0.021873757243156433,
-0.07013186812400818,
-0.09557008743286133,
0.1543087512254715,
-0.06568891555070877,
0.06866694986820221,
-0.055041391402482986,
0.045075397938489914,
-0.19143497943878174,
-0.04675109684467316,
-0.07348940521478653,
-0.06536704301834106,
0.011174418963491917,
-0.020485935732722282,
0.025990398600697517,
0.035421568900346756,
0.16398969292640686,
0.030223362147808075,
-0.1029927060008049,
-0.003613923443481326,
0.07850495725870132,
0.06227405369281769,
-0.06437389552593231,
0.0935257226228714,
-0.06053762510418892,
-0.037705134600400925,
-0.03086482547223568,
-0.1475241333246231,
-0.06790023297071457,
-0.027479834854602814,
-0.2193985879421234,
-0.08186808973550797,
-0.14006353914737701,
0.12560009956359863,
-0.10679367929697037,
0.07028868049383163,
0.006760744843631983,
-0.09488283097743988,
0.07958173751831055,
0.02222822606563568,
-0.06119156256318092,
-0.04348482936620712,
-0.014953079633414745,
-0.04342183098196983,
-0.08136416226625443,
0.028224138543009758,
0.08873031288385391,
-0.13767574727535248,
-0.08546911180019379,
-0.029485369101166725,
-0.02668691799044609,
-0.023659149184823036,
-0.002149541163817048,
-0.06995592266321182,
0.03173701837658882,
-0.11313562840223312,
-0.21178224682807922,
-0.012062694877386093,
-0.05515290051698685,
0.01806604117155075,
0.08011127263307571,
-0.0453147329390049,
0.09777697175741196,
0.00846653338521719,
-0.0253922026604414,
0.035976652055978775,
-0.059971656650304794,
0.023382099345326424,
-0.05181325972080231,
0.1147657260298729,
-0.1273914873600006,
0.019095858559012413,
0.012358900159597397,
0.04755859822034836,
-0.13623851537704468,
0.002060286235064268,
-0.09359509497880936,
0.11819557845592499,
-0.057279717177152634,
0.05575341731309891,
-0.05518731847405434,
-0.021198319271206856,
-0.029317831620573997,
0.23366494476795197,
-0.1529868245124817,
-0.06778810918331146,
0.2005195915699005,
-0.022497791796922684,
-0.08006098121404648,
-0.048644375056028366,
0.015268020331859589,
0.0931125059723854,
0.012986964546144009,
0.26900622248649597,
0.13667026162147522,
0.032173290848731995,
0.1037009134888649,
0.09326158463954926,
-0.05360792577266693,
-0.08844740688800812,
0.1629844307899475,
-0.09752658754587173,
-0.16317866742610931,
0.029728684574365616,
-0.13124987483024597,
0.14493998885154724,
-0.05324516445398331,
-0.025598861277103424,
0.05332200601696968,
-0.07033820450305939,
-0.05369299650192261,
-0.11897651851177216,
0.1064109355211258,
0.07303286343812943,
0.06929437071084976,
-0.12849324941635132,
0.11226654052734375,
0.027772145345807076,
-0.06609109789133072,
-0.13180600106716156,
0.10085084289312363,
-0.010439428500831127,
0.031139688566327095,
0.015303054824471474,
-0.0441136434674263,
0.005094438325613737,
0.0030879571568220854,
0.025484329089522362,
0.08201954513788223,
0.06799150258302689,
0.03184344246983528,
0.03293333947658539,
0.06315820664167404,
0.06698209047317505,
0.015102190896868706,
0.03635699301958084,
-0.063360795378685,
0.02086903527379036,
-0.024281971156597137,
0.08328185230493546,
-0.01867116056382656,
-0.023415498435497284,
0.05538177117705345,
0.045658718794584274,
-0.05452364310622215,
-0.04187772795557976,
0.07832640409469604,
-0.08925200253725052,
0.042958177626132965,
-0.05517738685011864,
0.03613833710551262,
0.005894978530704975,
-0.17863570153713226,
0.19184823334217072,
-0.010281691327691078,
0.22955480217933655,
0.11530527472496033,
0.14086651802062988,
-0.09368427842855453,
-0.09629475325345993,
-0.02241619862616062,
0.054576288908720016,
0.018881093710660934,
0.006877114064991474,
0.09325234591960907,
-0.05351514369249344,
0.07908245176076889,
0.024038420990109444,
-0.007899980992078781,
0.04526796191930771,
-0.030714839696884155,
-0.023309245705604553,
0.08136780560016632,
0.19948413968086243,
-0.2800223231315613,
0.059017620980739594,
0.17012062668800354,
0.12923116981983185,
0.2799268364906311,
-0.037515170872211456,
0.004240537062287331,
-0.10143062472343445,
0.08315911144018173,
-0.011518794111907482,
0.13300767540931702,
-0.1908978521823883,
-0.004711327143013477,
0.023527126759290695,
0.017062272876501083,
0.06938723474740982,
-0.04269781708717346,
-0.12047335505485535,
0.020503275096416473,
-0.06520251929759979,
-0.03286043927073479,
0.0806969627737999,
-0.16581808030605316,
0.07001416385173798,
0.020716367289423943,
-0.06397327035665512,
0.024767674505710602,
0.01867046393454075,
-0.11243044584989548,
0.1710761934518814,
-0.10752158612012863,
-0.1504254937171936,
-0.12166344374418259,
-0.1154620349407196,
-0.02405325509607792,
-0.006155459210276604,
0.020718112587928772,
-0.12317665666341782,
-0.008431720547378063,
0.013525566086173058,
0.05645575746893883,
-0.055421292781829834,
0.007776676211506128,
0.13713452219963074,
-0.1411147266626358,
-0.026436109095811844,
-0.09725379943847656,
-0.05694695562124252,
-0.019895564764738083,
-0.07817606627941132,
0.06748778373003006,
-0.2129136323928833,
0.07251216471195221,
0.13585098087787628,
0.02325567789375782,
0.09982834756374359,
0.016360729932785034,
0.3367328941822052,
-0.05386688560247421,
-0.0026664468459784985,
0.0950523316860199,
0.14149633049964905,
-0.04330425336956978,
0.016536274924874306,
0.038689762353897095,
-0.09549952298402786,
-0.04658839851617813,
-0.06553371250629425,
-0.12820060551166534,
-0.08238662034273148,
-0.05151986703276634,
-0.0338536761701107,
0.004258343949913979,
-0.07815734297037125,
0.06025339663028717,
0.15719319880008698,
0.10480198264122009,
0.04751380905508995,
-0.040430791676044464,
0.008352668024599552,
0.027864592149853706,
0.08417978882789612,
-0.08748738467693329,
-0.011367099359631538,
-0.04001579061150551,
0.010490577667951584,
0.13355417549610138,
0.07633433490991592,
0.11605875194072723,
0.24588380753993988,
0.029869787395000458,
0.07122798264026642,
-0.012594148516654968,
0.09340053796768188,
0.013476190157234669,
0.01396782137453556,
-0.0781603679060936,
-0.08420702069997787,
-0.06719252467155457,
0.01963438279926777,
0.02149524912238121,
0.07686194032430649,
-0.1904195100069046,
-0.05600326135754585,
-0.08224658668041229,
0.026223009452223778,
-0.10981962084770203,
0.06057288497686386,
-0.12976358830928802,
-0.03795096278190613,
-0.015513447113335133,
0.0393284372985363,
0.025318318977952003,
0.03990842029452324,
-0.07935390621423721,
-0.054050397127866745,
-0.0011312072165310383,
0.04011337459087372,
0.049710463732481,
0.06337980926036835,
0.033337462693452835,
-0.016424736008048058,
-0.039940737187862396,
-0.04995519667863846,
0.037661582231521606,
-0.1879020482301712,
0.22083652019500732,
0.06219949945807457,
-0.015126639045774937,
0.05721120163798332,
-0.04671672359108925,
0.07765727490186691,
0.1936272233724594,
0.18106213212013245,
0.09473521262407303,
-0.03789006173610687,
-0.1014452576637268,
-0.07181229442358017,
0.04512744024395943,
0.05305478349328041,
-0.11842701584100723,
-0.013853131793439388,
0.04894643276929855,
0.06111927703022957,
0.011085165664553642,
0.1290767937898636,
-0.12636983394622803,
-0.015682898461818695,
0.021801071241497993,
0.1107921227812767,
0.08347741514444351,
-0.028236914426088333,
0.0374506451189518,
-0.09972002357244492,
0.12751390039920807,
-0.012213604524731636,
-0.030356545001268387,
-0.11209718883037567,
-0.06955213844776154,
0.033098287880420685,
-0.052519746124744415,
-0.014372240751981735,
-0.026876984164118767,
-0.038031332194805145,
-0.037186149507761,
-0.11286651343107224,
0.0697653591632843,
-0.06513265520334244,
-0.015226410701870918,
-0.050980404019355774,
0.06335495412349701,
-0.014865160919725895,
0.026400495320558548,
0.03752351179718971,
-0.09057120978832245,
0.03521165996789932,
-0.2207506000995636,
0.08820287138223648,
0.06789150834083557,
-0.13660113513469696,
-0.03816569223999977,
0.04514209181070328,
0.1172897219657898,
-0.00909517053514719,
-0.10530894249677658,
0.14396755397319794,
0.20892193913459778,
-0.07672064751386642,
0.11144751310348511,
0.2573416233062744,
-0.041282329708337784,
-0.1358238160610199,
-0.12433839589357376,
-0.1064256802201271,
0.016963431611657143,
0.04374346137046814,
-0.16561980545520782,
-0.055631525814533234,
0.07160796225070953,
-0.01853759214282036,
0.1285737007856369,
-0.35474997758865356,
-0.07835797220468521,
0.14671039581298828,
0.093654103577137,
0.4710884690284729,
-0.08606772869825363,
-0.0918898954987526,
0.021568750962615013,
-0.2995056211948395,
0.09514395892620087,
-0.017449934035539627,
0.11696821451187134,
-0.04067619889974594,
0.15598319470882416,
-0.026371723040938377,
-0.018563836812973022,
0.18759605288505554,
0.049141284078359604,
-0.024556733667850494,
-0.026497960090637207,
-0.09602149575948715,
0.026938995346426964,
0.02187611535191536,
0.04953357204794884,
-0.016715168952941895,
0.022739576175808907,
-0.14006145298480988,
-0.047260601073503494,
0.00002406442945357412,
0.08803107589483261,
0.006592818535864353,
-0.12050316482782364,
-0.03670746460556984,
0.12558697164058685,
-0.014623499475419521,
-0.0012279001530259848,
-0.1411549150943756,
-0.08114651590585709,
0.025409812107682228,
0.053398821502923965,
0.04378921166062355,
-0.09997919201850891,
-0.044931791722774506,
-0.01913049817085266,
-0.04191554710268974,
0.1195392981171608,
-0.02760683186352253,
-0.06369806081056595,
0.07904224097728729,
0.04565533995628357,
0.06338879466056824,
0.031338196247816086,
-0.06327638030052185,
0.06918922066688538,
0.06031474471092224,
-0.07313169538974762,
-0.18698281049728394,
-0.10486579686403275,
0.02172822691500187,
0.15570895373821259,
0.09140115976333618,
0.1168421059846878,
-0.016879910603165627,
0.03608618676662445,
0.042894668877124786,
-0.012843318283557892,
-0.09743855893611908,
0.11668621748685837,
0.023356955498456955,
0.03640172258019447,
-0.05132954567670822,
-0.013425922021269798,
-0.06149189919233322,
0.11615053564310074,
-0.035839904099702835,
0.001320664887316525,
-0.15578347444534302,
-0.0520738884806633,
-0.2639767527580261,
0.04776596277952194,
-0.3810242712497711,
-0.05778934806585312,
0.013364894315600395,
-0.06533322483301163,
0.012859746813774109,
0.027219543233513832,
-0.027518827468156815,
0.1394081711769104,
-0.03544146940112114,
-0.03284531086683273,
0.08703216910362244,
-0.12084785103797913,
-0.158497616648674,
-0.04598480463027954,
-0.03989040479063988,
-0.055278975516557693,
-0.03059113398194313,
0.06502437591552734,
-0.06692861020565033,
-0.0937001183629036,
-0.19672484695911407,
-0.025275589898228645,
-0.12790685892105103,
-0.061161108314991,
-0.11746592819690704,
-0.008065457455813885,
-0.038245897740125656,
0.002237201202660799,
-0.025825191289186478,
0.041906408965587616,
-0.06543763726949692,
0.022899890318512917,
-0.01569906249642372,
0.04456321522593498,
0.07735154032707214,
0.04422822967171669,
-0.010194103233516216,
0.04146827757358551,
0.06442343443632126,
0.055103715509176254,
0.05226074531674385,
0.08552069962024689,
-0.16435492038726807,
0.1032535582780838,
-0.033262740820646286,
-0.031504418700933456,
0.035680703818798065,
-0.09258377552032471,
-0.005132634658366442,
0.011945505626499653,
-0.05215674266219139,
0.04801264405250549,
-0.0614074170589447,
-0.0008296493324451149,
-0.005891426000744104,
-0.09925702959299088,
0.03524598851799965,
-0.04686998575925827,
0.027552317827939987,
0.09488270431756973,
0.09020595252513885,
0.025960464030504227,
-0.004004704300314188,
0.05416664853692055,
-0.08429385721683502,
0.024789374321699142,
0.034556612372398376,
-0.01353102270513773,
-0.01874145306646824,
-0.03880046680569649,
-0.01813138835132122,
-0.027468128129839897,
0.324419230222702,
0.027156077325344086,
0.03471223637461662,
-0.020578093826770782,
0.027023237198591232,
0.11142858862876892,
-0.11076094210147858,
0.3280380666255951,
0.09959467500448227,
0.0448121502995491,
-0.05504627153277397,
0.12227319926023483,
-0.07235945016145706,
-0.15937866270542145,
0.11267507821321487,
-0.03233376890420914,
-0.14226670563220978,
-0.004079807549715042,
0.0634755939245224,
-0.05580483376979828,
-0.06040557846426964,
-0.04551991447806358,
0.01275924313813448,
0.04252143204212189,
-0.06721597909927368,
-0.010517855174839497,
0.18957428634166718,
-0.10065358132123947,
0.04018314555287361,
0.06687748432159424,
-0.008838159963488579,
-0.0932149738073349,
-0.17461727559566498,
-0.019115380942821503,
-0.1677931547164917,
0.07678277045488358,
-0.07619724422693253,
0.045916665345430374,
-0.009294092655181885,
0.006449734326452017,
-0.02057502046227455,
0.08271361887454987,
-0.10067161172628403,
-0.0972171202301979,
0.03513811156153679,
-0.02266606129705906,
-0.052139993757009506,
-0.18028044700622559,
-0.002497066045179963,
-0.021277423948049545,
-0.031534310430288315,
-0.019481034949421883,
-0.03887440264225006,
0.015238531865179539,
-0.06171681731939316,
-0.020602798089385033,
0.033343322575092316,
-0.10026450455188751,
0.00670512905344367,
-0.015131553635001183,
0.09066516160964966,
0.0042855748906731606,
-0.029079625383019447,
0.008122286759316921,
0.03852837160229683,
0.0072393715381622314,
-0.08740638941526413,
-0.03634844347834587,
0.03758765012025833,
-0.04111072048544884,
0.09562676399946213,
-0.054251447319984436,
0.016946254298090935,
0.027709560468792915,
0.24340029060840607,
0.27393633127212524,
-0.09516287595033646,
0.00256621022708714,
-0.0012106143403798342,
0.020112916827201843,
0.054441194981336594,
0.1996879130601883,
0.007694211322814226,
0.20880025625228882,
-0.025539904832839966,
-0.0188507791608572,
-0.061312876641750336,
-0.013746358454227448,
0.09032592922449112,
0.03422151133418083,
0.11975131183862686,
-0.02600703202188015,
-0.08533384650945663,
0.11757879704236984,
-0.1669948548078537,
-0.07103321701288223,
-0.06233334168791771,
0.012837382033467293,
0.06465914845466614,
-0.06927528232336044,
-0.08221931755542755,
0.032059624791145325,
0.048763275146484375,
-0.09457219392061234,
-0.02517087385058403,
-0.050438351929187775,
0.02288181707262993,
-0.24727363884449005,
-0.08399464190006256,
0.13090798258781433,
0.10248604416847229,
0.12390480935573578,
-0.03697078302502632,
0.07149037718772888,
0.055257271975278854,
-0.00895946566015482,
-0.007402648683637381,
0.13627788424491882,
0.01893940009176731,
-0.03389431908726692,
-0.07233824580907822,
-0.13023535907268524,
0.05999220907688141,
0.010773248039186,
-0.1049993634223938,
-0.09807473421096802,
0.10059529542922974,
0.11948321759700775,
0.005173280369490385,
-0.03969333693385124,
0.10541152209043503,
-0.08892463147640228,
0.1587449312210083,
0.04715670272707939,
-0.04084270820021629,
-0.033693503588438034,
-0.032492175698280334,
0.08752132952213287,
0.022704504430294037,
0.04159752279520035,
-0.06321966648101807,
-0.009631749242544174,
-0.12273705750703812,
-0.036150142550468445,
-0.04141511023044586,
-0.1954590529203415,
-0.08080939948558807,
-0.13321061432361603,
0.03211493417620659,
0.0914280116558075,
0.10855817049741745,
0.1636572927236557,
0.06549838185310364,
0.025672556832432747,
-0.11719419807195663,
0.009514094330370426,
0.03492029011249542,
-0.029625389724969864,
-0.08660116791725159
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
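
No usage snippet is provided in this card. Since the repository name suggests a LoRA adapter for a Mistral base, a minimal loading sketch might look like the following; the base model ID, the adapter assumption, and the prompt are guesses made for illustration, not documented facts about this repository.

```python
# Minimal sketch assuming this repo is a LoRA adapter on a Mistral-7B base (unverified).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"        # assumed base model
adapter_id = "ISTNetworks/mistral-v2-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype="auto", device_map="auto"  # device_map needs accelerate installed
)
model = PeftModel.from_pretrained(base, adapter_id)  # attaches the LoRA weights

prompt = "Explain what a LoRA adapter is in one sentence."  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```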
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | ISTNetworks/mistral-v2-lora | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:22:04+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
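
No usage snippet is provided. Given the `llama` and `text-generation` tags, a minimal transformers loading sketch would look roughly like this; the prompt and generation settings are placeholders chosen for illustration, not guidance from the model authors.

```python
# Minimal text-generation sketch for this checkpoint; prompt and settings are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "weifar/codellama-7b-SCdetecting"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"  # device_map needs accelerate installed
)

prompt = "Write a Python function that checks whether a number is prime."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```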
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | weifar/codellama-7b-SCdetecting | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T12:23:48+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06061961501836777,
0.15481999516487122,
-0.004844071343541145,
0.02074851468205452,
0.0983177199959755,
0.007407687604427338,
0.07119518518447876,
0.11185134947299957,
-0.023851769044995308,
0.1167980208992958,
0.031993988901376724,
0.09781743586063385,
0.11217817664146423,
0.16186554729938507,
0.0015333457849919796,
-0.22897611558437347,
0.049678247421979904,
-0.125278040766716,
-0.0294334813952446,
0.11977242678403854,
0.1422213912010193,
-0.10954539477825165,
0.0752737894654274,
-0.038042325526475906,
-0.005828251596540213,
-0.0323176346719265,
-0.06205610930919647,
-0.05266609415411949,
0.05311284959316254,
0.06794639676809311,
0.07308239489793777,
0.01171939354389906,
0.09106900542974472,
-0.2724283039569855,
0.02348201349377632,
0.0805930644273758,
-0.0006441773730330169,
0.07586129754781723,
0.04993962123990059,
-0.08749990910291672,
0.07524524629116058,
-0.060156844556331635,
0.1498761922121048,
0.07955671846866608,
-0.09018243104219437,
-0.19217631220817566,
-0.07921334356069565,
0.09916994720697403,
0.1890910118818283,
0.05953684076666832,
-0.026427440345287323,
0.11642678081989288,
-0.08593545109033585,
0.013638701289892197,
0.06446459144353867,
-0.06054406240582466,
-0.055855002254247665,
0.06904532760381699,
0.08335285633802414,
0.08567540347576141,
-0.12976622581481934,
-0.010767064057290554,
0.015032444149255753,
0.008952446281909943,
0.08948688954114914,
0.017146794125437737,
0.1335189938545227,
0.040557652711868286,
-0.13501930236816406,
-0.043155476450920105,
0.09761431813240051,
0.03665134683251381,
-0.04888195917010307,
-0.2485782504081726,
-0.023432478308677673,
-0.04339504987001419,
-0.03198111802339554,
-0.03649339824914932,
0.043764639645814896,
-0.014506848528981209,
0.07738617807626724,
-0.004502781666815281,
-0.0837155357003212,
-0.04301247000694275,
0.07241875678300858,
0.06128999963402748,
0.02571401372551918,
-0.015821760520339012,
0.0059297760017216206,
0.12327717989683151,
0.11431120336055756,
-0.126715749502182,
-0.052547648549079895,
-0.06306339055299759,
-0.08449548482894897,
-0.044861067086458206,
0.030838407576084137,
0.037995077669620514,
0.045936476439237595,
0.23867325484752655,
0.007765117567032576,
0.053257301449775696,
0.04455438256263733,
0.014407169073820114,
0.06501194834709167,
0.11008983850479126,
-0.05894824117422104,
-0.09719445556402206,
-0.028582042083144188,
0.10156717151403427,
0.007986726239323616,
-0.04139331728219986,
-0.05712985619902611,
0.07059531658887863,
0.018587570637464523,
0.12360043078660965,
0.08000938594341278,
0.003056557849049568,
-0.0755772516131401,
-0.062465377151966095,
0.17764076590538025,
-0.15825673937797546,
0.04532013460993767,
0.03055616281926632,
-0.0341108962893486,
-0.009745313785970211,
0.012105142697691917,
0.025474950671195984,
-0.021481726318597794,
0.09522198140621185,
-0.05601342022418976,
-0.034448131918907166,
-0.11389608681201935,
-0.03694311901926994,
0.030394554138183594,
0.011153047904372215,
-0.02865210548043251,
-0.03502652049064636,
-0.08865131437778473,
-0.06405586749315262,
0.09101516753435135,
-0.07148737460374832,
-0.04784895107150078,
-0.016645915806293488,
-0.07833752781152725,
0.021804187446832657,
0.01691517047584057,
0.09064167737960815,
-0.0222476739436388,
0.03985358029603958,
-0.0550384595990181,
0.061440225690603256,
0.11723454296588898,
0.027987057343125343,
-0.05787884071469307,
0.061519939452409744,
-0.2424532175064087,
0.10252492874860764,
-0.07715212553739548,
0.04971238598227501,
-0.15203025937080383,
-0.02478341944515705,
0.03986154496669769,
0.01284773275256157,
-0.008251311257481575,
0.14196595549583435,
-0.21994100511074066,
-0.030957341194152832,
0.16964265704154968,
-0.10025953501462936,
-0.08109250664710999,
0.060782887041568756,
-0.05354252830147743,
0.11210215091705322,
0.04557164013385773,
-0.02375967986881733,
0.05775221437215805,
-0.14725260436534882,
-0.011030761525034904,
-0.041942402720451355,
-0.0180682260543108,
0.16207332909107208,
0.0703711211681366,
-0.06047816202044487,
0.07456906884908676,
0.01960151270031929,
-0.014246034435927868,
-0.04887177795171738,
-0.02822130173444748,
-0.1047162413597107,
0.01184528972953558,
-0.06102835759520531,
0.018109694123268127,
-0.021768750622868538,
-0.09445013850927353,
-0.029118487611413002,
-0.17402999103069305,
-0.0031633328180760145,
0.08821269869804382,
-0.011630427092313766,
-0.021509924903512,
-0.11245372891426086,
0.009332616813480854,
0.030967719852924347,
0.0002618339203763753,
-0.13677829504013062,
-0.06033218279480934,
0.026970699429512024,
-0.16097871959209442,
0.029791243374347687,
-0.05741601809859276,
0.04530094936490059,
0.04005871340632439,
-0.03433511033654213,
-0.03489551320672035,
0.010874404571950436,
0.010431389324367046,
-0.01894843392074108,
-0.25422003865242004,
-0.01882786676287651,
-0.0234990194439888,
0.1751047968864441,
-0.22956320643424988,
0.042598169296979904,
0.07489731162786484,
0.1460893303155899,
0.007349682506173849,
-0.03550100699067116,
0.015185600146651268,
-0.07262228429317474,
-0.03268764168024063,
-0.06316669285297394,
-0.01207790058106184,
-0.038400664925575256,
-0.05820201337337494,
0.04906858503818512,
-0.1686294972896576,
-0.030321966856718063,
0.10717973858118057,
0.06342670321464539,
-0.1473218947649002,
-0.02780107781291008,
-0.04056945815682411,
-0.04624456167221069,
-0.06676914542913437,
-0.05461418256163597,
0.11812574416399002,
0.056411582976579666,
0.04860803112387657,
-0.07140495628118515,
-0.07455260306596756,
0.008036690764129162,
-0.01956399530172348,
-0.014917809516191483,
0.09334591031074524,
0.07554110884666443,
-0.12264352291822433,
0.09177418053150177,
0.09668384492397308,
0.08576478064060211,
0.10314212739467621,
-0.014663571491837502,
-0.08914592862129211,
-0.040637146681547165,
0.02245822176337242,
0.016187267377972603,
0.15129362046718597,
-0.012961224652826786,
0.055492039769887924,
0.0358695350587368,
-0.014034898020327091,
0.011105312965810299,
-0.09736533463001251,
0.02655916102230549,
0.030835967510938644,
-0.016302183270454407,
0.03745110332965851,
-0.0447014644742012,
0.019208140671253204,
0.09039704501628876,
0.040895868092775345,
0.040978945791721344,
0.010155045427381992,
-0.04354988783597946,
-0.11037563532590866,
0.1787576973438263,
-0.12389461696147919,
-0.24818050861358643,
-0.13812170922756195,
0.010281167924404144,
0.04737642779946327,
-0.010411068797111511,
0.006690691225230694,
-0.06616118550300598,
-0.1175973042845726,
-0.09878289699554443,
0.018617089837789536,
0.045352302491664886,
-0.07590975612401962,
-0.06842505931854248,
0.06414616107940674,
0.03875524550676346,
-0.13939815759658813,
0.024007495492696762,
0.04662325978279114,
-0.08205481618642807,
-0.0029386086389422417,
0.0791812464594841,
0.06965780258178711,
0.17661017179489136,
0.013885351829230785,
-0.023669935762882233,
0.026634456589818,
0.20819635689258575,
-0.1436755359172821,
0.10975687950849533,
0.13545554876327515,
-0.08767466992139816,
0.08120133727788925,
0.1998777538537979,
0.03777998685836792,
-0.10680917650461197,
0.03608465939760208,
0.028374753892421722,
-0.028325283899903297,
-0.2502254545688629,
-0.06958996504545212,
0.0019060121849179268,
-0.05172049254179001,
0.07064855098724365,
0.08791537582874298,
0.09593888372182846,
0.016860228031873703,
-0.09976044297218323,
-0.07697858661413193,
0.046900223940610886,
0.10824491083621979,
-0.00015424020239152014,
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | transformers |
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 6.152068138122559
f1_macro: 0.002214415118096559
f1_micro: 0.012527101903155865
f1_weighted: 0.0022165489799304268
precision_macro: 0.0015895320987927826
precision_micro: 0.012527101903155867
precision_weighted: 0.0015910638088373914
recall_macro: 0.012515042117930204
recall_micro: 0.012527101903155867
recall_weighted: 0.012527101903155867
accuracy: 0.012527101903155867
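
A minimal inference sketch (not part of the original card): it assumes the standard `transformers` image-classification pipeline and uses the repository id shown in this card's metadata; the image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned ResNet classifier from the Hub (repo id taken from this card's metadata)
classifier = pipeline(
    "image-classification",
    model="IsaacMwesigwa/footballer-recognition-gray-nobg",
)

# "player.jpg" is a placeholder path; any local image file or image URL works with the pipeline
predictions = classifier("player.jpg", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```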
| {"tags": ["autotrain", "image-classification"], "datasets": ["footballer-recognition-gray-nobg/autotrain-data"], "widget": [{"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg", "example_title": "Tiger"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg", "example_title": "Teapot"}, {"src": "https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg", "example_title": "Palace"}]} | image-classification | IsaacMwesigwa/footballer-recognition-gray-nobg | [
"transformers",
"safetensors",
"resnet",
"image-classification",
"autotrain",
"dataset:footballer-recognition-gray-nobg/autotrain-data",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:26:15+00:00 | [] | [] | TAGS
#transformers #safetensors #resnet #image-classification #autotrain #dataset-footballer-recognition-gray-nobg/autotrain-data #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 6.152068138122559
f1_macro: 0.002214415118096559
f1_micro: 0.012527101903155865
f1_weighted: 0.0022165489799304268
precision_macro: 0.0015895320987927826
precision_micro: 0.012527101903155867
precision_weighted: 0.0015910638088373914
recall_macro: 0.012515042117930204
recall_micro: 0.012527101903155867
recall_weighted: 0.012527101903155867
accuracy: 0.012527101903155867
| [
"# Model Trained Using AutoTrain\n\n- Problem type: Image Classification",
"## Validation Metricsg\nloss: 6.152068138122559\n\nf1_macro: 0.002214415118096559\n\nf1_micro: 0.012527101903155865\n\nf1_weighted: 0.0022165489799304268\n\nprecision_macro: 0.0015895320987927826\n\nprecision_micro: 0.012527101903155867\n\nprecision_weighted: 0.0015910638088373914\n\nrecall_macro: 0.012515042117930204\n\nrecall_micro: 0.012527101903155867\n\nrecall_weighted: 0.012527101903155867\n\naccuracy: 0.012527101903155867"
] | [
"TAGS\n#transformers #safetensors #resnet #image-classification #autotrain #dataset-footballer-recognition-gray-nobg/autotrain-data #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoTrain\n\n- Problem type: Image Classification",
"## Validation Metricsg\nloss: 6.152068138122559\n\nf1_macro: 0.002214415118096559\n\nf1_micro: 0.012527101903155865\n\nf1_weighted: 0.0022165489799304268\n\nprecision_macro: 0.0015895320987927826\n\nprecision_micro: 0.012527101903155867\n\nprecision_weighted: 0.0015910638088373914\n\nrecall_macro: 0.012515042117930204\n\nrecall_micro: 0.012527101903155867\n\nrecall_weighted: 0.012527101903155867\n\naccuracy: 0.012527101903155867"
] | [
65,
16,
151
] | [
"passage: TAGS\n#transformers #safetensors #resnet #image-classification #autotrain #dataset-footballer-recognition-gray-nobg/autotrain-data #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\n- Problem type: Image Classification## Validation Metricsg\nloss: 6.152068138122559\n\nf1_macro: 0.002214415118096559\n\nf1_micro: 0.012527101903155865\n\nf1_weighted: 0.0022165489799304268\n\nprecision_macro: 0.0015895320987927826\n\nprecision_micro: 0.012527101903155867\n\nprecision_weighted: 0.0015910638088373914\n\nrecall_macro: 0.012515042117930204\n\nrecall_micro: 0.012527101903155867\n\nrecall_weighted: 0.012527101903155867\n\naccuracy: 0.012527101903155867"
] | [
-0.11926262825727463,
0.15352173149585724,
-0.0007494374876841903,
0.10435070842504501,
0.11802400648593903,
0.0028350779321044683,
0.027774149551987648,
0.13615426421165466,
-0.05491797253489494,
0.14655369520187378,
0.17272546887397766,
0.048181790858507156,
0.08880066126585007,
0.14214077591896057,
-0.10391736775636673,
-0.0631241723895073,
-0.0014287169324234128,
-0.007047342136502266,
0.09414560347795486,
0.07063918560743332,
0.03664549067616463,
-0.137830913066864,
0.11067158728837967,
-0.0420776829123497,
-0.25487563014030457,
0.04233958572149277,
0.12061885744333267,
-0.0857425406575203,
0.04055436700582504,
0.10479643940925598,
0.08112148940563202,
-0.04644691199064255,
0.07825970649719238,
-0.09045521914958954,
-0.021439673379063606,
0.027201039716601372,
0.010101890191435814,
0.022484658285975456,
0.14151206612586975,
0.04380722716450691,
-0.017192212864756584,
-0.052160054445266724,
0.06468109786510468,
0.0521644726395607,
-0.09918048232793808,
-0.03300067409873009,
-0.09244187921285629,
0.03427111729979515,
0.055027157068252563,
0.11238200217485428,
-0.009833232499659061,
0.1763858199119568,
-0.025398537516593933,
0.04915696382522583,
0.09234610944986343,
-0.21661558747291565,
-0.09505809843540192,
0.1643671840429306,
-0.0964079350233078,
-0.013956223614513874,
-0.08384405076503754,
0.06452139467000961,
0.103329136967659,
0.028195613995194435,
-0.003474080702289939,
0.03091968595981598,
0.03349985182285309,
-0.0036301633808761835,
-0.057314224541187286,
-0.07114402204751968,
0.16490378975868225,
0.027866728603839874,
-0.08943653851747513,
-0.12061593681573868,
-0.028881428763270378,
-0.16278581321239471,
-0.09312748908996582,
-0.032494403421878815,
0.015488586388528347,
-0.07789571583271027,
-0.13013654947280884,
0.12454601377248764,
-0.024246130138635635,
-0.07394994795322418,
-0.19479435682296753,
0.08068185299634933,
-0.03093043901026249,
0.025507325306534767,
0.012732927687466145,
0.053522318601608276,
-0.04911559447646141,
-0.08081135898828506,
-0.0070114051923155785,
0.02396368607878685,
-0.1358713060617447,
-0.10540608316659927,
-0.00970485806465149,
0.0003320577379781753,
0.05678054690361023,
0.13781970739364624,
0.043159086257219315,
0.0955909788608551,
-0.033951446413993835,
0.010595977306365967,
-0.10413219779729843,
0.1646343171596527,
-0.11868775635957718,
-0.10947703570127487,
0.02193853259086609,
0.018357638269662857,
0.05254125967621803,
-0.015186048112809658,
-0.05424917861819267,
-0.06748315691947937,
0.0958704948425293,
0.01871221698820591,
-0.001189804170280695,
0.022829005494713783,
-0.08168163895606995,
-0.015308220870792866,
-0.024944184347987175,
-0.07230229675769806,
0.04722427949309349,
-0.03352999687194824,
-0.16390705108642578,
-0.03607397899031639,
0.030167503282427788,
0.013196749612689018,
-0.024377601221203804,
0.06641997396945953,
-0.13280846178531647,
-0.0008828914724290371,
-0.03850251063704491,
-0.0823102816939354,
0.06322815269231796,
-0.08416427671909332,
0.020001009106636047,
-0.12714211642742157,
-0.1946251094341278,
-0.04753841087222099,
-0.05229031294584274,
-0.12952718138694763,
-0.016667675226926804,
-0.08835723996162415,
-0.12279067933559418,
0.04818008467555046,
0.05260271951556206,
0.07806575298309326,
-0.036048851907253265,
0.04215104132890701,
0.024071764200925827,
0.1132206991314888,
-0.06261687725782394,
0.02270568162202835,
-0.09525611251592636,
-0.018904445692896843,
-0.19087381660938263,
0.07491311430931091,
-0.0226608719676733,
0.032318949699401855,
-0.10004352033138275,
0.010326250456273556,
-0.03222301974892616,
-0.015495258383452892,
0.06184457615017891,
0.12923014163970947,
-0.1686869114637375,
-0.059285346418619156,
0.09065931290388107,
-0.07935226708650589,
-0.12163033336400986,
0.14002221822738647,
0.016344290226697922,
-0.02360193058848381,
0.07959797978401184,
0.12970013916492462,
0.05649195984005928,
-0.12338034808635712,
-0.039451759308576584,
-0.06634709984064102,
-0.06530401855707169,
-0.0165792815387249,
0.05497799813747406,
-0.009173925034701824,
-0.061986058950424194,
-0.014150159433484077,
0.030674604699015617,
0.06926627457141876,
-0.08932173997163773,
-0.06545929610729218,
-0.027926063165068626,
-0.09821318089962006,
0.04749378189444542,
0.04935922846198082,
0.027942175045609474,
-0.11024564504623413,
-0.01852615550160408,
0.025666162371635437,
0.07452987879514694,
0.02451491542160511,
-0.08444347232580185,
-0.14123409986495972,
0.07319973409175873,
-0.1840820610523224,
-0.034932829439640045,
-0.17757859826087952,
-0.1250104308128357,
0.016404734924435616,
-0.0039744023233652115,
0.006001650355756283,
-0.11404751986265182,
0.10865563154220581,
0.07220770418643951,
-0.0382959246635437,
-0.02951987273991108,
0.052894387394189835,
-0.0064292410388588905,
-0.08292806893587112,
-0.09872325509786606,
0.02504388429224491,
0.03822702914476395,
0.19847610592842102,
-0.2260059267282486,
-0.020749781280755997,
0.036885518580675125,
0.1356787383556366,
0.04035315662622452,
-0.06420587748289108,
-0.04882853105664253,
0.054576385766267776,
0.0010085878893733025,
-0.07985720038414001,
0.06208448484539986,
-0.025012332946062088,
-0.08333584666252136,
-0.02195087820291519,
-0.19579632580280304,
0.31205564737319946,
0.15460872650146484,
0.04825758561491966,
-0.11694253981113434,
-0.006936138495802879,
0.030078431591391563,
-0.06998693197965622,
-0.09888030588626862,
-0.0019702105782926083,
0.059186119586229324,
0.003479479579254985,
0.12278953939676285,
-0.08096090704202652,
-0.017603060230612755,
0.07683423161506653,
0.0003730065072886646,
-0.065283864736557,
0.12039308249950409,
-0.04438973590731621,
-0.17728698253631592,
0.12438582628965378,
0.033686015754938126,
-0.053176991641521454,
0.15795141458511353,
0.008565455675125122,
-0.048100218176841736,
-0.05069158226251602,
0.013301380909979343,
0.010569954290986061,
0.07828066498041153,
0.04009838402271271,
0.03412754461169243,
0.0034867674112319946,
-0.01855887845158577,
-0.035758309066295624,
-0.07710438966751099,
-0.021419664844870567,
0.022233249619603157,
0.0023350107949227095,
-0.015677031129598618,
0.005158529616892338,
0.0661979615688324,
0.17589402198791504,
0.01365508884191513,
-0.06840827316045761,
0.05842645838856697,
-0.016563216224312782,
-0.08682440966367722,
0.16428358852863312,
-0.07859733700752258,
-0.13301905989646912,
-0.1758480817079544,
-0.050046250224113464,
-0.14346003532409668,
0.014752931892871857,
-0.010041119530797005,
-0.07676775753498077,
-0.05278748646378517,
-0.0838426947593689,
-0.04474041983485222,
0.0243160929530859,
-0.04353456571698189,
0.08085377514362335,
-0.042356595396995544,
0.07437071204185486,
-0.044754546135663986,
-0.05480461195111275,
-0.00827921461313963,
-0.0770590752363205,
0.0708293467760086,
-0.005006176419556141,
0.16834843158721924,
0.1463755965232849,
-0.10227906703948975,
0.046536851674318314,
-0.058518245816230774,
0.12303910404443741,
-0.04243655875325203,
0.0007313107489608228,
0.15845267474651337,
0.053028468042612076,
0.03982369229197502,
0.10163827240467072,
0.01183367520570755,
-0.0823001116514206,
0.04610102251172066,
0.055165745317935944,
-0.015876512974500656,
-0.07170966267585754,
-0.13875947892665863,
0.03378494828939438,
0.016381725668907166,
0.1775059849023819,
0.01651235856115818,
0.10724585503339767,
0.07736071944236755,
0.007431474514305592,
0.06577347218990326,
0.010343408212065697,
0.069668710231781,
0.10239794105291367,
0.04199115186929703,
0.14044642448425293,
-0.06787800043821335,
-0.03626696392893791,
0.07418116927146912,
-0.05298619344830513,
0.07882031053304672,
0.03142891824245453,
0.01977537013590336,
-0.031993407756090164,
0.07366672903299332,
0.05333153158426285,
0.10455681383609772,
0.09726805239915848,
-0.0879686251282692,
0.018143054097890854,
-0.09220131486654282,
-0.0708526223897934,
0.016553357243537903,
0.021442152559757233,
0.1311812698841095,
-0.13022425770759583,
0.06345941126346588,
0.04750443249940872,
0.12346435338258743,
0.08006279170513153,
-0.5008777379989624,
-0.07922670990228653,
0.009755046106874943,
0.04949476569890976,
-0.12839798629283905,
0.020926518365740776,
0.09385852515697479,
-0.12501250207424164,
0.046971358358860016,
-0.025387266650795937,
0.07278995215892792,
0.005504793953150511,
0.006682251114398241,
-0.00812680833041668,
0.07995946705341339,
-0.044166576117277145,
0.016096744686365128,
-0.18492135405540466,
0.12523595988750458,
0.055731385946273804,
0.08983040601015091,
-0.10446471720933914,
-0.018229631707072258,
0.07182903587818146,
-0.03724038600921631,
0.16662119328975677,
-0.003928388934582472,
-0.11891815066337585,
-0.4258517622947693,
-0.04414060711860657,
-0.02864113077521324,
0.04962887987494469,
0.03899548947811127,
0.09273020178079605,
-0.045275263488292694,
0.014395145699381828,
0.022857917472720146,
-0.0538829043507576,
-0.1041179671883583,
-0.019046418368816376,
-0.04837219789624214,
0.16215260326862335,
0.006090022157877684,
-0.04875864461064339,
-0.07206074893474579,
-0.07881723344326019,
0.0248656515032053,
-0.047089893370866776,
-0.051858168095350266,
-0.13715486228466034,
0.1438932567834854,
0.09601178020238876,
-0.0653933510184288,
0.0731857642531395,
0.006090419366955757,
0.15932665765285492,
0.0012371176853775978,
-0.07653886079788208,
0.0768962875008583,
-0.0832095742225647,
-0.06753239035606384,
-0.026354948058724403,
0.04596472904086113,
0.04663100838661194,
0.026612674817442894,
0.05004916712641716,
0.09569848328828812,
0.021091308444738388,
-0.08076333999633789,
0.11417456716299057,
0.026616867631673813,
0.13538019359111786,
0.11829681694507599,
0.012356703169643879,
-0.15950457751750946,
-0.02956560254096985,
0.0420679897069931,
0.06054939702153206,
0.2591620683670044,
-0.11011895537376404,
-0.037164125591516495,
-0.01029678899794817,
-0.034330904483795166,
-0.27773329615592957,
0.05300375446677208,
0.028551284223794937,
0.04768487438559532,
0.036234185099601746,
0.016685200855135918,
0.16131217777729034,
0.1610404998064041,
0.008375310339033604,
0.0703083798289299,
-0.26521968841552734,
-0.09115102887153625,
0.2092646360397339,
0.12673868238925934,
0.10134381800889969,
-0.1039653941988945,
-0.05170422047376633,
-0.12566791474819183,
-0.1458236277103424,
0.01248141098767519,
0.02090659737586975,
0.058523211628198624,
-0.05538100749254227,
0.012647207826375961,
0.059556879103183746,
-0.05701613798737526,
0.12287529557943344,
0.0018363569397479296,
0.10979124903678894,
-0.08521008491516113,
-0.06959111243486404,
0.024416683241724968,
-0.06459388881921768,
0.10773611068725586,
0.1131012886762619,
0.03721143305301666,
-0.08803398907184601,
0.020272595807909966,
-0.03122587874531746,
0.04466532915830612,
-0.03257785364985466,
0.018106134608387947,
-0.050770118832588196,
0.05448387190699577,
-0.043677058070898056,
-0.034967001527547836,
-0.04773653298616409,
-0.031098276376724243,
0.1233963817358017,
0.18270817399024963,
0.0744592696428299,
-0.029372844845056534,
-0.05226927623152733,
0.03619072213768959,
-0.07735610008239746,
0.051637202501297,
0.0020763904321938753,
0.06818472594022751,
0.15643085539340973,
0.0445183701813221,
0.11155903339385986,
0.05881085246801376,
-0.031251899898052216,
0.021481845527887344,
0.040752630680799484,
-0.05342325195670128,
0.0441121831536293,
-0.04317224770784378,
-0.013271702453494072,
-0.01636759750545025,
-0.040140118449926376,
0.05984289199113846,
-0.059323620051145554,
0.024826930835843086,
0.004327773582190275,
0.0028507730457931757,
0.019824273884296417,
0.24536454677581787,
-0.030648022890090942,
0.047043439000844955,
-0.11470120400190353,
0.17521832883358002,
0.0162370502948761,
-0.14835616946220398,
0.021782878786325455,
-0.07573996484279633,
-0.09711658209562302,
-0.003774139564484358,
0.009483217261731625,
0.185065358877182,
-0.1461300402879715,
-0.06761261075735092,
-0.07503871619701385,
-0.169618159532547,
0.10384057462215424,
0.2701354920864105,
0.06453435122966766,
0.0127957072108984,
-0.0018328105797991157,
-0.05511970818042755,
-0.07426180690526962,
0.09467091411352158,
0.12951324880123138,
0.05077031999826431,
-0.1275746375322342,
0.11127721518278122,
-0.027171965688467026,
-0.04383786395192146,
-0.029856683686375618,
0.01597166433930397,
-0.14162805676460266,
0.003636162029579282,
-0.06489000469446182,
0.06702469289302826,
-0.02372884564101696,
-0.005920625291764736,
-0.018582915887236595,
0.014032218605279922,
-0.04246169328689575,
0.009067419916391373,
-0.04550056904554367,
-0.026972077786922455,
0.04868003726005554,
0.08757012337446213,
-0.11305386573076248,
-0.08028807491064072,
0.03129531815648079,
-0.010097759775817394,
0.010834554210305214,
0.0794258788228035,
0.0909615010023117,
-0.009023865684866905,
-0.06280577927827835,
-0.031158503144979477,
0.04538678750395775,
-0.07671112567186356,
0.015758909285068512,
-0.16392475366592407,
0.04862074553966522,
0.039321403950452805,
-0.03583896532654762,
0.08349817246198654,
0.037076685577631,
-0.08995328098535538,
-0.017507823184132576,
-0.07491153478622437,
-0.03208523243665695,
-0.1539747416973114,
0.05488735809922218,
0.22160933911800385,
0.020328203216195107,
0.13046512007713318,
-0.11398346722126007,
-0.014025233685970306,
-0.2245500683784485,
-0.006629541050642729,
-0.042077645659446716,
-0.08522375673055649,
-0.1133776605129242,
0.002729272935539484,
0.07580549269914627,
0.003237989265471697,
0.05447278544306755,
-0.035658132284879684,
0.017635662108659744,
0.035093795508146286,
0.1369021236896515,
-0.07100057601928711,
0.01304745301604271,
0.2089947760105133,
0.07187865674495697,
-0.05760688707232475,
0.10367805510759354,
0.08618421107530594,
0.10473427921533585,
0.021836236119270325,
0.03681538626551628,
0.0739985778927803,
-0.1103634461760521,
0.11148737370967865,
0.006995390169322491,
-0.027738245204091072,
-0.008890621364116669,
0.1988750398159027,
-0.09908080101013184,
-0.007012472953647375,
-0.05218058452010155,
0.02353939414024353,
0.18798290193080902,
-0.14821693301200867,
-0.009159586392343044,
-0.092669777572155,
-0.0315948985517025,
-0.10733260214328766,
-0.10295385867357254,
-0.12661850452423096,
-0.09171321243047714,
0.002314081182703376,
-0.09997306019067764,
0.06744963675737381,
0.15509890019893646,
0.02327948808670044,
-0.017445052042603493,
0.18643735349178314,
-0.21996180713176727,
-0.060089971870183945,
0.04392474517226219,
-0.037947557866573334,
-0.043276384472846985,
-0.05929107218980789,
-0.0689840093255043,
0.08073583245277405,
-0.0024250412825495005,
0.05207650735974312,
0.0040947808884084225,
0.04184466227889061,
0.013261771760880947,
-0.011081816628575325,
-0.09468403458595276,
-0.009230917319655418,
0.029636694118380547,
0.023655708879232407,
-0.01723416894674301,
-0.000798065448179841,
0.04566635191440582,
-0.037500541657209396,
0.18288463354110718,
-0.10345717519521713,
-0.009467982687056065,
-0.06295666098594666,
0.24091604351997375,
0.020071152597665787,
0.07611094415187836,
0.014213436283171177,
-0.06079848110675812,
0.10557848960161209,
0.11576741933822632,
0.03352520614862442,
-0.03867825120687485,
-0.06615331023931503,
-0.04190982133150101,
-0.021283484995365143,
-0.04789380729198456,
0.057809412479400635,
0.02223287709057331,
-0.004689383786171675,
-0.08533576130867004,
0.08990184962749481,
-0.021388737484812737,
-0.01265077292919159,
0.03239896893501282,
0.05759083852171898,
0.016092509031295776,
0.06208060681819916,
-0.07638105750083923,
0.07950318604707718,
-0.0019029868999496102,
0.0304196085780859,
0.17155146598815918,
-0.14190857112407684,
-0.11918436735868454,
0.05966753140091896,
-0.021786672994494438,
-0.030696699395775795,
0.103970006108284,
-0.07612119615077972,
0.04484852775931358,
-0.05650268495082855,
-0.031242867931723595,
-0.15131431818008423,
-0.14566464722156525,
-0.0402044914662838,
0.1504683494567871,
0.2599024176597595,
-0.011589989066123962,
0.09705811738967896,
0.10335779935121536,
-0.011909143067896366,
-0.14023888111114502,
0.10390312969684601,
-0.047872498631477356,
-0.0688905417919159,
0.07346836477518082,
0.10211970657110214,
0.0035954758059233427,
0.19669656455516815,
0.03747044503688812,
-0.1785760074853897,
0.018482685089111328,
-0.06056728586554527,
-0.05182884261012077,
-0.07207000255584717,
0.04406458139419556,
-0.039409246295690536,
0.16210563480854034,
0.10107839852571487,
-0.02294563129544258,
0.015401358716189861,
-0.02812272496521473,
0.03697849065065384,
0.04750664532184601,
0.08714547008275986,
0.02532697096467018,
-0.10293688625097275,
0.06401792913675308,
-0.11325743049383163,
-0.0393945686519146,
-0.3306833505630493,
-0.04067313298583031,
-0.022317567840218544,
-0.08371372520923615,
-0.02048519439995289,
0.10261785238981247,
0.08228223770856857,
0.03566087782382965,
-0.053987886756658554,
-0.2305508404970169,
0.09516432136297226,
0.18883052468299866,
-0.026910701766610146,
-0.09659663587808609
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Small TR - tgrhn
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 11.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2341
- Wer: 75.4885
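
A minimal transcription sketch (not part of the original card): it assumes the `transformers` automatic-speech-recognition pipeline and the repo id from this card's metadata; the audio file name is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned Whisper checkpoint (repo id taken from this card's metadata)
asr = pipeline(
    "automatic-speech-recognition",
    model="tgrhn/whisper-small-tr-trial2",
)

# "sample_tr.wav" is a placeholder; pass any Turkish speech recording
result = asr(
    "sample_tr.wav",
    generate_kwargs={"language": "turkish", "task": "transcribe"},
)
print(result["text"])
```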
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
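
The hyperparameters above map roughly onto the standard `Seq2SeqTrainingArguments` used in the common Whisper fine-tuning recipe; the sketch below is an illustrative reconstruction, not the exact script used for this run.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative reconstruction of the settings listed above
# (the actual training script is not shown in this card)
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-tr-trial2",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    fp16=True,  # "Native AMP" mixed precision
)
```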
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.2083 | 0.44 | 1000 | 0.2763 | 43.3933 |
| 0.1758 | 0.89 | 2000 | 0.2537 | 44.1598 |
| 0.0904 | 1.33 | 3000 | 0.2420 | 72.1333 |
| 0.0804 | 1.77 | 4000 | 0.2341 | 75.4885 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.1
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"language": ["tr"], "license": "apache-2.0", "tags": ["hf-asr-leaderboard", "generated_from_trainer"], "datasets": ["mozilla-foundation/common_voice_11_0"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "Whisper Small TR - tgrhn", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 11.0", "type": "mozilla-foundation/common_voice_11_0", "config": "tr", "split": "None", "args": "config: tr, split: test"}, "metrics": [{"type": "wer", "value": 75.48854087830627, "name": "Wer"}]}]}]} | automatic-speech-recognition | tgrhn/whisper-small-tr-trial2 | [
"transformers",
"tensorboard",
"whisper",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"generated_from_trainer",
"tr",
"dataset:mozilla-foundation/common_voice_11_0",
"base_model:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:28:00+00:00 | [] | [
"tr"
] | TAGS
#transformers #tensorboard #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Whisper Small TR - tgrhn
========================
This model is a fine-tuned version of openai/whisper-small on the Common Voice 11.0 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2341
* Wer: 75.4885
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 4000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.1
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #tensorboard #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
99,
130,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.09610486030578613,
0.08789045363664627,
-0.003484320128336549,
0.05018914118409157,
0.09998302906751633,
0.006606458220630884,
0.1331280916929245,
0.14324261248111725,
-0.07609260082244873,
0.06392505764961243,
0.0761401355266571,
0.07326967269182205,
0.07807910442352295,
0.11421258002519608,
-0.024476828053593636,
-0.2971479594707489,
0.014405602589249611,
-0.004547130316495895,
-0.10455798357725143,
0.09654242545366287,
0.10925543308258057,
-0.09632984548807144,
0.029919778928160667,
0.016062933951616287,
-0.09235934913158417,
0.011279075406491756,
-0.009571564383804798,
-0.05525914952158928,
0.1056365817785263,
0.055740948766469955,
0.06302295625209808,
0.02584935910999775,
0.08004581183195114,
-0.25339293479919434,
0.021210748702287674,
0.03931562602519989,
0.0436822772026062,
0.058641884475946426,
0.07134399563074112,
-0.012276621535420418,
0.09092313796281815,
-0.052699897438287735,
0.06268703192472458,
0.061804406344890594,
-0.09346689283847809,
-0.3021410405635834,
-0.07685196399688721,
0.047903262078762054,
0.11059465259313583,
0.0716119334101677,
-0.0347091369330883,
0.0697314590215683,
-0.05453689023852348,
0.10068607330322266,
0.19614674150943756,
-0.21525931358337402,
-0.07116004824638367,
-0.04798494279384613,
0.053436826914548874,
0.06509272754192352,
-0.10282094031572342,
-0.009435099549591541,
0.025537047535181046,
0.03374109044671059,
0.08434393256902695,
-0.00512760691344738,
-0.006430389825254679,
-0.005591833498328924,
-0.13450534641742706,
-0.033541787415742874,
0.13942629098892212,
0.07098962366580963,
-0.025284631177783012,
-0.11102931201457977,
-0.02143017016351223,
-0.11668137460947037,
-0.05760179087519646,
0.0074830688536167145,
0.02279190719127655,
-0.024785218760371208,
-0.06334173679351807,
-0.025939175859093666,
-0.0744626522064209,
-0.08099359273910522,
0.025649726390838623,
0.1558612883090973,
0.03559396415948868,
-0.03799205645918846,
-0.012086219154298306,
0.08657055348157883,
0.05458413064479828,
-0.14623039960861206,
-0.0014317564200609922,
0.037104420363903046,
-0.07828203588724136,
-0.018598487600684166,
-0.030690085142850876,
-0.054262615740299225,
0.03706071153283119,
0.12418846040964127,
0.013820639811456203,
0.09401243925094604,
-0.0018987677758559585,
0.031400442123413086,
-0.09674283117055893,
0.16485074162483215,
-0.0508243590593338,
-0.0536077618598938,
-0.0280402023345232,
0.13375556468963623,
0.011095061898231506,
-0.017928777262568474,
-0.07892753928899765,
0.027120571583509445,
0.0766131579875946,
0.03893015533685684,
-0.0020921628456562757,
0.02827141433954239,
-0.06647203117609024,
-0.027508795261383057,
-0.05736394599080086,
-0.12526893615722656,
0.025727955624461174,
0.03811042383313179,
-0.05011896789073944,
-0.015875056385993958,
0.011015736497938633,
0.04015681520104408,
-0.018836818635463715,
0.04220164194703102,
-0.04695271700620651,
0.006021544802933931,
-0.06991442292928696,
-0.09787730872631073,
0.021446147933602333,
-0.004500801209360361,
0.006877456791698933,
-0.0745033547282219,
-0.09169110655784607,
-0.054442986845970154,
0.05808540806174278,
-0.039571262896060944,
-0.05982166901230812,
-0.08030958473682404,
-0.07185573130846024,
0.04355351999402046,
-0.012023864313960075,
0.14199893176555634,
-0.04736843705177307,
0.09569680690765381,
0.027086621150374413,
0.04883376881480217,
0.03995691239833832,
0.06458166241645813,
-0.03155610337853432,
0.04277753829956055,
-0.13547617197036743,
0.1164151132106781,
-0.11155690252780914,
0.06288114190101624,
-0.1339917927980423,
-0.09125129133462906,
-0.005393651779741049,
0.006771144922822714,
0.09363380819559097,
0.11339586973190308,
-0.18751941621303558,
-0.1042136549949646,
0.16375243663787842,
-0.07849624007940292,
-0.08730995655059814,
0.15996503829956055,
-0.026669370010495186,
0.02960135042667389,
0.04819219186902046,
0.20795409381389618,
0.1355263888835907,
-0.07266681641340256,
0.03393262252211571,
-0.03572206199169159,
0.1072852835059166,
0.06137793883681297,
0.08815839886665344,
-0.050651684403419495,
-0.00057045096764341,
0.0007508559501729906,
-0.03122672252357006,
0.0825234055519104,
-0.07329138368368149,
-0.09439987689256668,
-0.015384799800813198,
-0.08189137279987335,
0.030348028987646103,
0.05012930929660797,
0.026369569823145866,
-0.08610759675502777,
-0.10743673145771027,
0.03174480050802231,
0.10889499634504318,
-0.10887347906827927,
0.027889434248209,
-0.09674695879220963,
0.03796900808811188,
-0.008318296633660793,
-0.0051838718354702,
-0.13371700048446655,
0.029141811653971672,
0.028015071526169777,
-0.06681757420301437,
0.04811909794807434,
-0.020237158983945847,
0.09209326654672623,
0.035767536610364914,
-0.054655734449625015,
-0.05696507543325424,
-0.04212941229343414,
0.008611666969954967,
-0.0750589370727539,
-0.24055340886116028,
-0.06260254234075546,
-0.0307268388569355,
0.1955094188451767,
-0.21080631017684937,
0.0224158838391304,
0.04345399886369705,
0.13889163732528687,
0.03319050371646881,
-0.03218802437186241,
0.02225274033844471,
0.06154278293251991,
-0.00392611138522625,
-0.07224246859550476,
0.033904410898685455,
0.01046171598136425,
-0.16331537067890167,
0.01889701746404171,
-0.17041146755218506,
0.07274198532104492,
0.09109200537204742,
-0.013118795119225979,
-0.0651094987988472,
-0.051302362233400345,
-0.049386631697416306,
-0.06143399327993393,
-0.004881261847913265,
-0.022275730967521667,
0.21063245832920074,
-0.002302659209817648,
0.1175975501537323,
-0.07054296880960464,
-0.040966786444187164,
0.017342805862426758,
-0.009942453354597092,
-0.013112077489495277,
0.1515854299068451,
-0.007301309611648321,
-0.08486589044332504,
0.0874251276254654,
0.07035285979509354,
-0.07670432329177856,
0.18966060876846313,
-0.07680310308933258,
-0.08340104669332504,
-0.029295047745108604,
0.027713948860764503,
0.029257172718644142,
0.09451746195554733,
-0.16145087778568268,
-0.021327998489141464,
0.02191914990544319,
-0.0007999096415005624,
0.028847279027104378,
-0.1950313150882721,
0.0011845672270283103,
0.043410882353782654,
-0.07287479937076569,
-0.00721906078979373,
-0.000363020459190011,
-0.00526834512129426,
0.08032962679862976,
-0.0009646612452343106,
-0.0657634511590004,
0.0026608542539179325,
-0.040974464267492294,
-0.09279463440179825,
0.16396628320217133,
-0.10657322406768799,
-0.14950168132781982,
-0.13948984444141388,
-0.009628706611692905,
-0.009600820951163769,
-0.007778490893542767,
0.04787183552980423,
-0.11397246271371841,
-0.026097919791936874,
-0.07303489744663239,
0.04477508366107941,
-0.035704873502254486,
0.017666146159172058,
0.030702702701091766,
-0.0004991233581677079,
0.09152830392122269,
-0.10487569868564606,
0.01736394502222538,
-0.010418004356324673,
-0.013777988962829113,
-0.000014423424545384478,
0.016183897852897644,
0.0715319812297821,
0.1650349646806717,
0.04021540656685829,
0.02092662639915943,
-0.04084715619683266,
0.18394683301448822,
-0.12049367278814316,
-0.004460249096155167,
0.1263548880815506,
-0.024781079962849617,
0.03921952471137047,
0.1485130488872528,
0.04514968395233154,
-0.07201533019542694,
0.01953943632543087,
0.021402742713689804,
-0.015379486605525017,
-0.22574669122695923,
-0.0224838275462389,
-0.06873023509979248,
-0.03289186209440231,
0.08583342283964157,
0.027808448299765587,
-0.0005202070460654795,
0.02646040916442871,
-0.03064371459186077,
-0.003819557838141918,
0.04406675323843956,
0.0541888102889061,
0.10723631083965302,
0.01640131138265133,
0.10304631292819977,
-0.026111040264368057,
-0.04845264554023743,
0.02175268903374672,
0.010843138210475445,
0.20056264102458954,
0.02365078777074814,
0.18831273913383484,
0.0499265119433403,
0.12526953220367432,
-0.0005745196249336004,
0.022274237126111984,
0.01959780789911747,
-0.012121849693357944,
0.019384577870368958,
-0.05942931026220322,
-0.04391578212380409,
0.05276360362768173,
0.09863157570362091,
0.04575453698635101,
-0.09764786064624786,
0.010447700507938862,
0.021288428455591202,
0.34344756603240967,
0.050641290843486786,
-0.25419172644615173,
-0.10201475769281387,
0.03347606956958771,
-0.08831644803285599,
-0.027714785188436508,
0.02168741263449192,
0.14133432507514954,
-0.07725265622138977,
0.05717248469591141,
-0.06666659563779831,
0.08186984062194824,
-0.061868615448474884,
0.007795384153723717,
0.033484503626823425,
0.1088908240199089,
0.0024941754527390003,
0.05843184515833855,
-0.26542019844055176,
0.2929925322532654,
0.0006177847390063107,
0.08682110905647278,
-0.035288672894239426,
0.03521748259663582,
0.04975248500704765,
-0.03082028590142727,
0.06179547682404518,
-0.004893685225397348,
-0.1224639043211937,
-0.16890637576580048,
-0.07535456120967865,
0.019520524889230728,
0.12834419310092926,
-0.04128861799836159,
0.10573608428239822,
-0.04586312547326088,
-0.014639689587056637,
0.0542362742125988,
-0.06648033857345581,
-0.1161765605211258,
-0.10800155252218246,
0.032086241990327835,
0.061872027814388275,
0.056925609707832336,
-0.11093208938837051,
-0.1055803894996643,
-0.06261620670557022,
0.12011315673589706,
-0.11508350074291229,
-0.019030068069696426,
-0.11958912014961243,
0.05811852216720581,
0.14347365498542786,
-0.06675738841295242,
0.027287742123007774,
0.028226619586348534,
0.11555430293083191,
0.019736753776669502,
-0.0199707243591547,
0.09976988285779953,
-0.07557876408100128,
-0.19471177458763123,
-0.03768211230635643,
0.1893354207277298,
0.043842826038599014,
0.08369920402765274,
-0.01117004081606865,
0.016409991309046745,
-0.006584479007869959,
-0.05376984179019928,
0.072263203561306,
0.0674867033958435,
-0.03709273040294647,
0.06487462669610977,
-0.03622768446803093,
-0.015521319583058357,
-0.09140180051326752,
-0.06562623381614685,
0.1554122269153595,
0.2637213468551636,
-0.07145162671804428,
0.06256476044654846,
0.060328077524900436,
-0.07898593693971634,
-0.1534118503332138,
0.007506567519158125,
0.11554920673370361,
0.049167875200510025,
0.004727997817099094,
-0.2059725522994995,
0.019008731469511986,
0.043541911989450455,
-0.020315399393439293,
0.04221256449818611,
-0.3258070647716522,
-0.1389627903699875,
0.12920241057872772,
0.07644244283437729,
-0.012292583473026752,
-0.13940122723579407,
-0.06143256276845932,
-0.024519098922610283,
-0.0709250345826149,
0.014418714679777622,
-0.05556013435125351,
0.14084576070308685,
0.011602451093494892,
0.05836522951722145,
0.029022255912423134,
-0.04343458265066147,
0.1473856121301651,
-0.029350286349654198,
0.053278736770153046,
-0.02691994607448578,
0.02794025093317032,
-0.004183025099337101,
-0.053660497069358826,
0.012771078385412693,
-0.08513117581605911,
0.018464243039488792,
-0.11320962011814117,
-0.030214371159672737,
-0.08149667829275131,
0.02272561565041542,
-0.027353255078196526,
-0.03434378281235695,
-0.011124873533844948,
0.04415178298950195,
0.07850012183189392,
0.014746480621397495,
0.10427888482809067,
-0.08746910840272903,
0.15755634009838104,
0.10363388806581497,
0.16761192679405212,
-0.014657584019005299,
-0.07536893337965012,
-0.016665752977132797,
-0.010840283706784248,
0.05257144570350647,
-0.10350573807954788,
0.051119960844516754,
0.13496270775794983,
0.039293210953474045,
0.16349488496780396,
0.0487331748008728,
-0.08932944387197495,
0.025030750781297684,
0.05416114255785942,
-0.07325414568185806,
-0.2031760960817337,
-0.03688257187604904,
0.05370241403579712,
-0.15317325294017792,
-0.00226681143976748,
0.1304289996623993,
-0.04946761950850487,
-0.011760004796087742,
0.01050412654876709,
0.031090131029486656,
-0.0606224350631237,
0.21657061576843262,
0.03483283147215843,
0.078209787607193,
-0.09112297743558884,
0.07827823609113693,
0.03664668649435043,
-0.11069247871637344,
0.05814403295516968,
0.09824696183204651,
-0.03119742125272751,
-0.026162510737776756,
0.008931420743465424,
0.09548944234848022,
0.05338745936751366,
-0.04330214485526085,
-0.1274440586566925,
-0.14670981466770172,
0.058992382138967514,
0.11301780492067337,
0.029094111174345016,
0.018366079777479172,
-0.04102906212210655,
0.034463636577129364,
-0.08575458824634552,
0.10221800953149796,
0.08250130712985992,
0.046259861439466476,
-0.13398657739162445,
0.160525843501091,
0.00014587095938622952,
0.00035262113669887185,
-0.00808032788336277,
-0.016923105344176292,
-0.10198985040187836,
0.026320111006498337,
-0.1296781301498413,
-0.01915046200156212,
-0.03896838799118996,
0.013359349220991135,
0.01454878132790327,
-0.0682779848575592,
-0.05304152891039848,
0.034464750438928604,
-0.1220151036977768,
-0.04004550352692604,
-0.005529175046831369,
0.07588419318199158,
-0.06898798048496246,
-0.032167304307222366,
0.04453333467245102,
-0.09887637197971344,
0.08399274945259094,
0.04653552547097206,
-0.012181475758552551,
0.025165701285004616,
-0.13143999874591827,
-0.010649149306118488,
0.02480616420507431,
0.008060013875365257,
0.014347091317176819,
-0.17168404161930084,
-0.031439006328582764,
-0.0216682106256485,
0.028585972264409065,
-0.005016442388296127,
0.055141884833574295,
-0.10226383060216904,
-0.03294378146529198,
-0.03502826765179634,
-0.05590909719467163,
-0.06460505723953247,
0.05607398971915245,
0.05438840016722679,
0.03768907114863396,
0.1489873081445694,
-0.10380081832408905,
0.05733921006321907,
-0.2145691215991974,
0.011884372681379318,
-0.023675929754972458,
-0.08191234618425369,
-0.0629524365067482,
-0.04325210303068161,
0.08865512907505035,
-0.06111437827348709,
0.0752260833978653,
-0.04863188788294792,
0.025136245414614677,
0.03321186825633049,
-0.11052850633859634,
0.047248292714357376,
0.047578148543834686,
0.2088700234889984,
0.04008397459983826,
-0.027682363986968994,
0.07953482866287231,
-0.016356395557522774,
0.05326656997203827,
0.15722231566905975,
0.10719113051891327,
0.1682734340429306,
0.046144306659698486,
0.08733800053596497,
0.09807567298412323,
-0.08735164254903793,
-0.11948401480913162,
0.12626677751541138,
-0.019390128552913666,
0.13053369522094727,
-0.04005742073059082,
0.2169460654258728,
0.12508395314216614,
-0.16765308380126953,
0.06894508749246597,
-0.04868948087096214,
-0.08457835763692856,
-0.1024484783411026,
-0.08810193091630936,
-0.08187887072563171,
-0.16900037229061127,
0.0018533513648435473,
-0.10027889907360077,
0.04081053286790848,
0.03879188746213913,
0.035864755511283875,
0.02187742479145527,
0.12337464839220047,
0.030037550255656242,
-0.0028266003355383873,
0.11453485488891602,
-0.014200672507286072,
-0.016116982325911522,
-0.07446461170911789,
-0.11348643898963928,
0.08390001207590103,
-0.014555603265762329,
0.03597031533718109,
-0.044096190482378006,
-0.08400306850671768,
0.03840465471148491,
-0.00928172841668129,
-0.11441700160503387,
0.027601664885878563,
-0.010678465478122234,
0.07598239183425903,
0.06027037277817726,
0.04345560073852539,
-0.023270566016435623,
-0.014055891893804073,
0.22280849516391754,
-0.09495571255683899,
-0.08227711170911789,
-0.14056551456451416,
0.2135348916053772,
-0.025636352598667145,
-0.010599144734442234,
0.003401142545044422,
-0.06970655918121338,
0.002946474589407444,
0.17337338626384735,
0.15295220911502838,
-0.024379543960094452,
-0.007880893535912037,
0.002450644038617611,
-0.008638952858746052,
-0.05435631051659584,
0.06338296085596085,
0.12135772407054901,
0.002354918047785759,
-0.05253119394183159,
-0.020291462540626526,
-0.025610877200961113,
-0.05871042236685753,
-0.03035738877952099,
0.09002070873975754,
0.02250106818974018,
0.0013417936861515045,
-0.019030088558793068,
0.11114905774593353,
-0.05162413790822029,
-0.13505761325359344,
-0.01737452857196331,
-0.17367178201675415,
-0.16377875208854675,
-0.04334971681237221,
0.0731668770313263,
0.05333632975816727,
0.03652302548289299,
-0.006739363539963961,
-0.00780835235491395,
0.09775916486978531,
0.0013758894056081772,
-0.03897041827440262,
-0.09763261675834656,
0.09176790714263916,
-0.1524912416934967,
0.1636170744895935,
-0.04241855442523956,
0.028292134404182434,
0.12406791746616364,
0.06060418114066124,
-0.06252697110176086,
0.03551008179783821,
0.06626812368631363,
-0.1277836263179779,
0.04569413512945175,
0.2121536284685135,
-0.03586006537079811,
0.1603524088859558,
0.04194392263889313,
-0.10336305946111679,
0.020426174625754356,
-0.0978369191288948,
-0.07039818167686462,
-0.043181341141462326,
0.0008126780739985406,
-0.05665912479162216,
0.13692787289619446,
0.20882675051689148,
-0.07677379995584488,
-0.025670137256383896,
-0.057625528424978256,
-0.0053104860708117485,
0.059547752141952515,
0.09564445912837982,
-0.050615645945072174,
-0.2671702802181244,
-0.0000781123380875215,
-0.014217108488082886,
0.009858782403171062,
-0.21759073436260223,
-0.10517168045043945,
0.014582880772650242,
-0.050897832959890366,
-0.04910558834671974,
0.11255370825529099,
0.10442234575748444,
0.04561673104763031,
-0.04987512156367302,
-0.14368735253810883,
-0.020731007680296898,
0.17322127521038055,
-0.1676434725522995,
-0.042436666786670685
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set (a brief usage sketch appears after this list):
- Loss: 0.8559
- Bleu: 52.9173
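
Given the repository id on this card, a quick way to try the checkpoint is the plain `transformers` translation pipeline. The snippet below is an illustrative sketch rather than part of the original card; it assumes only that the model id `akuzdeuov/marian-finetuned-kde4-en-to-fr` resolves on the Hub, and the input sentence is an arbitrary example.

```python
# Minimal usage sketch (not from the original card): load the published
# checkpoint and translate one English sentence into French.
from transformers import pipeline

translator = pipeline(
    "translation",
    model="akuzdeuov/marian-finetuned-kde4-en-to-fr",  # id taken from this card
)

result = translator("Default to expanded threads")
print(result[0]["translation_text"])
```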
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a matching `Seq2SeqTrainingArguments` sketch appears after this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
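
For reference, the values above map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is an assumption about how the run could have been configured, not the actual training script; the `output_dir` is invented for illustration, and the scheduler line simply makes the default explicit to match the card.

```python
# Hedged sketch of the hyperparameters above expressed as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="marian-finetuned-kde4-en-to-fr",  # assumed directory name, not stated above
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",  # Trainer default, stated explicitly to match the card
    # The Adam betas=(0.9, 0.999) and epsilon=1e-08 listed above match the Trainer
    # defaults, so no extra optimizer argument is needed here.
)
```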
### Training results
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
| {"license": "apache-2.0", "tags": ["translation", "generated_from_trainer"], "datasets": ["kde4"], "metrics": ["bleu"], "base_model": "Helsinki-NLP/opus-mt-en-fr", "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "config": "en-fr", "split": "train", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 52.917266098420704, "name": "Bleu"}]}]}]} | translation | akuzdeuov/marian-finetuned-kde4-en-to-fr | [
"transformers",
"pytorch",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"base_model:Helsinki-NLP/opus-mt-en-fr",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:29:27+00:00 | [] | [] | TAGS
#transformers #pytorch #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8559
- Bleu: 52.9173
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
| [
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8559\n- Bleu: 52.9173",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3",
"### Training results",
"### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.14.5\n- Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8559\n- Bleu: 52.9173",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3",
"### Training results",
"### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.14.5\n- Tokenizers 0.13.3"
] | [
89,
73,
6,
12,
8,
3,
90,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8559\n- Bleu: 52.9173## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3### Training results### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.14.5\n- Tokenizers 0.13.3"
] | [
-0.13950586318969727,
0.12094132602214813,
-0.00348891899921,
0.09427111595869064,
0.10663501918315887,
0.010824562981724739,
0.07512333244085312,
0.14393794536590576,
-0.06023121625185013,
0.08737733215093613,
0.06587541848421097,
0.0802229642868042,
0.07375792413949966,
0.1651628166437149,
-0.0482148639857769,
-0.25360220670700073,
0.03454647585749626,
0.00943764578551054,
-0.1441529542207718,
0.07797516882419586,
0.1288132667541504,
-0.0711185410618782,
0.07205677032470703,
0.041500262916088104,
-0.11366491764783859,
0.036392129957675934,
0.0008911974146030843,
-0.06645779311656952,
0.07099463790655136,
0.05409564822912216,
0.07823923975229263,
0.013887036591768265,
0.08906209468841553,
-0.17419880628585815,
-0.00686242338269949,
0.05733228847384453,
0.02944977954030037,
0.06609628349542618,
0.11238063871860504,
0.025539815425872803,
0.16686105728149414,
-0.13792501389980316,
0.054370537400245667,
0.02469104900956154,
-0.03857862576842308,
-0.15897253155708313,
-0.09256921708583832,
0.09457776695489883,
0.09447517991065979,
0.10214102268218994,
-0.004870662000030279,
0.14310429990291595,
-0.07301092892885208,
0.0953192189335823,
0.16904118657112122,
-0.2815442681312561,
-0.06489305198192596,
0.06725626438856125,
0.038727663457393646,
-0.0028040495235472918,
-0.0767868310213089,
0.014096065424382687,
0.04169552028179169,
0.0207010880112648,
0.03946731984615326,
-0.046890873461961746,
-0.05850474163889885,
0.00037350578350014985,
-0.10628907382488251,
-0.015468576923012733,
0.21639013290405273,
0.05730970948934555,
-0.033887382596731186,
-0.07724092155694962,
-0.030762651935219765,
-0.0841732993721962,
0.003399994457140565,
-0.058409687131643295,
0.02701660431921482,
-0.05144478380680084,
-0.04275006800889969,
-0.04057967662811279,
-0.08008361607789993,
-0.048242080956697464,
0.013546370901167393,
0.11286046355962753,
0.04175883159041405,
0.033167239278554916,
-0.025200817734003067,
0.09746787697076797,
-0.047969818115234375,
-0.1303054839372635,
-0.04859370365738869,
-0.021916188299655914,
-0.08722010999917984,
-0.05108537897467613,
-0.04347981885075569,
-0.07110930234193802,
0.013136999681591988,
0.09578833729028702,
-0.07956212759017944,
0.05860146880149841,
0.062306974083185196,
-0.0033182657789438963,
-0.03497159853577614,
0.17778387665748596,
-0.05936240404844284,
-0.10060202330350876,
-0.002164974343031645,
0.058266349136829376,
-0.008392655290663242,
0.001032999251037836,
-0.06435192376375198,
-0.04972304776310921,
0.060793437063694,
0.051665280014276505,
-0.0520218126475811,
0.03924739360809326,
-0.019996117800474167,
-0.03083224967122078,
-0.005466711241751909,
-0.13746735453605652,
0.04331616684794426,
-0.008678260259330273,
-0.11935523897409439,
0.056079622358083725,
0.028797047212719917,
-0.03407743200659752,
-0.0853433609008789,
0.09801674634218216,
-0.0621950589120388,
-0.013521562330424786,
-0.07440298050642014,
-0.0730736032128334,
0.022089026868343353,
-0.07090147584676743,
0.00031587749253958464,
-0.0908198431134224,
-0.16001197695732117,
-0.04214814305305481,
0.06482649594545364,
-0.07271713018417358,
-0.0474228709936142,
-0.0458347350358963,
-0.08714403957128525,
0.031866416335105896,
-0.016400666907429695,
0.0771656483411789,
-0.028849471360445023,
0.029611533507704735,
0.002546873176470399,
0.03280264884233475,
0.033690858632326126,
0.028423964977264404,
-0.05559585243463516,
0.04209012910723686,
-0.09956368058919907,
0.08629775792360306,
-0.09318427741527557,
0.013754821382462978,
-0.11353907734155655,
-0.09804897010326385,
0.0076522259041666985,
-0.0010533200111240149,
0.09172411262989044,
0.1283646821975708,
-0.15420418977737427,
-0.034502655267715454,
0.11532130092382431,
-0.08682452142238617,
-0.08048728108406067,
0.0641428679227829,
-0.007008968852460384,
0.020932044833898544,
0.06525710225105286,
0.12955227494239807,
0.15620988607406616,
-0.09969843924045563,
-0.06646028161048889,
0.04006688669323921,
0.043083805590867996,
-0.026742184534668922,
0.0502476766705513,
0.004490857012569904,
0.014778679236769676,
0.05392982065677643,
-0.08763294667005539,
-0.009218706749379635,
-0.04313671961426735,
-0.07931257784366608,
-0.052405450493097305,
-0.06801503896713257,
-0.02077684924006462,
0.023699196055531502,
0.06816153973340988,
-0.056763675063848495,
-0.09496836364269257,
0.10350894927978516,
0.1513201892375946,
-0.04816611111164093,
0.019004270434379578,
-0.0770149752497673,
0.0846891924738884,
-0.04345053806900978,
-0.008091485127806664,
-0.18371300399303436,
-0.0675765797495842,
0.03173689916729927,
-0.1371457427740097,
0.006290715653449297,
0.05556531623005867,
0.07202755659818649,
0.05560649186372757,
-0.03704891353845596,
-0.025420790538191795,
-0.07438848912715912,
-0.018098268657922745,
-0.0710877776145935,
-0.1678406149148941,
-0.04216494411230087,
-0.014100692234933376,
0.10874468833208084,
-0.22291338443756104,
-0.003981025889515877,
0.046596795320510864,
0.17971040308475494,
0.01239006593823433,
-0.02157573774456978,
-0.0221028383821249,
0.024860715493559837,
-0.022382229566574097,
-0.08762257546186447,
0.0197247713804245,
0.004481069277971983,
-0.07378839701414108,
0.020400913432240486,
-0.097823865711689,
0.021681135520339012,
0.07099959254264832,
0.09125236421823502,
-0.09600832313299179,
0.0021305803675204515,
-0.05025273188948631,
-0.03827054426074028,
-0.04177122563123703,
0.0006730183959007263,
0.14202512800693512,
0.012205732055008411,
0.1266113668680191,
-0.0669354572892189,
-0.07110976427793503,
0.017794974148273468,
-0.005404864903539419,
-0.044912587851285934,
0.11101741343736649,
0.04201705381274223,
-0.137058287858963,
0.07247577607631683,
0.043291982263326645,
-0.08876662701368332,
0.19864867627620697,
-0.04490003362298012,
-0.10402368009090424,
-0.053525470197200775,
0.013309245929121971,
0.020223848521709442,
0.15128308534622192,
-0.04553106427192688,
0.011045967228710651,
0.03636997193098068,
0.04692257568240166,
0.0528864860534668,
-0.13105224072933197,
0.012435765005648136,
0.027789568528532982,
-0.04630383849143982,
0.023180944845080376,
0.026492644101381302,
0.011314757168293,
0.08122986555099487,
0.002687410218641162,
-0.041688308119773865,
-0.002608547918498516,
-0.017910519614815712,
-0.05744005739688873,
0.18943600356578827,
-0.12024448066949844,
-0.18678505718708038,
-0.1854090690612793,
0.09065081179141998,
-0.11501411348581314,
-0.04084805026650429,
0.01698395609855652,
-0.080274298787117,
-0.08503083139657974,
-0.08855092525482178,
0.02934318780899048,
-0.04206226393580437,
-0.0005070698098279536,
-0.016694286838173866,
0.05331597477197647,
0.08743599057197571,
-0.13893112540245056,
0.013482227921485901,
-0.0016692167846485972,
-0.05777290090918541,
-0.027450915426015854,
0.013316880911588669,
0.06624120473861694,
0.08605685085058212,
-0.01587754301726818,
0.05548153072595596,
0.003485875204205513,
0.18160180747509003,
-0.07680200785398483,
0.029651453718543053,
0.1059357225894928,
0.060516007244586945,
0.039135485887527466,
0.12307721376419067,
0.015483487397432327,
-0.06194330379366875,
0.00330370687879622,
0.0652458444237709,
-0.00013548394781537354,
-0.2547738254070282,
-0.04034643620252609,
-0.05810089409351349,
-0.013428832404315472,
0.08689581602811813,
0.05717068910598755,
-0.0033199144527316093,
0.07359970360994339,
-0.002120975637808442,
0.04501985386013985,
0.0012607263633981347,
0.07751303911209106,
0.10944364219903946,
0.06873055547475815,
0.0784284919500351,
-0.052521318197250366,
0.004914790857583284,
0.08958493173122406,
0.020820148289203644,
0.25794053077697754,
-0.06859558075666428,
0.0821257159113884,
0.03460342437028885,
0.12024140357971191,
-0.012446017935872078,
0.07622021436691284,
0.03390973061323166,
-0.0007275210809893906,
0.046189747750759125,
-0.07353415340185165,
-0.0013883874053135514,
0.03484850376844406,
-0.003012530505657196,
0.04084751382470131,
-0.09215131402015686,
0.06579435616731644,
0.030176635831594467,
0.24602465331554413,
0.08021619915962219,
-0.27951645851135254,
-0.07709838449954987,
0.008850907906889915,
-0.02147877775132656,
-0.07649661600589752,
0.0010172859765589237,
0.13281382620334625,
-0.1461200714111328,
0.08500129729509354,
-0.06548987329006195,
0.09990154951810837,
-0.025792839005589485,
-0.039122242480516434,
0.08827824890613556,
0.12075095623731613,
0.026179615408182144,
0.09581030905246735,
-0.16712461411952972,
0.23605185747146606,
0.018018413335084915,
0.11391226947307587,
-0.02506960555911064,
0.05482003465294838,
0.03083769977092743,
0.106088787317276,
0.10932579636573792,
0.03430303558707237,
-0.12220074236392975,
-0.15886390209197998,
-0.08715400099754333,
0.02431556209921837,
0.07977814972400665,
-0.032925497740507126,
0.08271724730730057,
-0.05538613349199295,
-0.022622639313340187,
-0.002851717872545123,
-0.050534334033727646,
-0.14549462497234344,
-0.13497774302959442,
0.015890806913375854,
0.028866788372397423,
-0.04744089022278786,
-0.07435733079910278,
-0.10219293087720871,
0.00003261776510044001,
0.19322475790977478,
-0.00532486941665411,
-0.054690707474946976,
-0.14888733625411987,
0.038143329322338104,
0.14463704824447632,
-0.08765304833650589,
0.014401035383343697,
0.00433030491694808,
0.1336095780134201,
0.03231112286448479,
-0.07735630869865417,
0.031454846262931824,
-0.06822073459625244,
-0.1363525241613388,
-0.018778594210743904,
0.1173194944858551,
0.02497544325888157,
0.017955563962459564,
0.03238992765545845,
0.014602516777813435,
0.01752334088087082,
-0.08967321366071701,
-0.05464642494916916,
0.07922064512968063,
0.05368277430534363,
0.045811545103788376,
-0.06456765532493591,
0.025720620527863503,
-0.037441786378622055,
-0.008966086432337761,
0.08230522274971008,
0.19290858507156372,
-0.092181496322155,
0.048737477511167526,
0.035335712134838104,
-0.08772794902324677,
-0.18545091152191162,
0.06486526131629944,
0.12215187400579453,
0.06379760801792145,
0.010044234804809093,
-0.1347496509552002,
0.09485524892807007,
0.09179265052080154,
-0.01980249583721161,
0.04709329828619957,
-0.26751765608787537,
-0.13662554323673248,
0.06860131025314331,
0.10585520416498184,
0.019624492153525352,
-0.09826009720563889,
-0.0544452890753746,
-0.05165397748351097,
-0.13482046127319336,
0.06070130690932274,
-0.05839606001973152,
0.1010226234793663,
-0.02691163308918476,
0.05930829048156738,
0.0353659987449646,
-0.029846852645277977,
0.1701563596725464,
0.042438093572854996,
0.0527443066239357,
-0.05443824082612991,
0.09749418497085571,
0.07124854624271393,
-0.10150472074747086,
0.11206264048814774,
-0.063751719892025,
0.06431113928556442,
-0.165262371301651,
-0.02596319280564785,
-0.057596124708652496,
0.10733290761709213,
-0.057398054748773575,
-0.036565348505973816,
-0.0522160604596138,
0.07373311370611191,
0.07713719457387924,
-0.040544722229242325,
0.055688463151454926,
0.01402239315211773,
0.02935139089822769,
0.1363455057144165,
0.08249422162771225,
0.03400816023349762,
-0.13704168796539307,
0.03549765795469284,
-0.0005538035184144974,
0.06920138001441956,
-0.08384376019239426,
0.019779829308390617,
0.13630737364292145,
0.0005719669279642403,
0.10552334785461426,
0.012271641753613949,
-0.07405342161655426,
-0.02340901456773281,
0.07099705189466476,
-0.09161356836557388,
-0.09654610604047775,
-0.043161895126104355,
-0.06540607661008835,
-0.0729939267039299,
0.028342675417661667,
0.14171768724918365,
-0.07963944971561432,
0.001799275865778327,
-0.014531156048178673,
0.009929697960615158,
-0.05413125082850456,
0.2078038454055786,
0.030191859230399132,
0.06958356499671936,
-0.07017887383699417,
0.12930354475975037,
0.03206711262464523,
-0.09138286858797073,
0.03765885531902313,
0.06751646846532822,
-0.09938955307006836,
-0.030190523713827133,
0.07449828833341599,
0.12422369420528412,
-0.05263728275895119,
-0.09306734800338745,
-0.10242900252342224,
-0.10604368150234222,
0.03940688446164131,
0.0726112350821495,
0.06194993481040001,
0.017633015289902687,
-0.03340838477015495,
-0.024833647534251213,
-0.13268721103668213,
0.09688248485326767,
0.11041309684515,
0.05639231577515602,
-0.16937240958213806,
0.0937376320362091,
-0.0037112231366336346,
0.042568229138851166,
-0.004968198947608471,
0.02149217575788498,
-0.09536337852478027,
-0.03878255560994148,
-0.13026241958141327,
0.005043561104685068,
-0.039181310683488846,
-0.0028650008607655764,
-0.022788584232330322,
-0.02106386423110962,
-0.07771944999694824,
0.06536614894866943,
-0.07153888791799545,
-0.05065491795539856,
-0.017746612429618835,
0.05259397625923157,
-0.10029800236225128,
-0.005280294455587864,
0.009802187792956829,
-0.1113501638174057,
0.03563134744763374,
0.050824593752622604,
0.01438765786588192,
0.0559772290289402,
-0.017875367775559425,
0.004702860489487648,
0.0010229094186797738,
0.04914291948080063,
0.07878467440605164,
-0.10412203520536423,
0.03166899085044861,
0.0010391583200544119,
0.06352455914020538,
0.006325021851807833,
0.01698596030473709,
-0.1119961217045784,
-0.042096514254808426,
-0.06295222789049149,
-0.046867042779922485,
-0.06883063167333603,
0.0719313696026802,
0.09528859704732895,
0.054466430097818375,
0.1765100061893463,
-0.08558051288127899,
0.018153829500079155,
-0.16899284720420837,
-0.018031636252999306,
0.014107616618275642,
-0.05125083029270172,
-0.02192908339202404,
-0.029369834810495377,
0.08301428705453873,
-0.05632351338863373,
0.09626097977161407,
0.02873331308364868,
0.05935927480459213,
0.0412619523704052,
-0.1076808050274849,
0.012029627338051796,
0.02639160491526127,
0.12245870381593704,
0.030988939106464386,
0.012371701188385487,
0.042102664709091187,
-0.0070371804758906364,
0.03214409202337265,
0.07030559331178665,
0.16164636611938477,
0.1536421924829483,
-0.036913637071847916,
0.1288328319787979,
0.06637070327997208,
-0.062447383999824524,
-0.10834188014268875,
0.02174544148147106,
-0.09847258031368256,
0.12876491248607635,
-0.032050762325525284,
0.12030810117721558,
0.10716650635004044,
-0.21120832860469818,
0.07533777505159378,
-0.06991793215274811,
-0.12956960499286652,
-0.13029314577579498,
-0.13998575508594513,
-0.10086820274591446,
-0.12159886211156845,
0.029445629566907883,
-0.11638736724853516,
0.03864330053329468,
0.05966139957308769,
0.04485020041465759,
-0.01372991967946291,
0.139241561293602,
-0.08510854095220566,
0.0027497531846165657,
0.09150339663028717,
0.024922754615545273,
-0.0043312557972967625,
-0.0078479228541255,
-0.00558820366859436,
-0.007976124063134193,
-0.02337813563644886,
0.039763059467077255,
-0.03136048465967178,
-0.0031822402961552143,
0.018041636794805527,
0.020616449415683746,
-0.0423470176756382,
0.010487476363778114,
-0.0025225935969501734,
0.038440197706222534,
0.03886309638619423,
0.06803884357213974,
-0.028436146676540375,
-0.047654833644628525,
0.25910189747810364,
-0.0728466585278511,
-0.07292469590902328,
-0.17011971771717072,
0.14618037641048431,
0.06507165729999542,
0.007641978096216917,
0.06371961534023285,
-0.11078844219446182,
-0.03780805319547653,
0.15875481069087982,
0.15076470375061035,
0.0014100726693868637,
-0.032601676881313324,
-0.02581600286066532,
-0.019181575626134872,
-0.05392644926905632,
0.10291449725627899,
0.05741008371114731,
0.09684467315673828,
-0.022572776302695274,
-0.0024837569799274206,
-0.02597300335764885,
-0.022943174466490746,
-0.05247873812913895,
0.11574622988700867,
0.011765889823436737,
-0.041108183562755585,
-0.02794056199491024,
0.0733204185962677,
-0.012302055023610592,
-0.139896422624588,
0.08252546936273575,
-0.1394433230161667,
-0.18539537489414215,
-0.048531752079725266,
0.0745520070195198,
-0.028409171849489212,
0.07550954818725586,
-0.022024543955922127,
-0.014478754252195358,
0.13252680003643036,
-0.0032128291204571724,
-0.08078006654977798,
-0.13144055008888245,
0.053969427943229675,
-0.08265061676502228,
0.2111559957265854,
0.02104784920811653,
0.05995139107108116,
0.11214970052242279,
0.0007442072383128107,
-0.1257227510213852,
0.029842032119631767,
0.07272740453481674,
-0.0936637818813324,
0.044311363250017166,
0.16611602902412415,
-0.06101532280445099,
0.07217588275671005,
0.019488131627440453,
-0.10686878859996796,
-0.019367948174476624,
-0.03944353759288788,
0.00898838136345148,
-0.09336705505847931,
0.01970972679555416,
-0.08066241443157196,
0.13444414734840393,
0.2562810778617859,
-0.02751174196600914,
-0.018738647922873497,
-0.08363793790340424,
0.07419072091579437,
0.03495332598686218,
0.05851321667432785,
0.028866592794656754,
-0.1804828643798828,
0.0021199837792664766,
-0.029374420642852783,
0.03196193650364876,
-0.20486664772033691,
-0.0949300080537796,
0.01429442223161459,
-0.08124013245105743,
-0.047282494604587555,
0.1293577253818512,
0.04407218471169472,
0.04110918566584587,
-0.03992997109889984,
-0.06749698519706726,
-0.03486732020974159,
0.11407554894685745,
-0.12897206842899323,
-0.0883057713508606
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training (a matching `BitsAndBytesConfig` sketch appears after this list):
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
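
Expressed as a `transformers` `BitsAndBytesConfig`, the settings above would look roughly like the sketch below; this is an illustration of the listed values, not the original training code.

```python
# Hedged sketch: the card's quantization settings as a BitsAndBytesConfig.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```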
### Framework versions
- PEFT 0.5.0
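
To use the adapter, it has to be attached to a base model loaded with the same 4-bit settings. The card does not name the base checkpoint; the sketch below assumes `meta-llama/Llama-2-7b-hf` purely because the adapter id mentions llama2, so treat that repo id as a placeholder.

```python
# Hedged sketch: load a 4-bit base model (assumed, not stated in the card)
# and attach this adapter with PEFT.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(  # same values as listed in this card
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # assumption; the card does not name the base model
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(
    base_model,
    "noza-kit/Adapter_llama2_translate_Q_enpt_ex3-3epoch",  # adapter id from this page
)
```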
| {"library_name": "peft"} | null | noza-kit/Adapter_llama2_translate_Q_enpt_ex3-3epoch | [
"peft",
"safetensors",
"region:us"
] | 2024-02-12T12:30:57+00:00 | [] | [] | TAGS
#peft #safetensors #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: bfloat16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #safetensors #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: bfloat16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
14,
164,
11
] | [
"passage: TAGS\n#peft #safetensors #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: bfloat16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.07453527301549911,
0.05672025308012962,
-0.0029302015900611877,
0.12918446958065033,
0.08402831107378006,
0.03368997201323509,
0.11096592992544174,
0.10542113333940506,
0.02287278324365616,
0.0953136682510376,
0.1131134033203125,
0.03712237998843193,
0.04852422699332237,
0.1885252743959427,
-0.031569480895996094,
0.030067527666687965,
0.048593711107969284,
-0.010659906081855297,
0.01831846870481968,
0.09225422143936157,
0.05085039138793945,
-0.06070663407444954,
0.02455233968794346,
-0.09134310483932495,
-0.13494300842285156,
0.004855876322835684,
0.016481900587677956,
0.015014139004051685,
0.04234103485941887,
0.025766482576727867,
0.08055310696363449,
0.016672568395733833,
-0.025261245667934418,
-0.2119787186384201,
-0.006826059427112341,
0.11276665329933167,
-0.026966989040374756,
0.06058482825756073,
-0.0768154039978981,
0.11133003234863281,
-0.12787394225597382,
-0.06762976199388504,
0.00870905164629221,
0.011327606625854969,
-0.08346157521009445,
-0.12404114753007889,
-0.05803650990128517,
0.07093892246484756,
0.047375280410051346,
0.06703553348779678,
-0.020026782527565956,
0.19431978464126587,
-0.13521946966648102,
0.09132549166679382,
0.07933776825666428,
-0.24198181927204132,
-0.03482532873749733,
0.09177332371473312,
-0.017464879900217056,
0.1635972559452057,
-0.08187544345855713,
-0.0939556434750557,
0.0728202536702156,
0.03404154255986214,
0.005561288446187973,
-0.011645126156508923,
-0.10962631553411484,
-0.0012364726280793548,
-0.14694398641586304,
-0.041757646948099136,
0.12922339141368866,
0.031558793038129807,
-0.057849276810884476,
-0.0561692900955677,
-0.10106953978538513,
-0.345947265625,
0.031990889459848404,
-0.02434849739074707,
-0.07495184987783432,
0.036499906331300735,
-0.00704183941707015,
-0.0090585732832551,
-0.02461344748735428,
-0.0765213891863823,
-0.02301194705069065,
0.08447002619504929,
0.05495472252368927,
0.019824255257844925,
0.007900258526206017,
0.10671905428171158,
-0.09520875662565231,
-0.035336196422576904,
-0.04644833132624626,
-0.03745805844664574,
-0.03382916375994682,
-0.01123035978525877,
-0.053050022572278976,
0.16669845581054688,
0.09249624609947205,
0.14213572442531586,
-0.18807707726955414,
0.11025521159172058,
-0.028741484507918358,
0.045033495873212814,
-0.04428386315703392,
0.01591639779508114,
-0.10671577602624893,
0.11309785395860672,
0.04467947781085968,
0.1684921830892563,
0.04472010210156441,
-0.02066531777381897,
-0.04747699573636055,
-0.007422163616865873,
0.17650453746318817,
0.0232903603464365,
-0.09967294335365295,
0.020891545340418816,
-0.1315230280160904,
-0.004083019215613604,
0.08347529172897339,
-0.08486035466194153,
0.009008309803903103,
0.04072410985827446,
-0.031658269464969635,
-0.0587322898209095,
0.11448393017053604,
-0.04607329145073891,
-0.020696036517620087,
-0.021331876516342163,
-0.08345860987901688,
-0.030993575230240822,
-0.08431331068277359,
-0.13357765972614288,
0.05287472531199455,
-0.15527646243572235,
0.0028629989828914404,
-0.07735834270715714,
-0.06110675632953644,
0.03965908661484718,
-0.013271228410303593,
-0.051117151975631714,
0.07370369136333466,
-0.08409968763589859,
-0.1512153595685959,
-0.03397209942340851,
0.0015836912207305431,
-0.013690887950360775,
-0.028606683015823364,
0.10426909476518631,
0.04597330093383789,
0.10466328263282776,
-0.14548587799072266,
-0.02373979240655899,
-0.02897789515554905,
0.0752343013882637,
0.031582895666360855,
0.10575884580612183,
-0.09681099653244019,
-0.02667132206261158,
-0.05423391982913017,
-0.051506925374269485,
-0.1155124232172966,
-0.02089465595781803,
0.1303206980228424,
0.11001842468976974,
-0.1436445564031601,
0.003421124769374728,
0.12492787092924118,
-0.05372503399848938,
-0.09895416349172592,
0.16517679393291473,
-0.04658776894211769,
0.09611191600561142,
-0.02273191697895527,
0.07429033517837524,
0.23439718782901764,
-0.11924499273300171,
-0.01995956152677536,
0.09844992309808731,
0.058452147990465164,
0.011279714293777943,
0.008250311948359013,
0.07656245678663254,
-0.12388023734092712,
0.028743496164679527,
0.07123924046754837,
0.045590583235025406,
-0.05937817320227623,
-0.060832101851701736,
-0.03297826275229454,
-0.0714627206325531,
0.12664875388145447,
0.018630238249897957,
-0.0026331294793635607,
-0.07893449813127518,
-0.07984703779220581,
0.10442128777503967,
0.11084877699613571,
-0.02576097659766674,
-0.011570636183023453,
-0.1332901567220688,
0.01385668758302927,
-0.06000708043575287,
0.017119701951742172,
-0.10608156770467758,
-0.018460994586348534,
0.08949125558137894,
-0.04138871654868126,
-0.004088582005351782,
-0.004524412099272013,
0.0631813183426857,
0.05650990083813667,
-0.061315808445215225,
-0.01833820343017578,
-0.02623862586915493,
0.013025004416704178,
-0.08827390521764755,
-0.0755123719573021,
0.016560645774006844,
-0.018709814175963402,
0.2513205409049988,
-0.13908502459526062,
0.04999112710356712,
0.09971005469560623,
-0.00987799372524023,
0.007947816513478756,
-0.034925658255815506,
-0.048242103308439255,
0.1002047136425972,
-0.0127094192430377,
-0.03627065196633339,
0.033551398664712906,
0.03157352656126022,
-0.059017281979322433,
-0.1434866487979889,
-0.1126585379242897,
0.06866701692342758,
0.12563154101371765,
0.09638676792383194,
-0.0775245949625969,
-0.05462497100234032,
-0.026248624548316002,
-0.03657117113471031,
0.03836259990930557,
-0.04828217253088951,
0.004764229524880648,
0.0010749101638793945,
0.06939499825239182,
-0.11714894324541092,
-0.03485434129834175,
0.06400632113218307,
-0.038004856556653976,
-0.039996661245822906,
0.1087803915143013,
0.010819503106176853,
-0.12793982028961182,
0.07559659332036972,
0.08842159062623978,
-0.13383613526821136,
0.09590750932693481,
-0.0028637137729674578,
-0.005309233907610178,
-0.09237763285636902,
0.19879533350467682,
0.03237396851181984,
0.11132846027612686,
-0.09295126795768738,
0.11590158939361572,
-0.01718589849770069,
0.00020890985615551472,
0.057636335492134094,
-0.19735373556613922,
-0.0064532519318163395,
-0.04112154245376587,
-0.08197004348039627,
-0.05563674867153168,
-0.009490575641393661,
0.005312664434313774,
0.0359378457069397,
-0.015816813334822655,
0.06244548037648201,
0.1411624401807785,
-0.011706400662660599,
-0.09129936248064041,
0.19432353973388672,
-0.22053758800029755,
-0.23859453201293945,
-0.2090514898300171,
0.023685840889811516,
-0.10573247075080872,
-0.024484699591994286,
-0.02527044154703617,
-0.06987837702035904,
0.037737224251031876,
-0.10980663448572159,
-0.07130855321884155,
0.028474964201450348,
0.012113752774894238,
0.0170458871871233,
0.022363774478435516,
0.17315997183322906,
-0.07409664243459702,
0.02882210724055767,
0.052968453615903854,
-0.05349377915263176,
0.12976272404193878,
-0.06462682038545609,
-0.005862697493284941,
0.09855834394693375,
-0.019562197849154472,
0.0032337920274585485,
0.005966575816273689,
0.312248557806015,
0.014505959115922451,
0.03919876366853714,
0.09490207582712173,
-0.0005610818625427783,
0.056110724806785583,
0.10383185744285583,
0.00473805470392108,
-0.09106523543596268,
0.08092432469129562,
0.045209795236587524,
-0.07773471623659134,
-0.12484415620565414,
-0.034405000507831573,
-0.04292053356766701,
0.032136041671037674,
0.07482845336198807,
0.06710931658744812,
0.08111753314733505,
0.07620391994714737,
0.011253349483013153,
0.06221330165863037,
0.0378451906144619,
0.003082308918237686,
0.0887143686413765,
-0.010844302363693714,
0.05958619713783264,
-0.01787867583334446,
0.023802412673830986,
0.04368376359343529,
0.12984830141067505,
0.06453941017389297,
-0.0877237617969513,
-0.012963186018168926,
0.05976909399032593,
0.30079957842826843,
0.001703330664895475,
0.09524872899055481,
-0.06422717124223709,
-0.01914786919951439,
-0.004450966138392687,
-0.04535892978310585,
-0.0732153058052063,
0.028908109292387962,
-0.050591856241226196,
0.09057585150003433,
-0.012144862674176693,
0.01970677264034748,
0.08399123698472977,
0.08554349094629288,
0.16883639991283417,
-0.2827686667442322,
-0.10914783924818039,
-0.005156747531145811,
0.11810720711946487,
-0.09273596853017807,
0.021748976781964302,
0.23115390539169312,
0.03409026190638542,
-0.08356752246618271,
-0.029648557305336,
0.011595956981182098,
0.0012271356536075473,
0.014364535920321941,
0.09888807684183121,
0.1083734855055809,
-0.010421541519463062,
0.07215214520692825,
-0.30064713954925537,
0.03492109104990959,
0.06463161110877991,
0.051675330847501755,
-0.016254767775535583,
0.004377559758722782,
-0.0681276023387909,
-0.039601437747478485,
0.05475940927863121,
0.0032947761937975883,
0.1653853803873062,
-0.2836952209472656,
-0.08245179802179337,
0.0040844217874109745,
0.10798820108175278,
0.07251309603452682,
0.04867156967520714,
0.024016045033931732,
0.03471764549612999,
0.06680657714605331,
0.06555314362049103,
-0.053423378616571426,
-0.09334948658943176,
-0.013234530575573444,
0.16782410442829132,
-0.12056409567594528,
-0.08494801074266434,
-0.039446670562028885,
-0.032574962824583054,
0.01528918743133545,
-0.18656723201274872,
-0.06157201528549194,
-0.04652497172355652,
0.014009855687618256,
0.11560789495706558,
-0.043703049421310425,
0.017459457740187645,
-0.025330526754260063,
0.03267478197813034,
-0.049356844276189804,
-0.06448514014482498,
0.10717439651489258,
-0.05839689448475838,
-0.14560763537883759,
-0.050329964607954025,
0.13346117734909058,
0.06817521154880524,
-0.02085650898516178,
-0.06815460324287415,
-0.054959241300821304,
0.03404691442847252,
-0.12688283622264862,
-0.00444478215649724,
0.09098512679338455,
-0.053426969796419144,
0.08297149091959,
-0.12247109413146973,
0.14400608837604523,
-0.05491652712225914,
0.08397864550352097,
0.05611664056777954,
0.3305107653141022,
-0.08004265278577805,
0.01135397981852293,
0.11764869838953018,
-0.004719098098576069,
-0.2637125849723816,
0.026469850912690163,
0.05434798076748848,
0.029891585931181908,
-0.0282036941498518,
-0.1317860335111618,
0.037872206419706345,
0.08485037833452225,
0.0007513535092584789,
0.15252260863780975,
-0.31008192896842957,
-0.07138115912675858,
0.035853061825037,
0.07493745535612106,
0.11727992445230484,
-0.055897410959005356,
0.015814758837223053,
0.007999288849532604,
-0.0064612519927322865,
0.14194786548614502,
-0.1429509073495865,
0.07896118611097336,
0.006883859634399414,
0.0021183656062930822,
0.0044470918364822865,
-0.050903916358947754,
0.1352005898952484,
-0.015023368410766125,
0.10188204795122147,
0.018910063430666924,
-0.03483083099126816,
0.04656843841075897,
-0.07938482612371445,
0.032474253326654434,
-0.07549914717674255,
0.0974266454577446,
-0.01371077448129654,
0.008442017249763012,
-0.06518817692995071,
-0.0011418788926675916,
-0.07654297351837158,
-0.05935432016849518,
-0.10049846023321152,
0.08921563625335693,
0.00007475275924662128,
-0.029378065839409828,
-0.05557582154870033,
0.04753444716334343,
0.07362274825572968,
0.44554808735847473,
-0.047555696219205856,
-0.0519176721572876,
0.0978567972779274,
0.08912154287099838,
-0.03383857384324074,
0.09002438932657242,
-0.1082867756485939,
0.053948406130075455,
0.09869905561208725,
0.0014765063533559442,
0.12464144825935364,
0.0727442055940628,
-0.10421831160783768,
-0.002084739739075303,
0.04356558620929718,
-0.1612345427274704,
-0.07101737707853317,
-0.01960318349301815,
0.025721555575728416,
-0.10159236192703247,
0.02594982646405697,
0.10416273027658463,
-0.041569165885448456,
0.04309748485684395,
0.018699878826737404,
0.052913274616003036,
-0.11190131306648254,
0.15306450426578522,
0.03502379730343819,
0.07957912236452103,
-0.06998652964830399,
0.0924820825457573,
0.0367986336350441,
-0.0026913080364465714,
0.04025959596037865,
-0.0277254655957222,
-0.11716416478157043,
-0.0048188441433012486,
-0.02993469499051571,
-0.07783868908882141,
0.12412923574447632,
-0.049683719873428345,
-0.03942154347896576,
-0.09436392039060593,
-0.010484152473509312,
0.11176150292158127,
0.042228322476148605,
0.09517300873994827,
0.01405021920800209,
0.01321012619882822,
-0.13630636036396027,
0.10785749554634094,
-0.02372516505420208,
0.02938421256840229,
-0.14647571742534637,
0.07298747450113297,
-0.014822459779679775,
0.04179554060101509,
-0.016972851008176804,
0.006439431104809046,
-0.20735956728458405,
0.02178529091179371,
-0.035037361085414886,
0.03156621381640434,
0.04145469143986702,
0.03050958178937435,
0.013928748667240143,
0.058162983506917953,
-0.024889351800084114,
0.03678032383322716,
-0.016862498596310616,
-0.03274361044168472,
0.0466717965900898,
-0.00867349375039339,
-0.05397464334964752,
-0.06672143191099167,
0.04484523460268974,
-0.10464757680892944,
0.05115405097603798,
0.016241174191236496,
-0.06263012439012527,
0.0682709813117981,
0.01046114694327116,
0.031328972429037094,
0.09963250905275345,
0.039484091103076935,
0.02428165078163147,
-0.07200931757688522,
0.03696676716208458,
-0.011100761592388153,
-0.0354752354323864,
0.04846044257283211,
0.12916992604732513,
-0.04588703811168671,
-0.045397818088531494,
-0.12153512239456177,
0.009334450587630272,
-0.05025159940123558,
0.03245856612920761,
0.14497186243534088,
0.06792300939559937,
0.10182499140501022,
-0.10694753378629684,
-0.03463906794786453,
-0.13659638166427612,
-0.08420486003160477,
0.043016206473112106,
-0.061113789677619934,
-0.08368314057588577,
-0.022373097017407417,
0.06458088755607605,
-0.0062421164475381374,
0.1159406527876854,
-0.09249752014875412,
-0.08400550484657288,
-0.051796942949295044,
-0.21874475479125977,
-0.10985803604125977,
0.01249344926327467,
0.2591431438922882,
0.014574572443962097,
-0.04733354225754738,
-0.0842360183596611,
-0.0018487382913008332,
0.07433507591485977,
0.1151801347732544,
0.025576947256922722,
0.11712482571601868,
-0.1345689445734024,
0.09369487315416336,
0.03894540295004845,
-0.04579859972000122,
0.11140527576208115,
0.2720058560371399,
-0.08403877168893814,
0.012069866992533207,
-0.0936768651008606,
0.10660451650619507,
0.041208669543266296,
-0.1143890917301178,
-0.01654689572751522,
-0.02938520349562168,
-0.14252008497714996,
-0.11252091079950333,
0.02446169964969158,
-0.07867883145809174,
-0.15200982987880707,
-0.027023719623684883,
-0.10641223937273026,
-0.0906573235988617,
0.06777424365282059,
0.03525780513882637,
-0.04030640423297882,
0.23540270328521729,
-0.042789097875356674,
0.044019024819135666,
-0.010767166502773762,
-0.004977116361260414,
-0.019459092989563942,
0.0007784778717905283,
-0.10880770534276962,
0.14135625958442688,
0.015247986651957035,
0.0905333161354065,
0.000977704650722444,
0.08683203905820847,
0.04910573735833168,
-0.030855052173137665,
-0.05393563210964203,
-0.012257409282028675,
0.01566295325756073,
-0.054429829120635986,
0.107729472219944,
0.053526509553194046,
-0.08524296432733536,
-0.06813377141952515,
-0.0031761161517351866,
-0.06653166562318802,
-0.055667679756879807,
-0.1370646208524704,
0.26870331168174744,
-0.02722533978521824,
0.11592331528663635,
-0.0014433804899454117,
-0.06927157193422318,
-0.06387840211391449,
0.13213929533958435,
0.11873839050531387,
-0.10775790363550186,
0.00790384877473116,
0.059936534613370895,
-0.0030022691935300827,
-0.10397547483444214,
0.15089398622512817,
0.08754771947860718,
-0.006650677416473627,
0.03205007687211037,
-0.00015904195606708527,
-0.02111419290304184,
0.010007224045693874,
-0.01973712630569935,
-0.02886245958507061,
-0.0003920215822290629,
0.05424710735678673,
-0.1387188881635666,
-0.028425097465515137,
-0.05750333145260811,
-0.08240268379449844,
0.15211033821105957,
-0.12902764976024628,
-0.07242991775274277,
-0.034308061003685,
-0.05497187003493309,
-0.10183306783437729,
0.01747305318713188,
-0.1057824194431305,
0.06574340909719467,
0.052960846573114395,
-0.05126681923866272,
0.02095749042928219,
-0.035985786467790604,
-0.022899987176060677,
0.04471733793616295,
0.11470385640859604,
-0.0020694206468760967,
0.0627727061510086,
0.1270943284034729,
-0.04092846438288689,
-0.07114578038454056,
0.13854539394378662,
0.025652283802628517,
-0.04725794494152069,
-0.12916885316371918,
0.02968071587383747,
-0.028041580691933632,
0.14007607102394104,
0.035487476736307144,
-0.06207725778222084,
-0.00884179212152958,
-0.211495503783226,
-0.01880897954106331,
-0.14793545007705688,
-0.05283280089497566,
-0.05305294319987297,
0.10631292313337326,
0.14952032268047333,
-0.06082995608448982,
0.032887052744627,
-0.02885325811803341,
0.05036427453160286,
-0.03952554613351822,
0.07914081960916519,
0.03381401672959328,
-0.15669609606266022,
0.07804804295301437,
-0.044815924018621445,
0.022120162844657898,
-0.3223344087600708,
0.011946987360715866,
0.0014338381588459015,
-0.031393930315971375,
-0.059358149766922,
0.14773760735988617,
0.031608715653419495,
0.06965851038694382,
-0.06343872100114822,
-0.24901317059993744,
-0.06645871698856354,
0.13729976117610931,
0.007686936762183905,
-0.07767429202795029
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank2
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 4.5797
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.8439 | 1.0 | 180 | 9.7690 |
| 8.174 | 2.0 | 360 | 5.4265 |
| 5.4657 | 3.0 | 540 | 4.5797 |
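
As a rough consistency check, the step counts in the table line up with the hyperparameters above: 180 optimizer steps per epoch at a train batch size of 8 suggests on the order of 1,440 training examples, and 3 epochs × 180 steps gives the final step count of 540. The arithmetic below is an inference from the card, assuming a single device and no gradient accumulation (neither is stated).

```python
# Back-of-the-envelope check derived from the table and hyperparameters above.
steps_per_epoch = 180   # from the results table
train_batch_size = 8    # from the hyperparameters
num_epochs = 3

approx_train_examples = steps_per_epoch * train_batch_size  # ~1440 examples (assumes a single device)
total_steps = steps_per_epoch * num_epochs                   # 540, matching the last table row
print(approx_train_examples, total_steps)
```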
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank2", "results": []}]} | null | alitolga/deberta-v3-base-rank2 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T12:31:13+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank2
=====================
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 4.5797
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank4
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8714
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
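For reference, a minimal sketch of how these settings could be expressed with the `transformers` `TrainingArguments` API is shown below. It only mirrors the configuration listed above; the output directory is a placeholder, and the task head and dataset for this run are not documented.

```python
# Sketch only: mirrors the hyperparameters listed above using the
# transformers TrainingArguments API. The dataset, task head, and output
# directory are placeholders since they are not documented for this run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-base-rank4",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
print(training_args.to_dict())
```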
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.9933 | 1.0 | 179 | 8.9588 |
| 7.6099 | 2.0 | 358 | 5.4768 |
| 5.6275 | 3.0 | 537 | 4.8714 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank4", "results": []}]} | null | alitolga/deberta-v3-base-rank4 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T12:35:22+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank4
=====================
This model is a fine-tuned version of microsoft/deberta-v3-base on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 4.8714
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | null |
# MusicGen - Melody - 1.5B
Audiocraft provides the code and models for MusicGen, a simple and controllable model for music generation.
MusicGen is a single stage auto-regressive Transformer model trained over a 32kHz EnCodec tokenizer with 4 codebooks sampled at 50 Hz.
Unlike existing methods like MusicLM, MusicGen doesn't require a self-supervised semantic representation, and it generates all 4 codebooks in one pass.
By introducing a small delay between the codebooks, we show we can predict them in parallel, thus having only 50 auto-regressive steps per second of audio.
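As a rough illustration of the step count (an illustrative calculation only, not the actual decoding code), a one-frame delay per codebook keeps the number of auto-regressive steps close to the frame rate rather than scaling with the number of codebooks:

```py
# Illustrative arithmetic only: with 4 codebooks at a 50 Hz frame rate and a
# one-frame delay between consecutive codebooks, decoding needs roughly
# duration * 50 auto-regressive steps instead of duration * 50 * 4.
frame_rate_hz = 50
num_codebooks = 4
duration_s = 8

steps_with_delay_pattern = duration_s * frame_rate_hz + (num_codebooks - 1)
steps_fully_sequential = duration_s * frame_rate_hz * num_codebooks
print(steps_with_delay_pattern)   # 403
print(steps_fully_sequential)     # 1600
```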
MusicGen was published in [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284) by *Jade Copet, Felix Kreuk, Itai Gat, Tal Remez, David Kant, Gabriel Synnaeve, Yossi Adi, Alexandre Défossez*.
Four checkpoints are released:
- [small](https://huggingface.co/facebook/musicgen-small)
- [medium](https://huggingface.co/facebook/musicgen-medium)
- [large](https://huggingface.co/facebook/musicgen-large)
- [**melody** (this checkpoint)](https://huggingface.co/facebook/musicgen-melody)
## Example
Try out MusicGen yourself!
- <a target="_blank" href="https://colab.research.google.com/drive/1fxGqfg96RBUvGxZ1XXN07s3DthrKUl4-?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
- <a target="_blank" href="https://huggingface.co/spaces/facebook/MusicGen">
<img src="https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg" alt="Open in HugginFace"/>
</a>
- You can run MusicGen locally as well:
1. First install the [`audiocraft` library](https://github.com/facebookresearch/audiocraft)
```
pip install git+https://github.com/facebookresearch/audiocraft.git
```
2. Make sure to have [`ffmpeg`](https://ffmpeg.org/download.html) installed:
```
apt-get install ffmpeg
```
3. Run the following Python code:
```py
import torchaudio
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write
model = MusicGen.get_pretrained('melody')
model.set_generation_params(duration=8) # generate 8 seconds.
descriptions = ['happy rock', 'energetic EDM', 'sad jazz']
melody, sr = torchaudio.load('./assets/bach.mp3')
# generates using the melody from the given audio and the provided descriptions.
wav = model.generate_with_chroma(descriptions, melody[None].expand(3, -1, -1), sr)
for idx, one_wav in enumerate(wav):
    # Will save under {idx}.wav, with loudness normalization at -14 dB LUFS.
    audio_write(f'{idx}', one_wav.cpu(), model.sample_rate, strategy="loudness")
```
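If no reference melody is available, the same checkpoint can also be prompted with text alone. A minimal sketch, assuming the `model` and `audio_write` objects from the snippet above are in scope, would be:

```py
# Sketch: text-only generation with the same model as above (no chroma conditioning).
descriptions = ['happy rock', 'energetic EDM', 'sad jazz']
wav = model.generate(descriptions)
for idx, one_wav in enumerate(wav):
    audio_write(f'text_only_{idx}', one_wav.cpu(), model.sample_rate, strategy="loudness")
```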
## Model details
**Organization developing the model:** The FAIR team of Meta AI.
**Model date:** MusicGen was trained between April 2023 and May 2023.
**Model version:** This is the version 1 of the model.
**Model type:** MusicGen consists of an EnCodec model for audio tokenization and an auto-regressive language model based on the transformer architecture for music modeling. The model comes in different sizes: 300M, 1.5B and 3.3B parameters; and in two variants: a model trained for the text-to-music generation task and a model trained for melody-guided music generation.
**Paper or resources for more information:** More information can be found in the paper [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284).
**Citation details:**
```
@misc{copet2023simple,
title={Simple and Controllable Music Generation},
author={Jade Copet and Felix Kreuk and Itai Gat and Tal Remez and David Kant and Gabriel Synnaeve and Yossi Adi and Alexandre Défossez},
year={2023},
eprint={2306.05284},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
```
**License:** Code is released under MIT, model weights are released under CC-BY-NC 4.0.
**Where to send questions or comments about the model:** Questions and comments about MusicGen can be sent via the [Github repository](https://github.com/facebookresearch/audiocraft) of the project, or by opening an issue.
## Intended use
**Primary intended use:** The primary use of MusicGen is research on AI-based music generation, including:
- Research efforts, such as probing and better understanding the limitations of generative models to further improve the state of science
- Generation of music guided by text or melody to understand current abilities of generative AI models by machine learning amateurs
**Primary intended users:** The primary intended users of the model are researchers in audio, machine learning and artificial intelligence, as well as amateurs seeking to better understand those models.
**Out-of-scope use cases:** The model should not be used on downstream applications without further risk evaluation and mitigation. The model should not be used to intentionally create or disseminate music pieces that create hostile or alienating environments for people. This includes generating music that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
## Metrics
**Model performance measures:** We used the following objective measures to evaluate the model on a standard music benchmark:
- Frechet Audio Distance computed on features extracted from a pre-trained audio classifier (VGGish)
- Kullback-Leibler Divergence on label distributions extracted from a pre-trained audio classifier (PaSST)
- CLAP Score between audio embedding and text embedding extracted from a pre-trained CLAP model
Additionally, we run qualitative studies with human participants, evaluating the performance of the model with the following axes:
- Overall quality of the music samples;
- Text relevance to the provided text input;
- Adherence to the melody for melody-guided music generation.
More details on performance measures and human studies can be found in the paper.
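For intuition on the second objective metric, a minimal sketch of a Kullback-Leibler divergence between two label distributions is shown below; the distributions here are placeholders, whereas the reported metric uses label probabilities from a pre-trained PaSST classifier on reference and generated audio.

```py
# Sketch only: KL divergence D(p || q) between two categorical label
# distributions. In the actual evaluation, p and q come from a pre-trained
# audio classifier applied to reference and generated audio, respectively.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

p = [0.7, 0.2, 0.1]  # placeholder label distribution for reference audio
q = [0.6, 0.3, 0.1]  # placeholder label distribution for generated audio
print(kl_divergence(p, q))
```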
**Decision thresholds:** Not applicable.
## Evaluation datasets
The model was evaluated on the [MusicCaps benchmark](https://www.kaggle.com/datasets/googleai/musiccaps) and on an in-domain held-out evaluation set, with no artist overlap with the training set.
## Training datasets
The model was trained on licensed data using the following sources: the [Meta Music Initiative Sound Collection](https://www.fb.com/sound), [Shutterstock music collection](https://www.shutterstock.com/music) and the [Pond5 music collection](https://www.pond5.com/). See the paper for more details about the training set and corresponding preprocessing.
## Evaluation results
Below are the objective metrics obtained on MusicCaps with the released model. Note that for the publicly released models, we had all the datasets go through a state-of-the-art music source separation method, namely using the open source [Hybrid Transformer for Music Source Separation](https://github.com/facebookresearch/demucs) (HT-Demucs), in order to keep only the instrumental part. This explains the difference in objective metrics with the models used in the paper.
| Model | Frechet Audio Distance | KLD | Text Consistency | Chroma Cosine Similarity |
|---|---|---|---|---|
| facebook/musicgen-small | 4.88 | 1.42 | 0.27 | - |
| facebook/musicgen-medium | 5.14 | 1.38 | 0.28 | - |
| facebook/musicgen-large | 5.48 | 1.37 | 0.28 | - |
| **facebook/musicgen-melody** | 4.93 | 1.41 | 0.27 | 0.44 |
More information can be found in the paper [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284), in the Results section.
## Limitations and biases
**Data:** The data sources used to train the model are created by music professionals and covered by legal agreements with the right holders. The model is trained on 20K hours of data; we believe that scaling the model on larger datasets can further improve its performance.
**Mitigations:** Vocals have been removed from the data source using corresponding tags, and then using a state-of-the-art music source separation method, namely using the open source [Hybrid Transformer for Music Source Separation](https://github.com/facebookresearch/demucs) (HT-Demucs).
**Limitations:**
- The model is not able to generate realistic vocals.
- The model has been trained with English descriptions and will not perform as well in other languages.
- The model does not perform equally well for all music styles and cultures.
- The model sometimes generates the end of a song, collapsing to silence.
- It is sometimes difficult to assess what types of text descriptions provide the best generations. Prompt engineering may be required to obtain satisfying results.
**Biases:** The data source potentially lacks diversity, and not all music cultures are equally represented in the dataset. The model may not perform equally well on the wide variety of music genres that exist. The generated samples from the model will reflect the biases from the training data. Further work on this model should include methods for balanced and just representations of cultures, for example, by scaling the training data to be both diverse and inclusive.
**Risks and harms:** Biases and limitations of the model may lead to generation of samples that may be considered biased, inappropriate or offensive. We believe that providing the code to reproduce the research and train new models will allow the application to be broadened to new and more representative data.
**Use cases:** Users must be aware of the biases, limitations and risks of the model. MusicGen is a model developed for artificial intelligence research on controllable music generation. As such, it should not be used for downstream applications without further investigation and mitigation of risks. | {"license": "cc-by-nc-4.0", "tags": ["musicgen"], "inference": false} | null | favazmuhammed/musicgen-melody | [
"musicgen",
"arxiv:2306.05284",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-12T12:38:03+00:00 | [
"2306.05284"
] | [] | TAGS
#musicgen #arxiv-2306.05284 #license-cc-by-nc-4.0 #region-us
| MusicGen - Melody - 1.5B
========================
Audiocraft provides the code and models for MusicGen, a simple and controllable model for music generation.
MusicGen is a single stage auto-regressive Transformer model trained over a 32kHz EnCodec tokenizer with 4 codebooks sampled at 50 Hz.
Unlike existing methods like MusicLM, MusicGen doesn't require a self-supervised semantic representation, and it generates all 4 codebooks in one pass.
By introducing a small delay between the codebooks, we show we can predict them in parallel, thus having only 50 auto-regressive steps per second of audio.
MusicGen was published in Simple and Controllable Music Generation by *Jade Copet, Felix Kreuk, Itai Gat, Tal Remez, David Kant, Gabriel Synnaeve, Yossi Adi, Alexandre Défossez*.
Four checkpoints are released:
* small
* medium
* large
* melody (this checkpoint)
Example
-------
Try out MusicGen yourself!
* <a target="\_blank" href="URL
<img src="URL alt="Open In Colab"/>
* <a target="\_blank" href="URL
<img src="URL alt="Open in HugginFace"/>
* You can run MusicGen locally as well:
1. First install the 'audiocraft' library
2. Make sure to have 'ffmpeg' installed:
3. Run the following Python code:
Model details
-------------
Organization developing the model: The FAIR team of Meta AI.
Model date: MusicGen was trained between April 2023 and May 2023.
Model version: This is the version 1 of the model.
Model type: MusicGen consists of an EnCodec model for audio tokenization and an auto-regressive language model based on the transformer architecture for music modeling. The model comes in different sizes: 300M, 1.5B and 3.3B parameters; and in two variants: a model trained for the text-to-music generation task and a model trained for melody-guided music generation.
Paper or resources for more information: More information can be found in the paper Simple and Controllable Music Generation.
Citation details:
License: Code is released under MIT, model weights are released under CC-BY-NC 4.0.
Where to send questions or comments about the model: Questions and comments about MusicGen can be sent via the Github repository of the project, or by opening an issue.
Intended use
------------
Primary intended use: The primary use of MusicGen is research on AI-based music generation, including:
* Research efforts, such as probing and better understanding the limitations of generative models to further improve the state of science
* Generation of music guided by text or melody to understand current abilities of generative AI models by machine learning amateurs
Primary intended users: The primary intended users of the model are researchers in audio, machine learning and artificial intelligence, as well as amateurs seeking to better understand those models.
Out-of-scope use cases: The model should not be used on downstream applications without further risk evaluation and mitigation. The model should not be used to intentionally create or disseminate music pieces that create hostile or alienating environments for people. This includes generating music that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
Metrics
-------
Model performance measures: We used the following objective measures to evaluate the model on a standard music benchmark:
* Frechet Audio Distance computed on features extracted from a pre-trained audio classifier (VGGish)
* Kullback-Leibler Divergence on label distributions extracted from a pre-trained audio classifier (PaSST)
* CLAP Score between audio embedding and text embedding extracted from a pre-trained CLAP model
Additionally, we run qualitative studies with human participants, evaluating the performance of the model with the following axes:
* Overall quality of the music samples;
* Text relevance to the provided text input;
* Adherence to the melody for melody-guided music generation.
More details on performance measures and human studies can be found in the paper.
Decision thresholds: Not applicable.
Evaluation datasets
-------------------
The model was evaluated on the MusicCaps benchmark and on an in-domain held-out evaluation set, with no artist overlap with the training set.
Training datasets
-----------------
The model was trained on licensed data using the following sources: the Meta Music Initiative Sound Collection, Shutterstock music collection and the Pond5 music collection. See the paper for more details about the training set and corresponding preprocessing.
Evaluation results
------------------
Below are the objective metrics obtained on MusicCaps with the released model. Note that for the publicly released models, we had all the datasets go through a state-of-the-art music source separation method, namely using the open source Hybrid Transformer for Music Source Separation (HT-Demucs), in order to keep only the instrumental part. This explains the difference in objective metrics with the models used in the paper.
More information can be found in the paper Simple and Controllable Music Generation, in the Results section.
Limitations and biases
----------------------
Data: The data sources used to train the model are created by music professionals and covered by legal agreements with the right holders. The model is trained on 20K hours of data; we believe that scaling the model on larger datasets can further improve its performance.
Mitigations: Vocals have been removed from the data source using corresponding tags, and then using a state-of-the-art music source separation method, namely using the open source Hybrid Transformer for Music Source Separation (HT-Demucs).
Limitations:
* The model is not able to generate realistic vocals.
* The model has been trained with English descriptions and will not perform as well in other languages.
* The model does not perform equally well for all music styles and cultures.
* The model sometimes generates the end of a song, collapsing to silence.
* It is sometimes difficult to assess what types of text descriptions provide the best generations. Prompt engineering may be required to obtain satisfying results.
Biases: The data source potentially lacks diversity, and not all music cultures are equally represented in the dataset. The model may not perform equally well on the wide variety of music genres that exist. The generated samples from the model will reflect the biases from the training data. Further work on this model should include methods for balanced and just representations of cultures, for example, by scaling the training data to be both diverse and inclusive.
Risks and harms: Biases and limitations of the model may lead to generation of samples that may be considered biased, inappropriate or offensive. We believe that providing the code to reproduce the research and train new models will allow the application to be broadened to new and more representative data.
Use cases: Users must be aware of the biases, limitations and risks of the model. MusicGen is a model developed for artificial intelligence research on controllable music generation. As such, it should not be used for downstream applications without further investigation and mitigation of risks.
| [] | [
"TAGS\n#musicgen #arxiv-2306.05284 #license-cc-by-nc-4.0 #region-us \n"
] | [
29
] | [
"passage: TAGS\n#musicgen #arxiv-2306.05284 #license-cc-by-nc-4.0 #region-us \n"
] | [
-0.04730624333024025,
0.1222345158457756,
-0.00732751889154315,
0.01888313516974449,
-0.02392328530550003,
0.015525449067354202,
0.12966634333133698,
-0.0021502564195543528,
0.09571061283349991,
0.06880836188793182,
0.12838217616081238,
0.11396721750497818,
-0.017020421102643013,
-0.09538406133651733,
-0.0071894945576786995,
-0.06961202621459961,
0.02058430016040802,
0.03680171072483063,
0.0680195689201355,
0.025407571345567703,
-0.030589841306209564,
0.015801794826984406,
0.007780582644045353,
-0.014048565179109573,
-0.1872199922800064,
0.0027986313216388226,
0.031053023412823677,
0.01715378649532795,
0.03243555873632431,
0.010068478062748909,
0.01557011529803276,
0.13273075222969055,
-0.0056776502169668674,
-0.10099317878484726,
-0.002532285638153553,
-0.05655774846673012,
-0.15300597250461578,
0.01306457445025444,
0.1536966860294342,
-0.02569960057735443,
0.07348582148551941,
0.14320158958435059,
-0.11396821588277817,
0.08339453488588333,
-0.1813209056854248,
-0.13216978311538696,
-0.13389632105827332,
0.0008148907800205052,
0.06327427923679352,
0.0870073139667511,
0.027843549847602844,
0.07532570511102676,
-0.08750970661640167,
-0.004718206822872162,
0.1464824229478836,
-0.3144402503967285,
0.029708342626690865,
0.26885873079299927,
0.10702823847532272,
0.06439269334077835,
-0.06898123025894165,
0.1488206386566162,
0.09703632444143295,
-0.011881455779075623,
-0.010600138455629349,
-0.10867403447628021,
-0.05211896449327469,
0.07100342959165573,
0.015385733917355537,
-0.030418558046221733,
0.3210059106349945,
0.03464809060096741,
-0.03655374050140381,
0.14056599140167236,
0.05632728710770607,
-0.13005773723125458,
-0.026288704946637154,
0.034559715539216995,
0.09234542399644852,
0.11316286772489548,
-0.015814755111932755,
-0.03025834821164608,
-0.10716719925403595,
-0.03788240998983383,
-0.11498414725065231,
-0.0962715670466423,
-0.08251501619815826,
0.023489048704504967,
0.011932640336453915,
-0.078562431037426,
-0.20471300184726715,
-0.011328932829201221,
-0.017057372257113457,
-0.018524060025811195,
0.08481349050998688,
0.02580631896853447,
-0.07608138024806976,
0.12695825099945068,
0.08704017847776413,
0.1052466332912445,
-0.03354410454630852,
-0.06387998908758163,
-0.0687837079167366,
0.17177057266235352,
0.03713790327310562,
-0.1408907175064087,
0.05310281738638878,
0.07666409015655518,
0.05411635711789131,
-0.10180699080228806,
0.08409108221530914,
-0.04526088759303093,
-0.21553324162960052,
-0.008071397431194782,
-0.04634437710046768,
0.06354354321956635,
-0.04944882169365883,
-0.13651537895202637,
-0.12884457409381866,
0.047097038477659225,
0.24804416298866272,
0.06514883786439896,
0.020979885011911392,
-0.0009329075110144913,
0.07969990372657776,
-0.0606146864593029,
-0.06496956199407578,
0.11915431916713715,
0.10708149522542953,
0.09557265043258667,
-0.13072387874126434,
-0.006901143118739128,
0.04874339699745178,
0.03189154341816902,
0.1989644467830658,
-0.08205658197402954,
0.1034950241446495,
-0.1038874015212059,
0.0515938438475132,
-0.01356692984700203,
0.0074110389687120914,
-0.016319643706083298,
-0.10055407136678696,
0.05554848164319992,
0.02266613580286503,
0.029065681621432304,
-0.11517124623060226,
-0.14578506350517273,
-0.1087900921702385,
0.10869652032852173,
-0.07302206754684448,
0.059957269579172134,
-0.17915573716163635,
0.04559025913476944,
-0.12057144194841385,
0.03356679901480675,
-0.03389650210738182,
-0.16080069541931152,
-0.10499082505702972,
0.011286157183349133,
-0.005845268256962299,
-0.04576990008354187,
-0.09583716094493866,
0.04597596079111099,
-0.008304125629365444,
0.059206772595644,
-0.12140912562608719,
-0.07045894116163254,
0.042570505291223526,
-0.06481083482503891,
-0.12599138915538788,
0.08038264513015747,
0.07107523828744888,
-0.08749894797801971,
0.00021124829072505236,
0.2864861786365509,
-0.01971265859901905,
-0.15895934402942657,
-0.05301236733794212,
0.15539442002773285,
-0.05527004599571228,
-0.2458006590604782,
0.101624496281147,
-0.1402263194322586,
-0.09557534009218216,
-0.044649600982666016,
0.017176995053887367,
0.03848453611135483,
0.007227159105241299,
0.005613014567643404,
0.002912123454734683,
-0.023690611124038696,
0.010782195255160332,
0.04125474393367767,
0.008590620942413807,
-0.05498763918876648,
0.0379500612616539,
0.02491169236600399,
0.028836572542786598,
0.10562705993652344,
0.035923413932323456,
-0.04793418198823929,
0.09035959094762802,
-0.07942216098308563,
-0.021224426105618477,
-0.07534100115299225,
0.0906192809343338,
0.0073215593583881855,
0.010660977102816105,
0.16631151735782623,
0.010944992303848267,
-0.04132245481014252,
-0.07282470166683197,
0.02630010060966015,
0.011581570841372013,
-0.007432958576828241,
0.007989708334207535,
0.017708564177155495,
-0.14829614758491516,
0.07154764980077744,
0.02680297940969467,
-0.01163289975374937,
-0.080633245408535,
-0.1267615407705307,
0.21937550604343414,
-0.11178476363420486,
-0.04007546231150627,
0.021414967253804207,
0.07382603734731674,
0.05199819803237915,
-0.002606283640488982,
0.09972389042377472,
0.09138979017734528,
0.09459812939167023,
-0.04804905131459236,
0.22052714228630066,
-0.04002564772963524,
0.19081030786037445,
0.18116489052772522,
-0.12918642163276672,
0.035998933017253876,
-0.03436873108148575,
0.08057773113250732,
-0.03002581186592579,
0.038889240473508835,
-0.012038035318255424,
-0.04560796171426773,
-0.05546169355511665,
0.058482758700847626,
-0.12369146943092346,
0.07447278499603271,
0.0592278316617012,
-0.051715414971113205,
-0.03633000701665878,
0.03034791350364685,
0.1692998707294464,
-0.12297084182500839,
0.15617533028125763,
0.4202234447002411,
-0.016573915258049965,
0.19405287504196167,
-0.03132869675755501,
-0.0057214112021028996,
-0.10607632249593735,
-0.04380693659186363,
-0.01065646018832922,
0.1752137541770935,
-0.003287180792540312,
0.03864211216568947,
0.0143921859562397,
0.016290057450532913,
0.02092364802956581,
-0.1841442883014679,
-0.150733083486557,
-0.01932668872177601,
-0.019554203376173973,
-0.22246064245700836,
0.10450049489736557,
-0.08727529644966125,
-0.03226887434720993,
-0.002303906949236989,
-0.1638953685760498,
0.1595889776945114,
-0.009443935938179493,
-0.11412853002548218,
-0.007952991873025894,
-0.23090006411075592,
-0.11889542639255524,
-0.18211305141448975,
-0.08043541759252548,
-0.07030301541090012,
0.04120133817195892,
0.07210791110992432,
-0.05255449563264847,
-0.005738064646720886,
0.014429696835577488,
-0.0862162858247757,
-0.11560258269309998,
-0.12151492387056351,
0.0004971930757164955,
0.08198188990354538,
-0.015496086329221725,
-0.0053633470088243484,
-0.013374297879636288,
0.0023429214488714933,
-0.039647992700338364,
0.0352015495300293,
-0.12231037765741348,
0.17338046431541443,
0.1662747859954834,
0.175307497382164,
0.007515204604715109,
-0.007861899212002754,
0.07592906802892685,
-0.08545459061861038,
-0.03038123808801174,
0.07643447071313858,
0.012206235900521278,
0.009642720222473145,
0.08710552752017975,
0.1649869978427887,
-0.03153720125555992,
-0.02972274087369442,
-0.0061291279271245,
-0.09933576732873917,
-0.20109589397907257,
-0.07310599833726883,
-0.09877949208021164,
0.11936957389116287,
0.007494574412703514,
0.041198790073394775,
0.21515119075775146,
-0.011785588227212429,
0.0402132086455822,
0.06726324558258057,
-0.0024743538815528154,
0.00885170791298151,
0.14487376809120178,
-0.020253991708159447,
-0.004568099044263363,
-0.07346756756305695,
0.09540930390357971,
0.17975574731826782,
0.11009834706783295,
0.15974460542201996,
0.2985908091068268,
0.20516343414783478,
0.1024635061621666,
0.17613476514816284,
0.10866685211658478,
0.02744891308248043,
0.056815117597579956,
-0.0024019975680857897,
0.0031410728115588427,
-0.10417752712965012,
0.09433295577764511,
0.027516914531588554,
0.0024934858083724976,
-0.1875087320804596,
0.019549792632460594,
-0.20190934836864471,
-0.10072355717420578,
-0.057583991438150406,
0.09559135884046555,
-0.0657796710729599,
0.10183531790971756,
0.03861163556575775,
0.1657450795173645,
0.01774180866777897,
0.19140523672103882,
-0.007933326065540314,
0.032351184636354446,
0.03150281682610512,
0.009260539896786213,
0.09582583606243134,
0.008589163422584534,
0.00023525457072537392,
0.038488343358039856,
-0.13157899677753448,
0.03777224197983742,
0.0436164028942585,
-0.06785762310028076,
0.18680916726589203,
0.055728547275066376,
-0.044394709169864655,
0.08969631791114807,
-0.06904979050159454,
0.03417978808283806,
0.20786841213703156,
0.17597219347953796,
0.05755383521318436,
-0.2462792843580246,
-0.10968788713216782,
-0.046790461987257004,
-0.03350666165351868,
0.11528167128562927,
0.03323180228471756,
-0.14892061054706573,
-0.02274731732904911,
0.014169499278068542,
0.027028901502490044,
0.07819890975952148,
-0.1267460137605667,
-0.004543073941022158,
0.03068411350250244,
0.13698457181453705,
0.058584120124578476,
-0.02398797869682312,
0.00039172329707071185,
-0.1028440073132515,
0.0004403663333505392,
-0.14368969202041626,
0.05606884881854057,
-0.027360066771507263,
-0.21786901354789734,
0.08364379405975342,
0.035694677382707596,
0.04231981933116913,
-0.028467310592532158,
-0.11377120018005371,
-0.1571967601776123,
-0.1368008255958557,
0.1422162503004074,
-0.04866068810224533,
-0.015563899651169777,
-0.018956800922751427,
0.15165874361991882,
0.0032058963552117348,
0.0899471566081047,
0.0005699917091988027,
0.10768875479698181,
0.0499236024916172,
-0.06566379964351654,
0.1882939338684082,
-0.1972305178642273,
0.01754096709191799,
-0.039420947432518005,
-0.02070976421236992,
0.034527737647295,
0.07685066014528275,
-0.11234887689352036,
0.09719941765069962,
0.3668507933616638,
-0.040367260575294495,
0.19693998992443085,
0.27268144488334656,
-0.13365870714187622,
-0.26291999220848083,
-0.01930752582848072,
-0.1746962070465088,
-0.08112744987010956,
0.10633481293916702,
-0.2736150324344635,
-0.019744858145713806,
0.1722395420074463,
-0.0884082242846489,
0.23465067148208618,
-0.28573328256607056,
-0.01922009512782097,
0.04679132252931595,
-0.1050039529800415,
0.34608349204063416,
-0.12083019316196442,
-0.13340236246585846,
-0.06783048808574677,
-0.20569531619548798,
0.15763741731643677,
0.0042429035529494286,
0.0769573301076889,
0.03668200597167015,
-0.08719931542873383,
0.01094777137041092,
0.018281014636158943,
0.09841310232877731,
0.02168481983244419,
0.10428040474653244,
-0.012165983207523823,
-0.079494409263134,
0.20063669979572296,
0.038173552602529526,
-0.10975692421197891,
-0.11388155817985535,
-0.08345156908035278,
-0.08760868012905121,
0.0059560928493738174,
-0.06636178493499756,
0.05497730150818825,
-0.049013350158929825,
-0.04104588180780411,
-0.07398151606321335,
0.05777516961097717,
-0.12976039946079254,
-0.03001842088997364,
0.3046223819255829,
-0.05153300240635872,
-0.052315108478069305,
0.08202289789915085,
0.008963096886873245,
-0.03390410915017128,
-0.15322472155094147,
0.019476069137454033,
-0.10444412380456924,
0.08256867527961731,
-0.15343305468559265,
0.03317154943943024,
0.04885254055261612,
0.052003562450408936,
-0.014078893698751926,
0.047234032303094864,
-0.09577789902687073,
0.038683418184518814,
0.1520538628101349,
-0.1640990972518921,
-0.011834354139864445,
-0.05120491981506348,
0.0358256958425045,
0.24102765321731567,
-0.024212472140789032,
0.06680353730916977,
-0.009282858110964298,
-0.024364523589611053,
0.06795865297317505,
0.009199203923344612,
-0.1329561471939087,
-0.10432726889848709,
0.036276187747716904,
-0.06511572003364563,
-0.09882580488920212,
0.18236276507377625,
0.05175802856683731,
-0.11555556207895279,
-0.05094766989350319,
0.039736948907375336,
-0.041827354580163956,
-0.07767761498689651,
-0.10001159459352493,
-0.038775987923145294,
-0.2502450942993164,
-0.09652305394411087,
0.060362979769706726,
-0.03857302665710449,
-0.054274432361125946,
0.11783808469772339,
0.048634450882673264,
0.07094722241163254,
0.06546217948198318,
-0.013430851511657238,
0.038212280720472336,
0.0021097424905747175,
-0.1876930445432663,
0.07079129666090012,
-0.08536487817764282,
-0.17130964994430542,
0.01453955378383398,
0.0181496012955904,
-0.07042719423770905,
-0.037676870822906494,
-0.0765867531299591,
0.05755497142672539,
-0.03194094076752663,
0.019520161673426628,
-0.06465290486812592,
-0.058302246034145355,
-0.008817066438496113,
0.06934353709220886,
-0.07710660994052887,
-0.003180893138051033,
-0.0720154270529747,
0.007661283016204834,
0.03251844644546509,
0.07474738359451294,
-0.056435104459524155,
-0.04475533217191696,
0.058056432753801346,
0.0354609377682209,
0.07684573531150818,
0.0578417032957077,
0.036662157624959946,
0.0648004412651062,
-0.13381071388721466,
-0.04991208016872406,
0.11274293065071106,
0.015657344833016396,
-0.0035622587893158197,
-0.03537289798259735,
-0.059397220611572266,
0.028047334402799606,
-0.07172991335391998,
0.0294199138879776,
-0.05621819943189621,
-0.12675052881240845,
-0.13339421153068542,
-0.036701641976833344,
-0.1303071528673172,
-0.004857305437326431,
-0.027258293703198433,
0.2225690484046936,
0.061460431665182114,
0.13976718485355377,
0.0419587567448616,
-0.01720474101603031,
0.050646379590034485,
-0.0226820670068264,
0.0008477877709083259,
-0.13768237829208374,
-0.2128971666097641,
-0.01298598013818264,
-0.06903139501810074,
-0.0549936406314373,
0.17435336112976074,
0.0041094315238296986,
-0.21026518940925598,
0.017170419916510582,
0.16671352088451385,
-0.030443953350186348,
-0.0063865031115710735,
0.282696932554245,
0.005028031766414642,
-0.06739497184753418,
-0.10952381789684296,
0.09140479564666748,
-0.04571953043341637,
-0.016891393810510635,
-0.07020710408687592,
0.03286966308951378,
0.12650936841964722,
-0.012529370374977589,
0.17775513231754303,
-0.09680505096912384,
-0.10307404398918152,
0.06947892904281616,
0.0791810005903244,
0.11039838194847107,
0.03547022119164467,
0.15618032217025757,
0.12558408081531525,
-0.006835190113633871,
0.039355047047138214,
-0.07698342949151993,
-0.006773431319743395,
-0.12413491308689117,
-0.16858501732349396,
-0.03233916312456131,
-0.16591762006282806,
0.058140456676483154,
-0.06439588963985443,
0.11311406642198563,
0.15903615951538086,
-0.02932186983525753,
-0.02632588893175125,
-0.07628897577524185,
-0.1540917009115219,
-0.07778467237949371,
0.08707360178232193,
-0.06611846387386322,
-0.034814562648534775,
-0.07597421854734421,
-0.05675124749541283,
0.10663141310214996,
-0.20632164180278778,
0.006702458020299673,
-0.01357443630695343,
0.11140301078557968,
-0.0016299323178827763,
-0.10153548419475555,
-0.05365889146924019,
-0.03134121000766754,
0.09103363752365112,
-0.028803691267967224,
0.16660919785499573,
0.054051440209150314,
0.02941690757870674,
0.07062575221061707,
0.02651275135576725,
0.026864709332585335,
0.11431360989809036,
0.05006567761301994,
0.007024100981652737,
-0.10755899548530579,
0.10530158877372742,
-0.0058225952088832855,
0.025657055899500847,
0.08215788751840591,
0.14269743859767914,
0.13058960437774658,
-0.07855486124753952,
-0.023199735209345818,
-0.003023202996701002,
0.010603989474475384,
0.004536523018032312,
0.10377832502126694,
0.042067211121320724,
0.21596363186836243,
-0.03797941654920578,
-0.07168212532997131,
-0.1761191189289093,
0.04833276942372322,
-0.03820537403225899,
-0.00011509566684253514,
0.08980841934680939,
-0.11710400134325027,
-0.012244394980370998,
0.13679534196853638,
-0.15828749537467957,
0.16873377561569214,
0.10867184400558472,
-0.043263573199510574,
0.008679107762873173,
-0.055608563125133514,
0.0764220803976059,
0.056466735899448395,
0.07959350198507309,
-0.1664094179868698,
-0.10083386301994324,
-0.0674716904759407,
0.016512397676706314,
-0.26593855023384094,
-0.08122813701629639,
-0.05446707457304001,
0.042305294424295425,
0.1203756108880043,
0.012885603122413158,
0.1526641845703125,
-0.01603316143155098,
0.1139930933713913,
-0.061035819351673126,
0.0908263549208641,
0.03595976531505585,
-0.10413466393947601,
-0.09313851594924927,
-0.020146731287240982,
0.0027843390125781298,
0.11751427501440048,
0.06852374225854874,
0.122538261115551,
0.04982532188296318,
0.1327521800994873,
-0.09433911740779877,
-0.01869765855371952,
-0.035917337983846664,
-0.16942048072814941,
0.03835958242416382,
-0.11842843145132065,
0.026742396876215935,
-0.16047079861164093,
0.006418270990252495,
-0.05072637274861336,
0.13174648582935333,
-0.08268627524375916,
0.023974545300006866,
0.11111302673816681,
0.04309593141078949,
0.19704194366931915,
-0.005508831702172756,
-0.08083418011665344,
0.02376655861735344,
-0.10576361417770386,
0.10714881867170334,
-0.13135690987110138,
0.014963771216571331,
0.05396945774555206,
-0.049359217286109924,
0.0073593249544501305,
-0.1379992961883545,
0.05801932141184807,
-0.07022591680288315,
-0.03179892897605896,
-0.13033318519592285
] |
null | null | transformers |
# Uploaded model
- **Developed by:** ISTNetworks
- **License:** apache-2.0
- **Finetuned from model:** mistral-7b-instruct-v0.2-bnb-4bit
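
Because this checkpoint is published as a GGUF file, one way to try it locally is through a GGUF-capable runtime such as llama-cpp-python. The sketch below is only an illustration: the repository id comes from this record's metadata, but the GGUF filename, context length, and prompt are placeholder assumptions, and other GGUF runtimes (llama.cpp, text-generation-webui, etc.) work just as well.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Hypothetical filename: check the repository's file list for the actual GGUF artifact.
gguf_path = hf_hub_download(
    repo_id="ISTNetworks/Mistral-v2-updated",
    filename="mistral-v2-updated.Q4_K_M.gguf",  # placeholder name, not confirmed by this card
)

# Load the quantized model; n_ctx is an assumed context window.
llm = Llama(model_path=gguf_path, n_ctx=4096)

# Mistral-Instruct models expect the [INST] ... [/INST] chat format.
output = llm("[INST] Summarize what a GGUF file is. [/INST]", max_tokens=128)
print(output["choices"][0]["text"])
```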
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "mistral", "gguf"], "base_model": "mistral-7b-instruct-v0.2-bnb-4bit"} | null | ISTNetworks/Mistral-v2-updated | [
"transformers",
"gguf",
"mistral",
"text-generation-inference",
"en",
"base_model:mistral-7b-instruct-v0.2-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:41:25+00:00 | [] | [
"en"
] | TAGS
#transformers #gguf #mistral #text-generation-inference #en #base_model-mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
|
# Uploaded model
- Developed by: ISTNetworks
- License: apache-2.0
- Finetuned from model: mistral-7b-instruct-v0.2-bnb-4bit
| [
"# Uploaded model\n\n- Developed by: ISTNetworks\n- License: apache-2.0\n- Finetuned from model :mistral-7b-instruct-v0.2-bnb-4bit"
] | [
"TAGS\n#transformers #gguf #mistral #text-generation-inference #en #base_model-mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Uploaded model\n\n- Developed by: ISTNetworks\n- License: apache-2.0\n- Finetuned from model :mistral-7b-instruct-v0.2-bnb-4bit"
] | [
62,
42
] | [
"passage: TAGS\n#transformers #gguf #mistral #text-generation-inference #en #base_model-mistral-7b-instruct-v0.2-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: ISTNetworks\n- License: apache-2.0\n- Finetuned from model :mistral-7b-instruct-v0.2-bnb-4bit"
] | [
-0.07833067327737808,
0.030179964378476143,
-0.0031195422634482384,
0.03758208081126213,
0.020747732371091843,
0.010282915085554123,
0.16694416105747223,
0.08758167177438736,
0.039449356496334076,
-0.05501214787364006,
0.15069681406021118,
0.07030293345451355,
0.021539974957704544,
0.014868666417896748,
-0.008310454897582531,
-0.12426923960447311,
0.10412770509719849,
0.038079340010881424,
-0.114505834877491,
0.053647976368665695,
0.1307458132505417,
0.00999356061220169,
0.0976155623793602,
-0.01101942639797926,
-0.09111632406711578,
0.031228622421622276,
0.004113089293241501,
-0.03259435296058655,
0.0327194519340992,
0.08168438076972961,
-0.03165645897388458,
0.033416420221328735,
-0.001621624338440597,
-0.1312558799982071,
0.01824803464114666,
0.02590722031891346,
-0.06353402882814407,
0.05892311781644821,
-0.006934526842087507,
0.04875292256474495,
0.10472486913204193,
0.007021634839475155,
-0.11055067181587219,
0.049643535166978836,
-0.04641907662153244,
-0.13362358510494232,
-0.08928341418504715,
0.16591091454029083,
0.09659253060817719,
0.06761696934700012,
0.04042300209403038,
0.03901723399758339,
-0.029809443280100822,
0.050274308770895004,
0.1569802463054657,
-0.25955623388290405,
-0.027159707620739937,
0.16673339903354645,
0.018631674349308014,
0.05889642611145973,
0.006516521796584129,
0.016735123470425606,
0.07105830311775208,
0.0066061317920684814,
-0.017318880185484886,
-0.07032693177461624,
0.09052470326423645,
0.08144974708557129,
-0.09594935923814774,
-0.049275949597358704,
0.29787886142730713,
0.08526211231946945,
-0.03188212588429451,
0.01729619689285755,
-0.06990981847047806,
0.0478685237467289,
-0.05601539462804794,
0.07012759149074554,
0.03728114068508148,
0.13388824462890625,
0.04253876581788063,
-0.09030770510435104,
-0.09200996160507202,
-0.06801491975784302,
-0.09620540589094162,
-0.026937365531921387,
0.005724903661757708,
0.11688009649515152,
-0.12129724770784378,
0.03492822125554085,
-0.15955165028572083,
-0.10097803920507431,
-0.07377248257398605,
-0.07642293721437454,
0.12115522474050522,
0.06679341942071915,
-0.08051658421754837,
0.06149293854832649,
0.1444810926914215,
0.24892470240592957,
0.04494442045688629,
0.06581660360097885,
0.016937868669629097,
0.08440613746643066,
-0.036869511008262634,
0.03017313964664936,
-0.12295784056186676,
-0.0754927545785904,
0.13593342900276184,
-0.025863802060484886,
0.07053665071725845,
0.0038705600891262293,
-0.12113188952207565,
-0.06238687410950661,
0.0156466756016016,
-0.009323128499090672,
0.09460382908582687,
0.07203877717256546,
0.02132687345147133,
-0.029191207140684128,
0.19380594789981842,
-0.03140149265527725,
-0.04761045426130295,
-0.0036732240114361048,
-0.035031773149967194,
0.0635952427983284,
0.11911286413669586,
0.05589238181710243,
-0.02755892649292946,
-0.09670893102884293,
-0.0499543696641922,
-0.02363610453903675,
-0.0089179128408432,
-0.006204256322234869,
0.08900260925292969,
-0.04399016126990318,
0.05241227522492409,
-0.12029236555099487,
-0.24227438867092133,
0.056848034262657166,
0.1353101134300232,
-0.02653539553284645,
-0.04584672674536705,
0.02806648053228855,
-0.033175401389598846,
0.05013104900717735,
-0.06651202589273453,
0.028453417122364044,
-0.0712611973285675,
-0.023250393569469452,
-0.039265893399715424,
0.038032498210668564,
-0.23658333718776703,
0.033413078635931015,
-0.0545157864689827,
0.04688580706715584,
-0.10585585981607437,
0.0353323295712471,
-0.0643605887889862,
0.12815362215042114,
-0.14843228459358215,
-0.01088632456958294,
-0.016221659258008003,
-0.01654689945280552,
0.0358366034924984,
0.14063720405101776,
-0.04562337324023247,
-0.004352529067546129,
0.10123402625322342,
-0.05352311581373215,
-0.18591749668121338,
0.09843846410512924,
0.028006544336676598,
0.03901440650224686,
0.047326114028692245,
0.13256889581680298,
0.07469702512025833,
-0.03535904362797737,
0.06709295511245728,
0.1695243865251541,
0.009036301635205746,
-0.1640942394733429,
0.07643724232912064,
-0.025295628234744072,
-0.08924353867769241,
0.042439334094524384,
-0.11144893616437912,
0.1180386170744896,
0.041824646294116974,
-0.05902095139026642,
-0.11330369114875793,
-0.08013826608657837,
-0.03703906014561653,
-0.039340194314718246,
0.04175150766968727,
-0.006852828897535801,
-0.026042118668556213,
0.02662467211484909,
0.08701232820749283,
-0.022471167147159576,
0.07846172899007797,
-0.06416859477758408,
0.05387714132666588,
-0.10452093929052353,
0.06392855942249298,
-0.07465241849422455,
0.07176176458597183,
-0.03289783000946045,
0.009516640566289425,
0.039718933403491974,
0.02607581950724125,
0.08650608360767365,
-0.05757474899291992,
-0.05963914841413498,
-0.011067100800573826,
0.08869482576847076,
0.024699613451957703,
-0.05285262688994408,
-0.17545005679130554,
-0.010557867586612701,
-0.0038111971225589514,
0.1363254189491272,
-0.0417771153151989,
0.033107493072748184,
0.01696310192346573,
0.05890043079853058,
-0.05474213883280754,
0.048924531787633896,
0.004857890773564577,
-0.0979544147849083,
-0.035678163170814514,
-0.044644370675086975,
0.07454657554626465,
0.05785613879561424,
-0.16636492311954498,
0.13573771715164185,
0.0101547846570611,
0.12458482384681702,
0.1785857379436493,
-0.08135601133108139,
0.12886828184127808,
-0.020990809425711632,
-0.025024257600307465,
-0.06240952014923096,
0.14914600551128387,
-0.05900100991129875,
0.022827081382274628,
-0.014292414300143719,
0.0807715356349945,
-0.07160238176584244,
-0.012521612457931042,
-0.00814136117696762,
-0.09867836534976959,
-0.04443345591425896,
0.015417816117405891,
0.03929905220866203,
-0.20374345779418945,
0.12099943310022354,
0.29550671577453613,
-0.08904391527175903,
0.08704298734664917,
-0.08350871503353119,
-0.03843506798148155,
0.004733315669000149,
0.014040290378034115,
-0.012514145113527775,
0.05584743991494179,
-0.21965652704238892,
0.01466388814151287,
0.06727053970098495,
0.04156458005309105,
0.07642959803342819,
-0.0859999805688858,
0.010954614728689194,
-0.010237092152237892,
-0.06006862595677376,
-0.0512426532804966,
0.023106513544917107,
-0.10386990010738373,
0.04736728221178055,
0.02004948817193508,
-0.029072774574160576,
0.09910505264997482,
-0.015330248512327671,
-0.12104111164808273,
0.13878072798252106,
-0.15933358669281006,
-0.07433638721704483,
-0.20006448030471802,
-0.117007777094841,
-0.13614188134670258,
0.004196095280349255,
0.07199214398860931,
-0.06521615386009216,
-0.05303346738219261,
-0.09453579783439636,
-0.03159031644463539,
-0.004664331208914518,
0.0200020931661129,
0.1018829420208931,
0.003005107631906867,
0.062454842031002045,
-0.13836653530597687,
-0.030349357053637505,
0.02690938301384449,
-0.03685837239027023,
0.0381292961537838,
-0.14610512554645538,
0.04667109623551369,
0.09169553220272064,
0.09798050671815872,
0.014049356803297997,
0.03405250981450081,
0.2533617317676544,
0.0005083144642412663,
0.08591817319393158,
0.21878889203071594,
0.026831507682800293,
0.06634604185819626,
0.1246236190199852,
0.015188369899988174,
-0.04381740838289261,
-0.004451994318515062,
-0.04050036892294884,
-0.07386624068021774,
-0.17816026508808136,
-0.024423077702522278,
-0.09238959103822708,
0.03260310739278793,
0.042277973145246506,
0.06812620162963867,
0.10110706835985184,
0.1447603404521942,
-0.0896952822804451,
0.1259884089231491,
0.061389751732349396,
0.062493372708559036,
0.08388926088809967,
0.012898899614810944,
0.06139650568366051,
-0.12506559491157532,
0.09742426127195358,
0.14299276471138,
0.12008856236934662,
0.15137340128421783,
0.0543915331363678,
0.16068540513515472,
0.09108677506446838,
0.17674028873443604,
0.030110137537121773,
0.12877662479877472,
-0.06461097300052643,
0.0013197371736168861,
-0.04565194249153137,
-0.05981406569480896,
-0.032512493431568146,
0.10660220682621002,
-0.10817302018404007,
-0.048418935388326645,
0.06312461197376251,
-0.04286294803023338,
0.06943611055612564,
0.20951691269874573,
0.030255461111664772,
-0.1798541396856308,
-0.0861114114522934,
0.13065612316131592,
0.04075935110449791,
-0.008225508034229279,
0.04197646677494049,
-0.038117434829473495,
-0.016699407249689102,
0.10331510007381439,
-0.025623155757784843,
0.12600648403167725,
0.0833408460021019,
0.012159525416791439,
0.02764711156487465,
0.0234901811927557,
0.05509696528315544,
0.08718126267194748,
-0.2849828004837036,
0.10771077126264572,
0.027945157140493393,
-0.00688119838014245,
-0.08542091399431229,
0.057180166244506836,
0.07682136446237564,
0.19472871720790863,
0.06666049361228943,
0.048041947185993195,
-0.11003833264112473,
0.0565827339887619,
-0.05503203347325325,
0.051952749490737915,
-0.014678859151899815,
0.00912525411695242,
0.002241935348138213,
-0.04860091954469681,
0.007810346316546202,
0.05786491930484772,
0.13853155076503754,
-0.13269370794296265,
-0.1405760645866394,
0.04094994440674782,
0.12208767980337143,
-0.036320846527814865,
-0.1039021834731102,
0.05490279197692871,
-0.047750525176525116,
0.15938538312911987,
-0.12185807526111603,
-0.1106586903333664,
-0.08066044747829437,
-0.05102925002574921,
0.11419107019901276,
-0.07358898967504501,
0.04939911141991615,
-0.07064854353666306,
-0.0019088075496256351,
-0.003219734178856015,
-0.21706289052963257,
0.09556306153535843,
-0.1507246047258377,
-0.008909281343221664,
0.021429672837257385,
0.049368564039468765,
-0.042812880128622055,
0.007677489425987005,
0.010286234319210052,
-0.023430990055203438,
-0.10291904211044312,
-0.14110928773880005,
-0.037933532148599625,
0.09411854296922684,
-0.025363553315401077,
-0.016253234818577766,
-0.01786796748638153,
-0.01684654876589775,
0.03145992383360863,
-0.046511344611644745,
0.06536270678043365,
0.11554660648107529,
-0.03400549665093422,
0.05831177160143852,
0.25802838802337646,
-0.06629323214292526,
-0.25286367535591125,
-0.09080906212329865,
-0.09596962481737137,
-0.06857989728450775,
-0.10667027533054352,
-0.12429752945899963,
0.16906839609146118,
0.06076522171497345,
-0.07176490873098373,
0.09088002145290375,
-0.2117520272731781,
-0.08612006157636642,
0.13337312638759613,
0.033127401024103165,
0.296651154756546,
-0.14955607056617737,
-0.07217267900705338,
-0.14637991786003113,
-0.2935554087162018,
0.01472080871462822,
-0.22081619501113892,
0.04507938027381897,
-0.021511398255825043,
0.019694479182362556,
-0.059766724705696106,
-0.05831843242049217,
0.16470623016357422,
0.05519416928291321,
0.05744800716638565,
-0.11486716568470001,
0.10606122761964798,
0.18210630118846893,
-0.06263332813978195,
0.11615189164876938,
-0.1922641545534134,
0.03838714584708214,
-0.07061734795570374,
0.005821386817842722,
-0.043709203600883484,
0.01019318588078022,
-0.01679994724690914,
-0.06997569650411606,
-0.0637209340929985,
-0.01893005333840847,
0.0275162011384964,
0.01185623835772276,
0.18878215551376343,
0.027423929423093796,
-0.05764632299542427,
0.2201598435640335,
0.040079209953546524,
-0.2267569899559021,
-0.006377651356160641,
-0.053009890019893646,
-0.04303591698408127,
0.08263885229825974,
-0.2714010179042816,
0.049374621361494064,
0.032583385705947876,
-0.04832750931382179,
0.05672036111354828,
0.03950626030564308,
-0.02395172417163849,
-0.004626730922609568,
0.08387007564306259,
-0.09588061273097992,
-0.06958551704883575,
-0.02443230152130127,
0.05941992253065109,
-0.028183115646243095,
0.12428253144025803,
0.1709626317024231,
-0.010037609376013279,
0.017137499526143074,
0.0021171020343899727,
0.04929155856370926,
-0.15238407254219055,
0.051570966839790344,
0.066446952521801,
-0.006209700368344784,
-0.10463924705982208,
0.14339441061019897,
-0.021967343986034393,
-0.00603890186175704,
0.012685360386967659,
0.02676716074347496,
-0.16177986562252045,
-0.11817946285009384,
-0.058860450983047485,
0.00027005429728887975,
-0.134530171751976,
-0.10015637427568436,
-0.0028746065218001604,
-0.09128855168819427,
0.030661316588521004,
-0.00577292637899518,
0.0722249299287796,
0.012576109729707241,
0.020635411143302917,
-0.03485964983701706,
0.03168293833732605,
-0.016849717125296593,
-0.036508798599243164,
0.059990935027599335,
-0.11051065474748611,
-0.17534837126731873,
-0.04088456928730011,
0.045214295387268066,
-0.01787160523235798,
0.024571605026721954,
-0.07839924842119217,
0.022934528067708015,
-0.29210567474365234,
0.012071788311004639,
-0.1413850486278534,
0.0061164735816419125,
-0.0049425894394516945,
-0.05780988186597824,
-0.03827362880110741,
0.09163159877061844,
-0.06549161672592163,
-0.04027058184146881,
-0.03928011283278465,
0.0492449514567852,
-0.08842944353818893,
-0.03272565081715584,
0.05885990336537361,
-0.04854704812169075,
0.10862036794424057,
0.11322146654129028,
-0.09478869289159775,
0.09028307348489761,
-0.14003117382526398,
-0.04945788159966469,
0.030554072931408882,
0.0264886524528265,
-0.00309395813383162,
0.0015046672197058797,
-0.031653840094804764,
0.07223854213953018,
0.004003795329481363,
-0.025946129113435745,
0.04568292945623398,
-0.0571805015206337,
-0.10749746859073639,
-0.08854006230831146,
0.004668135195970535,
-0.07009996473789215,
-0.030041445046663284,
0.19721847772598267,
0.06495411694049835,
0.13864395022392273,
-0.002735202433541417,
-0.05068928375840187,
-0.096683569252491,
0.029961267486214638,
0.04336260259151459,
-0.0964503139257431,
-0.06842764467000961,
-0.0817301869392395,
-0.036641597747802734,
-0.03419909626245499,
0.206529438495636,
-0.12435091286897659,
-0.10938359051942825,
0.022254636511206627,
0.016031581908464432,
0.10321154445409775,
0.0014925171853974462,
0.3468064069747925,
0.03933561593294144,
0.022856662049889565,
-0.09363602846860886,
0.047604434192180634,
0.06937824189662933,
0.03597578778862953,
-0.03606447950005531,
0.12604741752147675,
0.04269824177026749,
0.1667296439409256,
0.0034530824050307274,
0.03962082415819168,
0.021882440894842148,
0.06597474962472916,
-0.002687937580049038,
0.07704564929008484,
-0.013936486095190048,
0.13355772197246552,
0.21840932965278625,
-0.1051851436495781,
-0.02128727175295353,
-0.030489608645439148,
-0.024635441601276398,
-0.07721492648124695,
-0.22323456406593323,
-0.07484541088342667,
-0.20571687817573547,
-0.005967402830719948,
-0.08399377018213272,
0.0007606060244143009,
0.11164150387048721,
0.050739139318466187,
-0.014208256267011166,
0.022801315411925316,
-0.017315847799181938,
-0.053263574838638306,
0.04041019082069397,
-0.032795533537864685,
-0.10745576024055481,
0.11516515165567398,
-0.0385301448404789,
0.05208737403154373,
-0.06270162016153336,
0.005549998953938484,
0.04532736912369728,
0.05928920954465866,
0.09588228166103363,
-0.02765815146267414,
-0.06833628565073013,
-0.07408914715051651,
0.04783301800489426,
-0.028874997049570084,
0.13199397921562195,
0.011632518842816353,
-0.0031131296418607235,
0.05873437225818634,
0.09344884753227234,
-0.08080065995454788,
-0.14836856722831726,
-0.11060851812362671,
0.01111240778118372,
-0.05845346301794052,
0.028087660670280457,
-0.030618082731962204,
-0.024067237973213196,
-0.0023790234699845314,
0.27275437116622925,
0.2053021490573883,
-0.06611202657222748,
-0.009317859075963497,
-0.011430752463638783,
0.005844563245773315,
-0.025142261758446693,
0.12254209816455841,
0.12285924702882767,
0.033489614725112915,
-0.033000197261571884,
-0.023272326216101646,
-0.02604782208800316,
-0.042159613221883774,
-0.12519054114818573,
0.0642382949590683,
-0.054673440754413605,
-0.09018687903881073,
-0.002136925933882594,
0.050067368894815445,
-0.060657575726509094,
-0.026308823376893997,
-0.02842385321855545,
0.021759850904345512,
0.013278300873935223,
-0.05262568220496178,
0.09094861894845963,
0.058719612658023834,
0.03066461905837059,
-0.06467282772064209,
0.04154489189386368,
0.11591362953186035,
-0.025023475289344788,
-0.15970532596111298,
-0.04766184836626053,
0.08629895746707916,
0.0035900487564504147,
0.12624108791351318,
0.023059071972966194,
0.020473845303058624,
0.10686510056257248,
0.016648754477500916,
-0.16063617169857025,
0.06889963150024414,
-0.008789384737610817,
-0.02794734016060829,
-0.056509703397750854,
-0.05881015956401825,
-0.06299818307161331,
0.0060066827572882175,
0.025448154658079147,
-0.03917188197374344,
-0.012929463759064674,
0.09512067586183548,
-0.03601851686835289,
-0.09513665735721588,
-0.022456256672739983,
-0.12128229439258575,
0.10432719439268112,
0.0026254148688167334,
-0.07878683507442474,
-0.05989369377493858,
-0.0618683323264122,
0.051939088851213455,
0.016765736043453217,
-0.08765105158090591,
-0.021556099876761436,
-0.03565937280654907,
-0.009594958275556564,
0.055489398539066315,
0.035396333783864975,
-0.09027501195669174,
-0.046990059316158295,
-0.0626661479473114,
-0.0008578128181397915,
-0.058447156101465225,
0.07356765866279602,
0.15000446140766144,
0.0018204397056251764,
-0.020214799791574478,
-0.09491802752017975,
-0.03329450264573097,
0.048932526260614395,
-0.03741895779967308,
-0.1329425722360611
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
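
The card itself provides no snippet yet, but since this record is tagged as a 🤗 Transformers BERT checkpoint for feature extraction, a generic starting point might look like the sketch below. The repository id is taken from this record's metadata; the sample sentence and the mean-pooling step are illustrative assumptions, and a checkpoint with a custom head may need further adjustments.

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "tommymarto/LernnaviBERT_baseline_students_answers_384_lstm_seq_len_30"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

# Encode a sample input and extract token-level hidden states.
inputs = tokenizer("A sample student answer.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool over the sequence dimension to get one vector per input (assumed pooling choice).
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)
```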
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | feature-extraction | tommymarto/LernnaviBERT_baseline_students_answers_384_lstm_seq_len_30 | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:41:35+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
39,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.052746038883924484,
0.20255789160728455,
-0.0045078229159116745,
0.0248473659157753,
0.10497838258743286,
0.00675728265196085,
0.06521498411893845,
0.11486967653036118,
-0.0023755673319101334,
0.12028469145298004,
0.027631845325231552,
0.08119397610425949,
0.12110675126314163,
0.15393014252185822,
0.005160121712833643,
-0.24253977835178375,
0.05344875901937485,
-0.09366832673549652,
0.004077504388988018,
0.11452110856771469,
0.1343945860862732,
-0.10780399292707443,
0.08976872265338898,
-0.00683097867295146,
-0.01712046191096306,
-0.015751034021377563,
-0.07134060561656952,
-0.06668227165937424,
0.05541034787893295,
0.07649129629135132,
0.0725555345416069,
0.010986946523189545,
0.07830587029457092,
-0.2806258797645569,
0.014425364322960377,
0.08005264401435852,
0.0010765197221189737,
0.06795802712440491,
0.08151742070913315,
-0.06789936870336533,
0.1251654475927353,
-0.0605485662817955,
0.14059753715991974,
0.07639917731285095,
-0.08928128331899643,
-0.19590547680854797,
-0.06669555604457855,
0.07481247186660767,
0.129872128367424,
0.05026249960064888,
-0.02990107797086239,
0.1371748298406601,
-0.09688840061426163,
0.00786701962351799,
0.12302009761333466,
-0.07360870391130447,
-0.05524582043290138,
0.031063849106431007,
0.10805318504571915,
0.09297362715005875,
-0.11762315034866333,
-0.008467874489724636,
0.029582185670733452,
0.022175652906298637,
0.08627551048994064,
0.015828849747776985,
0.1525639444589615,
0.041341137140989304,
-0.14141254127025604,
-0.0526716373860836,
0.09056255221366882,
0.03701045364141464,
-0.050960201770067215,
-0.23367193341255188,
-0.026245610788464546,
-0.012442239560186863,
-0.03079850971698761,
-0.04234880208969116,
0.053594592958688736,
-0.03630254790186882,
0.07596245408058167,
-0.007196845952421427,
-0.07732249796390533,
-0.031211229041218758,
0.05230424553155899,
0.06785056740045547,
0.018615471199154854,
-0.006994647905230522,
0.019442738965153694,
0.11387838423252106,
0.07708574831485748,
-0.13029205799102783,
-0.07214002311229706,
-0.0739525631070137,
-0.09558356553316116,
-0.04332297295331955,
0.03707554563879967,
0.07106684148311615,
0.04390906170010567,
0.20283061265945435,
-0.017690327018499374,
0.046562306582927704,
0.0476159006357193,
0.005842953454703093,
0.07147589325904846,
0.10925443470478058,
-0.06689215451478958,
-0.14432233572006226,
-0.06022803485393524,
0.08875485509634018,
-0.009834992699325085,
-0.03670760244131088,
-0.049119677394628525,
0.04676154628396034,
0.03209913894534111,
0.11318106204271317,
0.08643888682126999,
-0.003593706525862217,
-0.0628826767206192,
-0.042073074728250504,
0.22331053018569946,
-0.14625342190265656,
0.043256524950265884,
0.007445589639246464,
-0.0429743155837059,
-0.0076383077539503574,
0.005870272871106863,
0.014089803211390972,
-0.03238216042518616,
0.10351061820983887,
-0.0778173878788948,
-0.035906463861465454,
-0.1116463914513588,
-0.06868703663349152,
0.024910317733883858,
0.0025890374090522528,
-0.018393149599432945,
-0.04424213990569115,
-0.11253650486469269,
-0.051282741129398346,
0.0724339634180069,
-0.07579848170280457,
-0.05524555593729019,
0.009976830333471298,
-0.04834962263703346,
0.0031978494953364134,
0.00010397454752819613,
0.11258035898208618,
-0.03314845636487007,
0.025259260088205338,
-0.04850656911730766,
0.06803499162197113,
0.10959596186876297,
0.038730688393116,
-0.0804535374045372,
0.07286878675222397,
-0.22788093984127045,
0.10223092138767242,
-0.09346398711204529,
0.025767935439944267,
-0.14578653872013092,
-0.04199126362800598,
0.02854149229824543,
0.02887420728802681,
-0.010361229069530964,
0.1268649846315384,
-0.1982942521572113,
-0.035082314163446426,
0.15190726518630981,
-0.11336656659841537,
-0.09347330778837204,
0.065653957426548,
-0.05610617995262146,
0.11296144872903824,
0.04835578054189682,
-0.019556574523448944,
0.06953749805688858,
-0.1281629204750061,
-0.04506009817123413,
-0.021473335102200508,
-0.008493004366755486,
0.14857245981693268,
0.06750676780939102,
-0.05737153813242912,
0.07104712724685669,
0.02051553688943386,
-0.037109848111867905,
-0.03301886469125748,
-0.03470754995942116,
-0.09331934154033661,
0.009520708583295345,
-0.07244295626878738,
0.03737799823284149,
-0.02224314957857132,
-0.08870045095682144,
-0.030656753107905388,
-0.17619828879833221,
0.043274905532598495,
0.08050142228603363,
0.008233942091464996,
-0.021131468936800957,
-0.09287237375974655,
0.02556683123111725,
-0.009385489858686924,
-0.021018607541918755,
-0.1641797423362732,
-0.044834475964307785,
0.04416196420788765,
-0.1971662938594818,
0.023802341893315315,
-0.03283040598034859,
0.05093098804354668,
0.03247829154133797,
-0.04019762575626373,
-0.005096070934087038,
0.0028117431793361902,
0.01809627003967762,
-0.026984719559550285,
-0.200385183095932,
-0.031109308823943138,
-0.029154371470212936,
0.1362139731645584,
-0.22226740419864655,
0.028292208909988403,
0.07483648508787155,
0.13521188497543335,
0.0009690870065242052,
-0.04426588490605354,
0.010693409480154514,
-0.05366935580968857,
-0.053671274334192276,
-0.06512755900621414,
-0.007102466654032469,
-0.03287021815776825,
-0.04422381520271301,
0.06460095942020416,
-0.19425635039806366,
-0.03641216829419136,
0.10608077049255371,
0.10164625942707062,
-0.14719000458717346,
-0.028969714418053627,
-0.04096706584095955,
-0.06081128865480423,
-0.09094393998384476,
-0.0630471333861351,
0.14371246099472046,
0.04861542955040932,
0.048413511365652084,
-0.08624191582202911,
-0.0630124881863594,
0.00895135197788477,
0.0006565740332007408,
-0.03649118170142174,
0.08907787501811981,
0.08782777935266495,
-0.10737399011850357,
0.08881597965955734,
0.08605224639177322,
0.06605713814496994,
0.10539878904819489,
0.001256609451957047,
-0.10750970244407654,
-0.029154706746339798,
0.005644100718200207,
0.01547710970044136,
0.14092515408992767,
-0.044270921498537064,
0.04743899777531624,
0.05656488984823227,
-0.027443327009677887,
0.01715722121298313,
-0.10313762724399567,
0.02984124980866909,
0.046840768307447433,
-0.010507673025131226,
0.012429861351847649,
-0.03895113617181778,
0.025837475433945656,
0.08796556293964386,
0.03584056720137596,
0.027896199375391006,
0.0029043578542768955,
-0.03437814116477966,
-0.10392027348279953,
0.17429527640342712,
-0.0878753736615181,
-0.28357240557670593,
-0.1356295943260193,
-0.00747122336179018,
0.05167245492339134,
-0.022715993225574493,
0.013256389647722244,
-0.04903135821223259,
-0.11467588692903519,
-0.10348290205001831,
0.008818334899842739,
0.0437844917178154,
-0.07700283080339432,
-0.07256268709897995,
0.046553414314985275,
0.033613573759794235,
-0.14174877107143402,
0.022300107404589653,
0.048012908548116684,
-0.03855963796377182,
-0.015413837507367134,
0.07170835882425308,
0.10258439928293228,
0.17387451231479645,
-0.004228805657476187,
-0.01945391111075878,
0.023280048742890358,
0.24459126591682434,
-0.14296141266822815,
0.10647262632846832,
0.15432609617710114,
-0.06630013138055801,
0.1025824174284935,
0.19176462292671204,
0.02610800787806511,
-0.07571171224117279,
0.03370760753750801,
0.03715203329920769,
-0.053104497492313385,
-0.23274335265159607,
-0.060641512274742126,
0.0011178229469805956,
-0.06850682199001312,
0.09104112535715103,
0.08915619552135468,
0.11183936148881912,
0.0454646460711956,
-0.08415863662958145,
-0.06847929954528809,
0.019614145159721375,
0.10642454773187637,
-0.03275766968727112,
0.007264797575771809,
0.09054313600063324,
-0.04184457287192345,
-0.005177726969122887,
0.10835286974906921,
0.007426192983984947,
0.1962665617465973,
0.031048519536852837,
0.15333782136440277,
0.07211130857467651,
0.0342402458190918,
0.026680786162614822,
0.025636766105890274,
0.023090654984116554,
0.009547512046992779,
-0.01598707027733326,
-0.08795502036809921,
0.027014199644327164,
0.13500221073627472,
0.07871367782354355,
0.029795078560709953,
0.020392734557390213,
-0.0429922379553318,
0.062152985483407974,
0.15964233875274658,
0.006258485373109579,
-0.2136749029159546,
-0.03950631618499756,
0.08867984265089035,
-0.0793125256896019,
-0.1237078458070755,
-0.02518491819500923,
0.03823186457157135,
-0.1809074580669403,
0.04127289727330208,
-0.01795332506299019,
0.11453432589769363,
-0.11700457334518433,
-0.028958700597286224,
0.039744846522808075,
0.08327627927064896,
-0.03253408893942833,
0.07922478020191193,
-0.1647184044122696,
0.1165376752614975,
0.012328862212598324,
0.05802180990576744,
-0.11617794632911682,
0.09878876805305481,
0.012594180181622505,
-0.009003117680549622,
0.16720694303512573,
-0.0008162438753060997,
-0.07339610159397125,
-0.06517832726240158,
-0.07867198437452316,
-0.022016214206814766,
0.09116258472204208,
-0.11647430807352066,
0.08271238952875137,
-0.012302344664931297,
-0.03819865360856056,
0.002976413816213608,
-0.1073245257139206,
-0.12343364208936691,
-0.191313698887825,
0.05862122401595116,
-0.11746024340391159,
0.00024363139527849853,
-0.10003595799207687,
-0.05551697313785553,
-0.04721582680940628,
0.19990667700767517,
-0.14306047558784485,
-0.09675363451242447,
-0.1526252180337906,
-0.09468596428632736,
0.1679719239473343,
-0.04768168181180954,
0.08716544508934021,
-0.00014324963558465242,
0.22273695468902588,
0.00589721417054534,
-0.010143720544874668,
0.07824880629777908,
-0.08608578145503998,
-0.17828822135925293,
-0.07740302383899689,
0.12055730819702148,
0.12802201509475708,
0.05279289186000824,
-0.012038013897836208,
0.020934196189045906,
-0.036648161709308624,
-0.11678951978683472,
0.003050430677831173,
0.1217387318611145,
0.05949230119585991,
0.039503831416368484,
-0.002558275358751416,
-0.10200468450784683,
-0.07551230490207672,
-0.0352395698428154,
0.02261841483414173,
0.18903005123138428,
-0.08441178500652313,
0.15781226754188538,
0.13112787902355194,
-0.05333179607987404,
-0.21253353357315063,
0.030583804473280907,
0.043237145990133286,
0.004318034742027521,
0.0612679123878479,
-0.17720702290534973,
0.08167627453804016,
0.025727098807692528,
-0.05116020143032074,
0.15224720537662506,
-0.16569727659225464,
-0.15514664351940155,
0.0824643224477768,
0.05010354146361351,
-0.22108957171440125,
-0.12386278063058853,
-0.0879128947854042,
-0.06589758396148682,
-0.1396872103214264,
0.08584427833557129,
0.014041651971638203,
-0.0018043812597170472,
0.05013851076364517,
0.033740755170583725,
0.018914686515927315,
-0.048698488622903824,
0.21615906059741974,
-0.0022440196480602026,
0.03326340764760971,
-0.07553089410066605,
-0.10180798172950745,
0.06950566172599792,
-0.05141735449433327,
0.08518881350755692,
-0.03099823370575905,
0.005753061734139919,
-0.08320630341768265,
-0.057475052773952484,
-0.05255331099033356,
0.03318103775382042,
-0.08139406144618988,
-0.10520965605974197,
-0.06759276986122131,
0.09429939836263657,
0.09139011800289154,
-0.03298058733344078,
-0.04032526910305023,
-0.08896728605031967,
0.039150089025497437,
0.20617929100990295,
0.17360219359397888,
0.05333937704563141,
-0.10111589729785919,
0.002542630536481738,
-0.01915728859603405,
0.040264517068862915,
-0.21200114488601685,
0.04798245429992676,
0.04617756977677345,
0.024147402495145798,
0.12109645456075668,
-0.0176423080265522,
-0.1646004468202591,
-0.047221194952726364,
0.0562983863055706,
-0.03494611009955406,
-0.20504815876483917,
-0.01314060389995575,
0.04864202439785004,
-0.18736153841018677,
-0.06957933306694031,
0.016700902953743935,
-0.014444489032030106,
-0.027432914823293686,
0.013032985851168633,
0.06286440044641495,
0.025481918826699257,
0.10238313674926758,
0.05989401787519455,
0.1000840812921524,
-0.112981878221035,
0.0795830711722374,
0.09043775498867035,
-0.08344172686338425,
0.009394102729856968,
0.06964189559221268,
-0.05280066654086113,
-0.02294989861547947,
0.022772129625082016,
0.06757686287164688,
-0.003049787599593401,
-0.057536181062459946,
-0.02079189568758011,
-0.10809285193681717,
0.06586270034313202,
0.1269281655550003,
0.0400845967233181,
-0.006831571459770203,
0.04905473813414574,
0.02419281378388405,
-0.07880669087171555,
0.11321208626031876,
0.03362756222486496,
0.03722309693694115,
-0.05989459529519081,
-0.01674187369644642,
0.04316421225667,
0.005734616424888372,
-0.02047782577574253,
-0.025104478001594543,
-0.05658029392361641,
-0.013948953710496426,
-0.18932224810123444,
0.014544147998094559,
-0.07588981091976166,
0.005138450767844915,
0.014814606867730618,
-0.040141742676496506,
-0.018671197816729546,
0.012856033630669117,
-0.08163223415613174,
-0.05027473345398903,
-0.0038707295898348093,
0.09766460955142975,
-0.1400173306465149,
0.008230311796069145,
0.09175591170787811,
-0.11852382868528366,
0.06848865002393723,
-0.019968708977103233,
-0.014717686921358109,
0.0038272906094789505,
-0.1270400881767273,
0.04572216048836708,
-0.004586559720337391,
0.02062096633017063,
0.04444560408592224,
-0.17065683007240295,
0.004877567756921053,
-0.0423397533595562,
-0.0478336401283741,
-0.015323328785598278,
-0.08405033499002457,
-0.11406292766332626,
0.10921793431043625,
0.002206311793997884,
-0.08430022746324539,
-0.010287429206073284,
0.04696008190512657,
0.10919637978076935,
-0.03898061811923981,
0.124757781624794,
0.0047785635106265545,
0.06639395654201508,
-0.18268363177776337,
-0.024298490956425667,
-0.014514438807964325,
0.007352736312896013,
0.027192458510398865,
-0.016180848702788353,
0.04238643869757652,
-0.01372526679188013,
0.2601816952228546,
-0.021822240203619003,
0.07231466472148895,
0.0637383759021759,
0.042024899274110794,
0.016651110723614693,
0.08318763226270676,
0.06755662709474564,
0.016758481040596962,
0.004258559085428715,
0.02265608124434948,
-0.03241465613245964,
-0.016654497012495995,
-0.15768693387508392,
0.07677853107452393,
0.14623822271823883,
0.08591317385435104,
0.007676990237087011,
0.06586159020662308,
-0.10330242663621902,
-0.10554943233728409,
0.08015866577625275,
-0.03888537734746933,
-0.0009790018666535616,
-0.058588381856679916,
0.15355949103832245,
0.14971502125263214,
-0.17422176897525787,
0.08231138437986374,
-0.03791337087750435,
-0.04883022606372833,
-0.11436772346496582,
-0.15839459002017975,
-0.06608819216489792,
-0.029153592884540558,
-0.0041826991364359856,
-0.05528274551033974,
0.06748054921627045,
0.10802645981311798,
-0.0021057529374957085,
-0.00038325722562149167,
0.09545762091875076,
-0.026331622153520584,
-0.01757199876010418,
0.03465426340699196,
0.04817976430058479,
0.033562518656253815,
-0.04831063002347946,
0.020485511049628258,
0.004976877011358738,
0.03976510092616081,
0.05864322930574417,
0.023703020066022873,
-0.03892989084124565,
0.014479226432740688,
-0.01092575490474701,
-0.1049860492348671,
0.022427968680858612,
-0.029776830226182938,
-0.07360642403364182,
0.13104131817817688,
0.029177764430642128,
0.019099419936537743,
-0.03228067234158516,
0.20109383761882782,
-0.07107947021722794,
-0.06925153732299805,
-0.14109766483306885,
0.10889512300491333,
-0.03372858464717865,
0.06323269009590149,
0.058447178453207016,
-0.1133023053407669,
-0.002398417331278324,
0.1314154714345932,
0.133079394698143,
-0.033533163368701935,
0.005780258681625128,
0.03008044883608818,
0.00756559893488884,
-0.0482633113861084,
0.045497048646211624,
0.031092669814825058,
0.15440985560417175,
-0.06949599832296371,
0.07780899107456207,
0.00008295764564536512,
-0.08774317800998688,
-0.036128852516412735,
0.1405542492866516,
0.006535779219120741,
0.03079606406390667,
-0.06559351831674576,
0.10371401906013489,
-0.07252706587314606,
-0.23936228454113007,
0.045033879578113556,
-0.07753164321184158,
-0.15683837234973907,
-0.013978141359984875,
0.02726292423903942,
-0.009009851142764091,
0.02702206000685692,
0.0654432401061058,
-0.06469112634658813,
0.161378413438797,
0.03472336754202843,
-0.08781957626342773,
-0.05673113837838173,
0.07957270741462708,
-0.09192227572202682,
0.2958409786224365,
0.013188840821385384,
0.029593972489237785,
0.10327941924333572,
-0.019989576190710068,
-0.13285429775714874,
0.030561091378331184,
0.10066051781177521,
-0.09982595592737198,
0.06684590131044388,
0.18159176409244537,
-0.009470577351748943,
0.10021016746759415,
0.07437440752983093,
-0.061603669077157974,
0.05807222053408623,
-0.0826035663485527,
-0.06770919263362885,
-0.09389114379882812,
0.05970105528831482,
-0.06468918174505234,
0.14543601870536804,
0.1228262409567833,
-0.04243761673569679,
-0.004415105562657118,
-0.02816380001604557,
0.043726447969675064,
0.012194468639791012,
0.12871193885803223,
0.008576037362217903,
-0.1618158370256424,
0.026840461418032646,
0.0030557403806596994,
0.10387714207172394,
-0.21997274458408356,
-0.08367477357387543,
0.04838619381189346,
-0.029553698375821114,
-0.05334814265370369,
0.10579082369804382,
0.06295353919267654,
0.0504634715616703,
-0.04548325017094612,
-0.05543007701635361,
-0.008723298087716103,
0.14979462325572968,
-0.1187625601887703,
-0.006005466915667057
] |
null | null | null |
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
import gym  # or `import gymnasium as gym`, depending on your setup

# load_from_hub is not a standard-library function; in the Hugging Face Deep RL course
# it is a small helper that downloads and unpickles the saved Q-learning dictionary.
model = load_from_hub(repo_id="HannoRE/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False, etc.)
env = gym.make(model["env_id"])
```
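
Continuing from the snippet above, a quick sanity check is to roll the loaded policy out greedily for one episode. This sketch assumes the saved dictionary stores its table under a "qtable" key (as the Deep RL course notebooks do) and uses the Gymnasium-style five-value step API; with classic gym, `reset()` and `step()` return fewer values.

```python
import numpy as np

# Greedy rollout: always pick the action with the highest Q-value for the current state.
state, info = env.reset()
done, total_reward = False, 0.0
while not done:
    action = int(np.argmax(model["qtable"][state]))
    state, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    done = terminated or truncated
print(f"Episode return: {total_reward}")
```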
| {"tags": ["FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-FrozenLake-v1-4x4-noSlippery", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "FrozenLake-v1-4x4-no_slippery", "type": "FrozenLake-v1-4x4-no_slippery"}, "metrics": [{"type": "mean_reward", "value": "1.00 +/- 0.00", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | HannoRE/q-FrozenLake-v1-4x4-noSlippery | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T12:43:37+00:00 | [] | [] | TAGS
#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing FrozenLake-v1
This is a trained model of a Q-Learning agent playing FrozenLake-v1.
## Usage
| [
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
"TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
40,
39
] | [
"passage: TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
0.04578453302383423,
-0.08074592798948288,
-0.00430759321898222,
0.10720831900835037,
0.05034215748310089,
-0.040469273924827576,
0.11997015029191971,
0.018999949097633362,
0.20601962506771088,
-0.010012076236307621,
0.1455274522304535,
0.007022971753031015,
-0.006192410364747047,
0.1867983490228653,
0.04572829231619835,
-0.26324528455734253,
0.01831899583339691,
-0.09495259821414948,
-0.07281816750764847,
0.11870454251766205,
0.05470194295048714,
-0.01901467889547348,
-0.0007633853238075972,
0.056141503155231476,
-0.0673527717590332,
0.0007737681735306978,
0.031996939331293106,
-0.012976245954632759,
0.19804789125919342,
-0.02254498563706875,
0.06641989201307297,
0.054705578833818436,
0.0758768692612648,
-0.1998077929019928,
0.0358855277299881,
-0.04215473681688309,
-0.09439758956432343,
-0.03934839740395546,
-0.018780618906021118,
0.05878105387091637,
0.053356342017650604,
0.03858819976449013,
0.058354366570711136,
0.09384993463754654,
-0.0773480236530304,
0.04328357055783272,
0.04280758649110794,
0.024811049923300743,
0.04589218273758888,
-0.0237203948199749,
-0.027002155780792236,
0.08246652781963348,
-0.22182892262935638,
0.10318073630332947,
-0.010159241035580635,
-0.5270710587501526,
-0.00633762264624238,
0.24088262021541595,
0.11517096310853958,
0.05707438662648201,
-0.06903956830501556,
0.10566288232803345,
0.03913382440805435,
-0.007209456991404295,
0.03210983797907829,
0.02150118350982666,
0.12817370891571045,
0.06009242683649063,
-0.09581366181373596,
0.040699947625398636,
0.13722525537014008,
0.012822695076465607,
0.020306183025240898,
-0.08888901025056839,
0.0410032719373703,
-0.03461858257651329,
-0.007679527159780264,
-0.09758518636226654,
0.05478060990571976,
0.012466507963836193,
-0.0934976264834404,
-0.09247440844774246,
-0.04236573353409767,
-0.06708304584026337,
0.11252415925264359,
0.046419668942689896,
-0.0874939113855362,
0.03884070739150047,
-0.06760413944721222,
0.05918780341744423,
-0.16863860189914703,
0.02074250765144825,
-0.06627868115901947,
-0.09376336634159088,
-0.11799788475036621,
-0.01683047041296959,
-0.07946427166461945,
0.009092256426811218,
0.056664444506168365,
0.1447116881608963,
0.22076484560966492,
0.06690320372581482,
0.09728849679231644,
0.07456006109714508,
0.06531001627445221,
0.1538129299879074,
0.10918238013982773,
0.019075315445661545,
-0.015266558155417442,
0.0948706716299057,
-0.06445580720901489,
-0.1351388692855835,
-0.15579092502593994,
0.005488025024533272,
0.0983937531709671,
0.08871900290250778,
-0.044080477207899094,
-0.006702381651848555,
-0.024641724303364754,
0.08566431701183319,
-0.11314457654953003,
-0.024612564593553543,
-0.002267979085445404,
0.06882024556398392,
-0.024801667779684067,
0.020378148183226585,
-0.06242705136537552,
0.12715265154838562,
0.04222423583269119,
-0.059924717992544174,
-0.055308472365140915,
-0.03053177334368229,
-0.014276440255343914,
-0.027539284899830818,
0.02446848154067993,
-0.07659092545509338,
0.04767750948667526,
-0.16766095161437988,
-0.042871296405792236,
-0.04784649610519409,
0.025697942823171616,
-0.03907240927219391,
-0.13557587563991547,
-0.17699143290519714,
-0.048906855285167694,
-0.022438718006014824,
0.03549358621239662,
-0.038111843168735504,
0.006551501806825399,
-0.006318534724414349,
-0.1583600640296936,
0.09783563017845154,
0.09784027189016342,
-0.03643378987908363,
-0.02749447710812092,
0.056263517588377,
-0.07194498926401138,
0.1561182290315628,
-0.21054518222808838,
-0.054014235734939575,
-0.044764336198568344,
-0.06595750898122787,
0.19673264026641846,
0.012690845876932144,
-0.01202624011784792,
0.19873127341270447,
-0.29073721170425415,
-0.06078760325908661,
0.12533614039421082,
-0.07834373414516449,
-0.0936407670378685,
0.06941844522953033,
-0.04206686094403267,
0.023345354944467545,
0.046047765761613846,
0.36345911026000977,
-0.02069227211177349,
-0.16197136044502258,
-0.021782705560326576,
0.13971707224845886,
-0.1184760183095932,
0.059895481914281845,
0.04240793362259865,
0.12543781101703644,
-0.04250509291887283,
-0.018672896549105644,
-0.09023164212703705,
0.05999075248837471,
-0.05241934582591057,
-0.09016361832618713,
-0.03393383324146271,
-0.07645075023174286,
0.13294468820095062,
-0.0629684180021286,
0.05601520463824272,
-0.03255095332860947,
-0.07133250683546066,
-0.050324998795986176,
-0.016492370516061783,
0.04460815340280533,
0.05951254442334175,
-0.12794871628284454,
0.11029167473316193,
0.13025271892547607,
-0.0006193425506353378,
-0.07498852163553238,
-0.17872096598148346,
0.003240168560296297,
0.009576505981385708,
0.039837226271629333,
0.17141658067703247,
0.12209978699684143,
0.033295199275016785,
0.008770671673119068,
-0.06389404833316803,
-0.18276847898960114,
0.058129217475652695,
-0.056212130934000015,
-0.14230976998806,
-0.052409034222364426,
-0.0728459507226944,
0.017381802201271057,
-0.0859743058681488,
-0.017379917204380035,
0.021926190704107285,
0.006908397190272808,
0.02990424446761608,
-0.026645656675100327,
-0.049561817198991776,
0.021254703402519226,
0.06490101665258408,
-0.0037617047782987356,
0.12023693323135376,
0.008277264423668385,
-0.18308481574058533,
0.07930773496627808,
0.08478537946939468,
0.09196605533361435,
0.013250201940536499,
0.02685922384262085,
-0.021522263064980507,
-0.08061408251523972,
-0.054420311003923416,
0.02957955375313759,
0.11417073011398315,
0.1317172348499298,
0.2361993044614792,
0.08753683418035507,
0.04697408527135849,
-0.02164587564766407,
-0.016415923833847046,
0.002810494042932987,
-0.06318057328462601,
-0.029935607686638832,
0.10614971816539764,
0.05865858122706413,
-0.067733034491539,
-0.04576427489519119,
0.09590928256511688,
0.02732124738395214,
0.21205885708332062,
-0.03342745825648308,
0.01286078616976738,
-0.10957037657499313,
-0.06550975888967514,
-0.031982194632291794,
0.09201868623495102,
0.09498392790555954,
0.009755023755133152,
-0.022056059911847115,
-0.04259001836180687,
0.0012916827108711004,
-0.1334889680147171,
-0.10375088453292847,
0.026475343853235245,
0.013400445692241192,
-0.11206940561532974,
0.11674030870199203,
-0.11352457851171494,
0.039504457265138626,
0.06024791672825813,
-0.13837239146232605,
0.04428480193018913,
-0.029713207855820656,
-0.07886212319135666,
0.16866780817508698,
-0.11075661331415176,
-0.094340018928051,
-0.08831550180912018,
0.004082420375198126,
0.0075836325995624065,
-0.03922267258167267,
-0.009283260442316532,
-0.19952571392059326,
-0.005375816952437162,
-0.03544965013861656,
0.013616434298455715,
-0.06988783925771713,
-0.11287739872932434,
-0.010957922786474228,
0.07084179669618607,
-0.043388739228248596,
-0.07803605496883392,
0.007967432029545307,
-0.08923084288835526,
-0.10623309016227722,
0.028189711272716522,
0.019765101373195648,
-0.022883659228682518,
0.16152891516685486,
0.01816628873348236,
0.05626589432358742,
-0.03298520669341087,
0.30665266513824463,
-0.038163769990205765,
0.08371731638908386,
-0.02993497997522354,
-0.07433546334505081,
0.06130730360746384,
-0.022327827289700508,
0.06086638569831848,
-0.020221687853336334,
-0.02362890914082527,
0.0077952733263373375,
-0.08579335361719131,
-0.18365982174873352,
-0.05417544022202492,
0.03724347800016403,
0.195254847407341,
0.031118987128138542,
0.01910330168902874,
-0.0488768145442009,
-0.010547760874032974,
0.1665220558643341,
-0.10005921125411987,
0.04030545800924301,
-0.05366240441799164,
0.11506262421607971,
-0.08640182018280029,
0.06195629760622978,
0.020486772060394287,
0.04266135022044182,
-0.04877188801765442,
0.09486009180545807,
0.0826394334435463,
0.1121082529425621,
-0.02206910029053688,
0.046257395297288895,
0.019012698903679848,
0.07383184134960175,
0.11073657125234604,
0.0368414968252182,
-0.0729052945971489,
0.001982470043003559,
-0.006313489284366369,
-0.039427030831575394,
0.11933320760726929,
0.17963355779647827,
-0.11991413682699203,
-0.05106910318136215,
0.27167606353759766,
0.0031242913100868464,
0.19481229782104492,
-0.01315275114029646,
0.043591804802417755,
-0.04484925419092178,
0.04572054371237755,
-0.05338600277900696,
-0.04086209088563919,
0.2094656229019165,
0.08045925945043564,
-0.17165091633796692,
-0.08549032360315323,
-0.05912299454212189,
0.07081323862075806,
0.10728751868009567,
0.0013539529172703624,
-0.04156802222132683,
0.0004610282776411623,
0.0014198932331055403,
0.08339415490627289,
-0.14520122110843658,
0.11816094070672989,
-0.03172019124031067,
0.05612684786319733,
0.017555562779307365,
-0.045326150953769684,
0.04264266416430473,
0.07474290579557419,
0.26618310809135437,
0.0904107540845871,
-0.040318213403224945,
-0.0892091691493988,
-0.12260187417268753,
0.010461576282978058,
0.029102616012096405,
-0.03534553572535515,
0.0037547778338193893,
-0.020087555050849915,
0.0318896509706974,
0.008264793083071709,
0.016230624169111252,
-0.08987458795309067,
-0.03175399824976921,
-0.027736429125070572,
-0.023839212954044342,
0.10733365267515182,
-0.09495144337415695,
-0.1444292515516281,
-0.15713949501514435,
0.04191131144762039,
-0.0766405463218689,
-0.056593164801597595,
-0.054507751017808914,
-0.05239389091730118,
-0.0311186034232378,
-0.03773957118391991,
0.09099467098712921,
-0.0021037792321294546,
0.14807306230068207,
-0.1920108050107956,
-0.04220759496092796,
0.051812779158353806,
-0.07607918977737427,
-0.08729588985443115,
0.03410962224006653,
0.12136995792388916,
0.05116051807999611,
0.11504370719194412,
0.013609255664050579,
0.09567681699991226,
0.0045484392903745174,
-0.06713183224201202,
0.15302421152591705,
-0.14069625735282898,
-0.27875974774360657,
-0.03836318850517273,
0.016946332529187202,
0.1615200787782669,
-0.05613167956471443,
0.031766023486852646,
0.3335736393928528,
0.27782970666885376,
-0.1428707242012024,
0.25916144251823425,
0.019178593531250954,
0.004398873541504145,
-0.19130495190620422,
-0.10125631093978882,
0.025324683636426926,
0.04740457236766815,
0.12032642960548401,
-0.14564448595046997,
-0.010732659138739109,
-0.04543145373463631,
-0.025908485054969788,
0.10386138409376144,
-0.12300799041986465,
-0.07263197749853134,
0.07765276730060577,
0.039809420704841614,
0.1808302253484726,
0.03932500258088112,
0.0014799144119024277,
0.13626977801322937,
0.06612244248390198,
0.019124457612633705,
0.05216038227081299,
0.08028066903352737,
-0.018944554030895233,
0.14207926392555237,
0.05448179319500923,
-0.02551644667983055,
0.052681710571050644,
-0.0054580713622272015,
-0.03219012916088104,
0.015605825930833817,
-0.183198019862175,
-0.10147556662559509,
-0.0561356320977211,
-0.10798973590135574,
-0.04978342354297638,
0.056853994727134705,
-0.12395523488521576,
-0.007896827533841133,
-0.03841273859143257,
0.03718273714184761,
-0.07831971347332001,
-0.09360362589359283,
-0.036494381725788116,
0.1351792961359024,
0.07210618257522583,
0.04471297934651375,
0.035655103623867035,
-0.07390819489955902,
0.07097936421632767,
0.21671734750270844,
0.08159157633781433,
0.028919655829668045,
-0.19545674324035645,
-0.024042490869760513,
-0.0803457647562027,
0.06306298077106476,
-0.08856996893882751,
-0.016788700595498085,
0.11923003196716309,
0.08616556972265244,
0.05413002520799637,
0.09640096127986908,
-0.045083072036504745,
0.021686913445591927,
0.02684609219431877,
-0.15131035447120667,
-0.18501274287700653,
-0.08534606546163559,
-0.03519878163933754,
0.11561143398284912,
-0.06398691236972809,
0.10897188633680344,
-0.13615410029888153,
0.010051886551082134,
-0.006060056854039431,
0.02693452313542366,
-0.03596206381917,
-0.11251141875982285,
0.15348562598228455,
0.11999429017305374,
-0.06767056882381439,
0.03127254918217659,
-0.09527092427015305,
-0.04423454403877258,
0.12686803936958313,
-0.013623855076730251,
-0.0371493324637413,
-0.054547641426324844,
-0.03628576174378395,
0.15247689187526703,
-0.03436964750289917,
0.008244883269071579,
-0.041229065507650375,
-0.18217355012893677,
0.0798322781920433,
0.09045056998729706,
0.019827889278531075,
-0.031874191015958786,
-0.09797266125679016,
-0.010231015272438526,
-0.0011165260802954435,
0.11730700731277466,
-0.10696814209222794,
-0.10933240503072739,
-0.15144047141075134,
0.06713984161615372,
-0.0007159380475059152,
0.18502596020698547,
-0.06394898891448975,
-0.08904669433832169,
-0.12429379671812057,
0.02344517596065998,
-0.0027384376153349876,
-0.042264558374881744,
0.01618490368127823,
0.07992301136255264,
-0.04095321521162987,
0.02075677551329136,
-0.06651144474744797,
0.06372585147619247,
-0.11786920577287674,
0.09625071287155151,
0.01063506118953228,
0.016993753612041473,
-0.0417880080640316,
-0.01618220843374729,
0.039470795542001724,
-0.057925306260585785,
0.07921463251113892,
0.011758086271584034,
0.0010938759660348296,
0.10196787863969803,
-0.0034960443153977394,
0.06409632414579391,
-0.05372481048107147,
-0.023290161043405533,
0.06578411161899567,
-0.05874887853860855,
-0.03370826691389084,
-0.1573946475982666,
-0.0709633082151413,
0.020051732659339905,
-0.04775108024477959,
0.002077929675579071,
0.03673801198601723,
0.062159497290849686,
-0.06937079131603241,
-0.12125655263662338,
-0.043812792748212814,
-0.028638383373618126,
0.021301284432411194,
0.10829301923513412,
-0.07526551932096481,
0.1547859013080597,
-0.052787959575653076,
-0.00020603960729204118,
0.07437096536159515,
0.04048224538564682,
0.01393822580575943,
-0.10422444343566895,
-0.04698587954044342,
-0.11035211384296417,
0.1502903699874878,
-0.007902312092483044,
-0.03533121198415756,
0.03719403222203255,
-0.11946307867765427,
-0.1572723090648651,
0.03418220207095146,
0.10199101269245148,
0.0448341928422451,
0.025807438418269157,
0.027079269289970398,
-0.04042419046163559,
-0.021270349621772766,
-0.07034418731927872,
0.0882953479886055,
-0.12085357308387756,
-0.09669415652751923,
0.09555385261774063,
0.12178351730108261,
-0.0036850625183433294,
-0.07441367954015732,
0.11554073542356491,
-0.021787192672491074,
0.05525410920381546,
-0.02971339225769043,
0.10308072715997696,
0.0796005055308342,
-0.12273547053337097,
0.005693064536899328,
-0.036891788244247437,
-0.0741485133767128,
-0.12975730001926422,
0.019545545801520348,
-0.061916105449199677,
-0.13383042812347412,
0.12179028987884521,
-0.09376577287912369,
0.030037038028240204,
-0.10506992787122726,
0.021338803693652153,
0.01864001713693142,
0.061665527522563934,
-0.10988292098045349,
0.08575301617383957,
0.13424484431743622,
-0.043199893087148666,
-0.07184189558029175,
-0.12455986440181732,
-0.05022053420543671,
-0.04231856390833855,
-0.13957437872886658,
-0.11600435525178909,
0.0100301094353199,
-0.023418782278895378,
-0.05818291753530502,
0.0015462689334526658,
-0.03659068048000336,
0.008594646118581295,
0.021907730028033257,
0.04032021388411522,
-0.02693161368370056,
0.05134565755724907,
-0.057569269090890884,
-0.052510857582092285,
0.11489357799291611,
0.04113486409187317,
-0.03561042994260788,
-0.052359987050294876,
0.12997733056545258,
-0.11959461867809296,
0.07662346214056015,
-0.020313527435064316,
0.017129231244325638,
-0.06435854732990265,
0.17131924629211426,
0.11673715710639954,
-0.1367570012807846,
-0.005008010193705559,
-0.08210669457912445,
0.020409544929862022,
0.023555370047688484,
0.13693512976169586,
-0.03411718085408211,
-0.0012358218664303422,
-0.1580323874950409,
0.018575575202703476,
-0.18557456135749817,
-0.03716109320521355,
0.04671547934412956,
0.09917585551738739,
0.15293832123279572,
-0.0034432117827236652,
-0.1263325810432434,
0.10424192249774933,
-0.2118520885705948,
0.0907607227563858,
0.05121984705328941,
-0.11874113976955414,
-0.06765396893024445,
-0.06795281916856766,
0.1198519766330719,
0.009196433238685131,
0.2040700763463974,
-0.013615905307233334,
-0.09132910519838333,
-0.07060808688402176,
-0.01980910450220108,
-0.030524181202054024,
0.09714830666780472,
0.041414931416511536,
0.04653804749250412,
0.12821412086486816,
0.00368314771912992,
0.07533777505159378,
0.060310911387205124,
0.02759413793683052,
-0.012300663627684116,
0.04076618701219559,
0.08261215686798096,
-0.14588621258735657,
-0.1659701019525528,
0.1326720416545868,
0.025149408727884293,
0.11792458593845367,
0.03658788278698921,
-0.1549617499113083,
0.06687124073505402,
0.2523096203804016,
-0.11147607117891312,
0.02505038119852543,
0.12737524509429932,
-0.0366884209215641,
0.0672016367316246,
0.1144871786236763,
-0.02633814327418804,
-0.05217865854501724,
-0.011363590136170387,
0.10233135521411896,
0.028660254552960396,
-0.04646271467208862,
-0.02340836264193058,
-0.03373933956027031,
-0.019070526584982872,
-0.011738128960132599,
-0.0909019410610199,
-0.1543993502855301,
-0.10471053421497345,
-0.16619662940502167,
0.04399140924215317,
-0.04626438021659851,
0.13418889045715332,
0.09469578415155411,
-0.012723101302981377,
0.04568437114357948,
0.028575526550412178,
0.07275456190109253,
0.07916246354579926,
-0.02939477376639843,
-0.036159269511699677
] |
null | null | null | ## Overview
Welcome to the Face Swapping project! This project showcases the fun capability of replacing faces in images using multiple libraries!
# 🌟 Image Showcase 🌟
## **Input Image**

## **Target Image**

## **Output Image**

## Interested in Face Swapping with Nettyfy Technologies!
Contact us at Nettyfy Technologies to discuss custom solutions and licensing options for commercial use. Enhance your applications and create engaging experiences with our advanced face swapping technology.
For inquiries, please reach out to [[email protected]](mailto:[email protected]).
| {} | null | nettyfy/faceswap | [
"region:us"
] | 2024-02-12T12:43:48+00:00 | [] | [] | TAGS
#region-us
| ## Overview
Welcome to the Face Swapping project! This project showcases the fun capability of replacing faces in images using multiple libraries!
# Image Showcase
## Input Image
!input1
## Target Image
!Input2
## Output Image
!Output
## Interested in Face Swapping with Nettyfy Technologies!
Contact us at Nettyfy Technologies to discuss custom solutions and licensing options for commercial use. Enhance your applications and create engaging experiences with our advanced face swapping technology.
For inquiries, please reach out to info@URL.
| [
"## Overview\n\nWelcome to the Face Swapping project! This project showcases the fun capability of replacing faces in images using the multiple libraby!",
"# Image Showcase",
"## Input Image\n!input1",
"## Target Image\n!Input2",
"## Output Image\n\n!Output",
"## Interested in Face Swapping with Nettyfy Technologies!\n\nContact us at Nettyfy Technologies to discuss custom solutions and licensing options for commercial use. Enhance your applications and create engaging experiences with our advanced face swapping technology.\n\nFor inquiries, please reach out to info@URL."
] | [
"TAGS\n#region-us \n",
"## Overview\n\nWelcome to the Face Swapping project! This project showcases the fun capability of replacing faces in images using the multiple libraby!",
"# Image Showcase",
"## Input Image\n!input1",
"## Target Image\n!Input2",
"## Output Image\n\n!Output",
"## Interested in Face Swapping with Nettyfy Technologies!\n\nContact us at Nettyfy Technologies to discuss custom solutions and licensing options for commercial use. Enhance your applications and create engaging experiences with our advanced face swapping technology.\n\nFor inquiries, please reach out to info@URL."
] | [
6,
34,
4,
8,
7,
7,
66
] | [
"passage: TAGS\n#region-us \n## Overview\n\nWelcome to the Face Swapping project! This project showcases the fun capability of replacing faces in images using the multiple libraby!# Image Showcase## Input Image\n!input1## Target Image\n!Input2## Output Image\n\n!Output## Interested in Face Swapping with Nettyfy Technologies!\n\nContact us at Nettyfy Technologies to discuss custom solutions and licensing options for commercial use. Enhance your applications and create engaging experiences with our advanced face swapping technology.\n\nFor inquiries, please reach out to info@URL."
] | [
0.05627985671162605,
-0.03859143704175949,
-0.0012738194782286882,
0.004911910276859999,
0.09637701511383057,
0.10014668852090836,
0.19040746986865997,
0.058634329587221146,
0.09115558117628098,
-0.044687941670417786,
0.06278917193412781,
-0.030147701501846313,
0.12423113733530045,
0.24951300024986267,
0.11167506128549576,
-0.3274805545806885,
0.06875310838222504,
0.001415734994225204,
0.048340246081352234,
0.016441645100712776,
0.1237475648522377,
-0.0033293617889285088,
0.0737503245472908,
-0.019137458875775337,
-0.17535582184791565,
0.04572238773107529,
-0.07718643546104431,
0.04386823624372482,
0.13651536405086517,
0.08124203979969025,
0.11517776548862457,
0.011482990346848965,
-0.05275404825806618,
-0.1726534068584442,
0.04394824057817459,
-0.0007799663580954075,
-0.07096581161022186,
0.009561531245708466,
-0.013157199136912823,
0.004341982305049896,
0.2612013816833496,
0.015096530318260193,
-0.08823167532682419,
0.0737149640917778,
-0.1456904411315918,
-0.12974479794502258,
-0.05009954795241356,
-0.05501342564821243,
0.04409070312976837,
-0.047195784747600555,
0.000829632394015789,
0.13662834465503693,
-0.09693939238786697,
0.11158058792352676,
0.24702227115631104,
-0.11156050115823746,
0.01021982729434967,
-0.0271970983594656,
0.012384355999529362,
-0.07724403589963913,
-0.025419171899557114,
0.10551755875349045,
0.017894389107823372,
-0.038485459983348846,
-0.09902250021696091,
0.015661297366023064,
-0.18276667594909668,
-0.01324340607970953,
-0.07616465538740158,
-0.054586250334978104,
0.2288111299276352,
0.0768226683139801,
-0.07422560453414917,
-0.0533897690474987,
-0.05575050786137581,
0.050638340413570404,
-0.041895974427461624,
0.001914449268952012,
0.040255106985569,
0.050581999123096466,
-0.12032423168420792,
-0.14417815208435059,
-0.027897736057639122,
-0.05438430607318878,
-0.08948948234319687,
-0.08893430233001709,
0.01503745187073946,
0.11243927478790283,
-0.07988034933805466,
0.058637604117393494,
-0.0009405932505615056,
0.0013334831455722451,
-0.05935682728886604,
-0.09945330023765564,
-0.025561988353729248,
0.03981206938624382,
0.0005143921007402241,
-0.1601453721523285,
0.07559752464294434,
0.05840840935707092,
0.1293046772480011,
0.016726801171898842,
-0.26948443055152893,
0.13903187215328217,
0.03868427500128746,
0.03204348310828209,
-0.012791275978088379,
-0.18129569292068481,
-0.012461073696613312,
-0.08439890295267105,
0.07382232695817947,
-0.05465848743915558,
-0.09612778574228287,
0.00739465793594718,
0.017393866553902626,
0.11396463215351105,
0.05391469597816467,
0.04815547913312912,
-0.07320132106542587,
0.030192723497748375,
0.18864373862743378,
-0.007530602160841227,
0.0214750524610281,
-0.04545269161462784,
-0.05933544412255287,
0.11179395765066147,
0.18341118097305298,
-0.019568156450986862,
-0.05589793622493744,
-0.0552542507648468,
-0.06766309589147568,
0.018585773184895515,
-0.10004756599664688,
-0.054526187479496,
0.06773076951503754,
0.030731888487935066,
-0.025255508720874786,
-0.06304313242435455,
-0.05271388962864876,
0.09424267709255219,
0.06752711534500122,
-0.029499979689717293,
0.10725981742143631,
-0.06309258937835693,
0.005880055949091911,
-0.06298085302114487,
-0.004085001070052385,
0.07631812989711761,
-0.048237621784210205,
0.03832584619522095,
0.003951402846723795,
0.21813827753067017,
-0.1597963124513626,
0.015196557156741619,
0.006850073114037514,
0.0478813461959362,
-0.20799042284488678,
0.030403396114706993,
-0.050775907933712006,
0.029704583808779716,
0.013247762806713581,
-0.03366909921169281,
-0.09601875394582748,
0.040403250604867935,
0.022317737340927124,
0.1690361052751541,
-0.15046009421348572,
-0.09063377976417542,
0.27017319202423096,
-0.1777096539735794,
-0.12092504650354385,
0.15490879118442535,
-0.03826752305030823,
-0.017821546643972397,
0.09759286791086197,
0.1465800553560257,
-0.0349653959274292,
-0.15179406106472015,
0.017047522589564323,
-0.07211630791425705,
-0.1440318524837494,
-0.054007746279239655,
-0.0000880552179296501,
0.09820985049009323,
-0.04033239558339119,
0.04076182097196579,
-0.02462313510477543,
0.23163391649723053,
-0.10563857853412628,
-0.009573375806212425,
0.072104312479496,
-0.05633595585823059,
0.022013429552316666,
0.03954854607582092,
-0.04345426708459854,
-0.10629141330718994,
-0.024992313235998154,
-0.029400167986750603,
0.004511209204792976,
-0.06568781286478043,
0.10340545326471329,
-0.09541889280080795,
0.022948196157813072,
0.1557198017835617,
-0.002725911559537053,
0.076231949031353,
-0.1320200264453888,
0.02392834983766079,
0.06502138823270798,
0.04801228269934654,
0.11277864873409271,
0.03142966702580452,
0.04781259223818779,
-0.042254380881786346,
0.006278275512158871,
0.001833773567341268,
-0.06087541580200195,
0.08956034481525421,
-0.027992000803351402,
0.029225505888462067,
-0.04906734079122543,
0.07860849052667618,
0.1413072943687439,
-0.059763744473457336,
-0.1336444914340973,
0.18569210171699524,
0.06167439743876457,
0.014367771334946156,
-0.04286615923047066,
-0.02799147553741932,
-0.00565057760104537,
-0.03548390790820122,
0.08828715980052948,
-0.06830459088087082,
-0.007002766709774733,
0.18879136443138123,
-0.025816628709435463,
-0.07423387467861176,
0.12373608350753784,
-0.32414641976356506,
-0.009610895998775959,
-0.07806964963674545,
0.02808537520468235,
-0.0005439600790850818,
0.06541574001312256,
-0.09516151249408722,
-0.0672878846526146,
-0.09225361794233322,
0.03143050894141197,
-0.004480637144297361,
0.07850374281406403,
-0.04838850721716881,
-0.07190674543380737,
-0.07917328178882599,
-0.10590515285730362,
0.008700016885995865,
-0.2002190351486206,
0.03792272508144379,
-0.016371671110391617,
0.02466876059770584,
0.25045087933540344,
0.0544220432639122,
-0.023954153060913086,
-0.03181961923837662,
0.031180767342448235,
0.022098172456026077,
0.11766379326581955,
-0.2920551598072052,
-0.09088388085365295,
0.023446211591362953,
-0.041364602744579315,
0.09172285348176956,
-0.05883234739303589,
0.009481086395680904,
-0.04372343793511391,
-0.09371866285800934,
0.06345642358064651,
0.05427539348602295,
-0.017149414867162704,
0.049710214138031006,
-0.0038773592095822096,
-0.006480037700384855,
0.025378774851560593,
-0.03359008952975273,
-0.008512349799275398,
0.11771424114704132,
-0.08702938258647919,
-0.3223017752170563,
-0.2112990766763687,
-0.1831023097038269,
0.0017741944175213575,
0.022516829892992973,
0.03255157172679901,
0.008611313067376614,
-0.0458625964820385,
-0.010100683197379112,
0.11656738072633743,
-0.05279146134853363,
-0.040401484817266464,
-0.04979754239320755,
-0.029395272955298424,
-0.05246126651763916,
-0.10451650619506836,
-0.034631162881851196,
0.02509175054728985,
-0.028818491846323013,
0.07084216177463531,
-0.20009972155094147,
0.11528901755809784,
-0.10968419164419174,
-0.04060445353388786,
0.028802193701267242,
-0.06480281800031662,
0.1504521369934082,
-0.05382150411605835,
0.06653312593698502,
0.09117948263883591,
-0.028833549469709396,
0.056716468185186386,
0.018920496106147766,
-0.06678402423858643,
-0.0895085334777832,
0.029634349048137665,
0.02658662386238575,
-0.10595095157623291,
-0.057153694331645966,
-0.06563858687877655,
-0.11325624585151672,
0.0985698401927948,
-0.03256795182824135,
-0.003207256319001317,
0.05360335484147072,
0.013453856110572815,
-0.04986490681767464,
0.006733644753694534,
-0.02052955888211727,
0.050763294100761414,
0.00846121460199356,
-0.14297761023044586,
0.055966515094041824,
-0.10694827139377594,
-0.07465166598558426,
0.040781665593385696,
-0.013397255912423134,
0.2169254869222641,
0.07313637435436249,
0.10359776765108109,
0.09657582640647888,
0.11295793950557709,
0.10471738874912262,
0.13474085927009583,
0.02668576128780842,
-0.004271292593330145,
-0.06870885193347931,
0.015437301248311996,
-0.07580283284187317,
-0.001631299965083599,
0.2396712750196457,
-0.16131439805030823,
0.11138579249382019,
-0.14056740701198578,
0.05757515877485275,
0.05937360227108002,
0.0701027661561966,
-0.17264454066753387,
0.14759232103824615,
0.04025501012802124,
-0.016207309439778328,
-0.11274535208940506,
-0.03188411891460419,
0.020148323848843575,
-0.05313384532928467,
0.11392631381750107,
-0.04273488000035286,
0.04842772334814072,
-0.04340832307934761,
-0.05879233404994011,
-0.15359993278980255,
-0.06904242932796478,
0.013558704406023026,
0.03127086162567139,
-0.0470796637237072,
0.1965039074420929,
0.00974979531019926,
-0.043823014944791794,
-0.04168752208352089,
-0.036017198115587234,
0.07730543613433838,
0.09885601699352264,
0.06878987699747086,
0.043301407247781754,
-0.1081458330154419,
0.09180457890033722,
0.0780143290758133,
0.034851379692554474,
0.1035509705543518,
0.07172127068042755,
0.028698254376649857,
-0.028435522690415382,
-0.03184111788868904,
-0.03724953904747963,
0.313590407371521,
-0.06771513819694519,
-0.1357317715883255,
0.004851458594202995,
-0.031460314989089966,
0.20736658573150635,
-0.02735580876469612,
0.048625171184539795,
-0.03436467796564102,
0.05695299431681633,
0.17431092262268066,
0.025023354217410088,
-0.0362938828766346,
0.09711796790361404,
0.0909133031964302,
-0.026511123403906822,
0.04746447131037712,
-0.042136888951063156,
0.03433571010828018,
-0.11563483625650406,
-0.12024535983800888,
-0.004649680107831955,
-0.021746428683400154,
0.07708378881216049,
-0.005525555927306414,
0.04094882681965828,
0.03745725005865097,
-0.041828885674476624,
0.03447731211781502,
-0.04410559684038162,
-0.02255067229270935,
-0.006366884335875511,
0.08794701099395752,
-0.06531484425067902,
-0.02502988465130329,
0.18946173787117004,
-0.011001593433320522,
-0.1364867389202118,
-0.055282462388277054,
0.06398722529411316,
0.1247754767537117,
0.22039039433002472,
0.0004742841701954603,
0.048332687467336655,
0.14760099351406097,
-0.0545240193605423,
-0.24114421010017395,
-0.07655691355466843,
-0.07863577455282211,
0.04482432082295418,
0.024705197662115097,
-0.11467140913009644,
-0.10343655943870544,
-0.054358016699552536,
0.01551873330026865,
0.2370358258485794,
-0.08709347993135452,
-0.033239930868148804,
0.03763226792216301,
0.05690675601363182,
0.30770617723464966,
-0.09568102657794952,
0.023017505183815956,
-0.08030888438224792,
-0.1014033704996109,
0.03214370086789131,
0.09190690517425537,
0.12911579012870789,
-0.01955736055970192,
0.053577471524477005,
-0.0012533909175544977,
-0.052676960825920105,
0.13800176978111267,
-0.21633745729923248,
0.022947438061237335,
-0.1338767111301422,
-0.0544934943318367,
0.02754947543144226,
-0.057089321315288544,
0.10490705817937851,
-0.0025511744897812605,
-0.01890501379966736,
0.019622137770056725,
-0.02081991545855999,
-0.05139271542429924,
0.10272565484046936,
0.0575515478849411,
-0.027528813108801842,
-0.20490729808807373,
0.03207532688975334,
-0.04072190076112747,
0.09341210126876831,
0.11127611249685287,
-0.05921114608645439,
0.09185454249382019,
0.05525193363428116,
-0.027565494179725647,
-0.10259529203176498,
0.03151516243815422,
-0.04582580178976059,
-0.02562890201807022,
0.0829838216304779,
-0.08102131634950638,
0.03756144270300865,
-0.011534905061125755,
-0.05948998034000397,
0.11923594027757645,
0.048791445791721344,
-0.19332203269004822,
0.10453562438488007,
0.18373805284500122,
-0.0698205977678299,
-0.128929004073143,
-0.024729840457439423,
0.027256345376372337,
0.039517953991889954,
-0.012695604003965855,
0.11748471856117249,
0.010319934226572514,
0.003838817123323679,
-0.04176446422934532,
0.05228230729699135,
-0.10538075119256973,
-0.13240133225917816,
0.049445562064647675,
-0.0024790812749415636,
-0.05278082191944122,
0.05442299321293831,
-0.037733275443315506,
-0.11258531361818314,
-0.056627701967954636,
0.10411486029624939,
-0.0032870087306946516,
-0.0741710513830185,
-0.039988115429878235,
0.23581942915916443,
-0.09577111899852753,
-0.08492614328861237,
-0.030028948560357094,
0.020509690046310425,
-0.04553057625889778,
0.05067755654454231,
-0.023355579003691673,
0.06349874287843704,
0.0625879317522049,
0.05724748224020004,
0.057013221085071564,
-0.0752648115158081,
-0.0700855627655983,
0.03198479861021042,
0.033893734216690063,
0.007302545476704836,
0.021306470036506653,
0.08165092021226883,
-0.07549988478422165,
-0.036559853702783585,
-0.14120234549045563,
0.014209778979420662,
-0.06881589442491531,
-0.07575829327106476,
-0.09956081956624985,
-0.05959644913673401,
0.07550594955682755,
-0.02033841237425804,
-0.04329115152359009,
0.021369166672229767,
-0.0922461748123169,
-0.01977497898042202,
0.00366747728548944,
0.011579028330743313,
-0.029382824897766113,
0.02504011243581772,
0.12445013970136642,
-0.026690253987908363,
0.03545890748500824,
-0.004720973316580057,
-0.029973214492201805,
0.11622902750968933,
-0.15874484181404114,
-0.1148124411702156,
0.08714296668767929,
-0.0413648895919323,
-0.021315328776836395,
0.07937147468328476,
-0.035044826567173004,
0.02699742652475834,
0.07033395022153854,
0.002062438055872917,
0.05890129134058952,
-0.010201741009950638,
0.02200177311897278,
-0.053945768624544144,
-0.14830382168293,
-0.011968784034252167,
0.04968002811074257,
0.11497624218463898,
0.06989243626594543,
-0.011605347506701946,
-0.0514153428375721,
0.0330684520304203,
-0.03769010677933693,
0.028965337201952934,
0.009286357089877129,
-0.057524871081113815,
0.2719792127609253,
-0.11475501954555511,
-0.017425067722797394,
0.029682394117116928,
0.17617762088775635,
0.012889144942164421,
-0.06442558020353317,
-0.026214102283120155,
0.0685061439871788,
-0.07632718980312347,
-0.008876395411789417,
0.11818373203277588,
-0.09153776615858078,
0.001977005274966359,
-0.08045364916324615,
0.07987724989652634,
0.0004473552107810974,
0.008424993604421616,
0.028053484857082367,
0.0682704746723175,
-0.0643463283777237,
0.15830111503601074,
0.023471826687455177,
-0.007512514945119619,
0.02016284130513668,
-0.22382426261901855,
0.11414951086044312,
0.0015199369518086314,
-0.01613231934607029,
-0.0095675652846694,
0.18688581883907318,
-0.060681164264678955,
0.1062919944524765,
0.126875102519989,
-0.04666416719555855,
-0.005246295593678951,
-0.005214662756770849,
-0.04588054493069649,
-0.13763053715229034,
0.0605391301214695,
0.0009794944198802114,
0.04838889464735985,
0.19775724411010742,
0.0005533043877221644,
-0.06069033592939377,
0.06452038884162903,
0.00794145092368126,
-0.1239958181977272,
0.012570296414196491,
-0.004203935153782368,
-0.026459645479917526,
0.0724216178059578,
-0.043040819466114044,
-0.000782395713031292,
0.07175274938344955,
-0.012692862190306187,
0.043369174003601074,
0.023086577653884888,
0.06835763901472092,
-0.1198931410908699,
-0.0820176973938942,
-0.01974329724907875,
0.00233274232596159,
-0.06009449437260628,
0.17784550786018372,
-0.009874946437776089,
0.006733755115419626,
-0.018185844644904137,
-0.05571817606687546,
-0.018520835787057877,
-0.015595306642353535,
-0.14329664409160614,
-0.03012830950319767,
-0.15611040592193604,
0.06574942171573639,
-0.1367587298154831,
0.016299525275826454,
-0.06872286647558212,
0.21272017061710358,
0.25291183590888977,
0.009757851250469685,
-0.03969976305961609,
0.0016138034407049417,
0.018915537744760513,
0.03279413282871246,
0.13937486708164215,
-0.00014139307313598692,
0.11415008455514908,
0.014620785601437092,
0.07800451666116714,
-0.07492956519126892,
-0.05362776666879654,
-0.03247370943427086,
-0.03334333002567291,
0.043356653302907944,
-0.0357513427734375,
-0.061208903789520264,
0.1650915890932083,
-0.09508763253688812,
-0.08439214527606964,
0.2282778024673462,
-0.08376721292734146,
0.006778144743293524,
0.010797670111060143,
0.17838901281356812,
0.005192631855607033,
0.09230823814868927,
-0.08150181174278259,
0.10775434970855713,
0.04886336997151375,
0.025605564936995506,
-0.22867529094219208,
-0.1222609356045723,
-0.02150047942996025,
-0.1623130738735199,
0.0869736298918724,
0.03151218965649605,
-0.08777359873056412,
0.03384655714035034,
-0.045377347618341446,
0.03182319179177284,
0.006240979768335819,
-0.057223279029130936,
-0.015882786363363266,
-0.02175012230873108,
0.16933056712150574,
-0.03775695711374283,
-0.10211928188800812,
-0.02975272201001644,
-0.03876698389649391,
0.03725377470254898,
0.03429463133215904,
-0.044258058071136475,
-0.009539847262203693,
0.0616634376347065,
-0.14047743380069733,
0.05100725591182709,
0.05604579672217369,
0.0640282854437828,
-0.019829323515295982,
-0.02036893181502819,
-0.005413778126239777,
0.06769605726003647,
-0.07557620853185654,
-0.116181880235672,
-0.04994986578822136,
-0.178606316447258,
0.0044677043333649635,
-0.06696049869060516,
-0.09450657665729523,
0.03547041490674019,
-0.12044358998537064,
-0.006242895964533091,
-0.018205314874649048,
0.10933048278093338,
0.08553539961576462,
-0.03956463187932968,
0.03537415340542793,
-0.22872227430343628,
0.12983106076717377,
0.04697839543223381,
-0.03242180496454239,
-0.06864799559116364
] |
null | null | null |
# **Q-Learning** Agent playing **FrozenLake-v1**
  This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
model = load_from_hub(repo_id="Ostfriese/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
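To sanity-check the reported mean reward, the greedy policy can be evaluated over a batch of episodes. A minimal sketch, assuming a `qtable` key in the loaded dict (as in the course template) and the gym >= 0.26 API:

```python
import numpy as np

def evaluate_policy(env, qtable, n_episodes=100):
    """Average return of the greedy policy over n_episodes."""
    returns = []
    for _ in range(n_episodes):
        state, info = env.reset()
        done, total = False, 0.0
        while not done:
            action = int(np.argmax(qtable[state]))
            state, reward, terminated, truncated, info = env.step(action)
            total += reward
            done = terminated or truncated
        returns.append(total)
    return float(np.mean(returns))

print(evaluate_policy(env, np.array(model["qtable"])))  # ~1.00 on the deterministic 4x4 map
```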
| {"tags": ["FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-FrozenLake-v1-4x4-noSlippery", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "FrozenLake-v1-4x4-no_slippery", "type": "FrozenLake-v1-4x4-no_slippery"}, "metrics": [{"type": "mean_reward", "value": "1.00 +/- 0.00", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | Ostfriese/q-FrozenLake-v1-4x4-noSlippery | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T12:44:58+00:00 | [] | [] | TAGS
#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing FrozenLake-v1
 This is a trained model of a Q-Learning agent playing FrozenLake-v1.
## Usage
| [
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
"TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
40,
39
] | [
"passage: TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1 .\n\n ## Usage"
] | [
0.04578453302383423,
-0.08074592798948288,
-0.00430759321898222,
0.10720831900835037,
0.05034215748310089,
-0.040469273924827576,
0.11997015029191971,
0.018999949097633362,
0.20601962506771088,
-0.010012076236307621,
0.1455274522304535,
0.007022971753031015,
-0.006192410364747047,
0.1867983490228653,
0.04572829231619835,
-0.26324528455734253,
0.01831899583339691,
-0.09495259821414948,
-0.07281816750764847,
0.11870454251766205,
0.05470194295048714,
-0.01901467889547348,
-0.0007633853238075972,
0.056141503155231476,
-0.0673527717590332,
0.0007737681735306978,
0.031996939331293106,
-0.012976245954632759,
0.19804789125919342,
-0.02254498563706875,
0.06641989201307297,
0.054705578833818436,
0.0758768692612648,
-0.1998077929019928,
0.0358855277299881,
-0.04215473681688309,
-0.09439758956432343,
-0.03934839740395546,
-0.018780618906021118,
0.05878105387091637,
0.053356342017650604,
0.03858819976449013,
0.058354366570711136,
0.09384993463754654,
-0.0773480236530304,
0.04328357055783272,
0.04280758649110794,
0.024811049923300743,
0.04589218273758888,
-0.0237203948199749,
-0.027002155780792236,
0.08246652781963348,
-0.22182892262935638,
0.10318073630332947,
-0.010159241035580635,
-0.5270710587501526,
-0.00633762264624238,
0.24088262021541595,
0.11517096310853958,
0.05707438662648201,
-0.06903956830501556,
0.10566288232803345,
0.03913382440805435,
-0.007209456991404295,
0.03210983797907829,
0.02150118350982666,
0.12817370891571045,
0.06009242683649063,
-0.09581366181373596,
0.040699947625398636,
0.13722525537014008,
0.012822695076465607,
0.020306183025240898,
-0.08888901025056839,
0.0410032719373703,
-0.03461858257651329,
-0.007679527159780264,
-0.09758518636226654,
0.05478060990571976,
0.012466507963836193,
-0.0934976264834404,
-0.09247440844774246,
-0.04236573353409767,
-0.06708304584026337,
0.11252415925264359,
0.046419668942689896,
-0.0874939113855362,
0.03884070739150047,
-0.06760413944721222,
0.05918780341744423,
-0.16863860189914703,
0.02074250765144825,
-0.06627868115901947,
-0.09376336634159088,
-0.11799788475036621,
-0.01683047041296959,
-0.07946427166461945,
0.009092256426811218,
0.056664444506168365,
0.1447116881608963,
0.22076484560966492,
0.06690320372581482,
0.09728849679231644,
0.07456006109714508,
0.06531001627445221,
0.1538129299879074,
0.10918238013982773,
0.019075315445661545,
-0.015266558155417442,
0.0948706716299057,
-0.06445580720901489,
-0.1351388692855835,
-0.15579092502593994,
0.005488025024533272,
0.0983937531709671,
0.08871900290250778,
-0.044080477207899094,
-0.006702381651848555,
-0.024641724303364754,
0.08566431701183319,
-0.11314457654953003,
-0.024612564593553543,
-0.002267979085445404,
0.06882024556398392,
-0.024801667779684067,
0.020378148183226585,
-0.06242705136537552,
0.12715265154838562,
0.04222423583269119,
-0.059924717992544174,
-0.055308472365140915,
-0.03053177334368229,
-0.014276440255343914,
-0.027539284899830818,
0.02446848154067993,
-0.07659092545509338,
0.04767750948667526,
-0.16766095161437988,
-0.042871296405792236,
-0.04784649610519409,
0.025697942823171616,
-0.03907240927219391,
-0.13557587563991547,
-0.17699143290519714,
-0.048906855285167694,
-0.022438718006014824,
0.03549358621239662,
-0.038111843168735504,
0.006551501806825399,
-0.006318534724414349,
-0.1583600640296936,
0.09783563017845154,
0.09784027189016342,
-0.03643378987908363,
-0.02749447710812092,
0.056263517588377,
-0.07194498926401138,
0.1561182290315628,
-0.21054518222808838,
-0.054014235734939575,
-0.044764336198568344,
-0.06595750898122787,
0.19673264026641846,
0.012690845876932144,
-0.01202624011784792,
0.19873127341270447,
-0.29073721170425415,
-0.06078760325908661,
0.12533614039421082,
-0.07834373414516449,
-0.0936407670378685,
0.06941844522953033,
-0.04206686094403267,
0.023345354944467545,
0.046047765761613846,
0.36345911026000977,
-0.02069227211177349,
-0.16197136044502258,
-0.021782705560326576,
0.13971707224845886,
-0.1184760183095932,
0.059895481914281845,
0.04240793362259865,
0.12543781101703644,
-0.04250509291887283,
-0.018672896549105644,
-0.09023164212703705,
0.05999075248837471,
-0.05241934582591057,
-0.09016361832618713,
-0.03393383324146271,
-0.07645075023174286,
0.13294468820095062,
-0.0629684180021286,
0.05601520463824272,
-0.03255095332860947,
-0.07133250683546066,
-0.050324998795986176,
-0.016492370516061783,
0.04460815340280533,
0.05951254442334175,
-0.12794871628284454,
0.11029167473316193,
0.13025271892547607,
-0.0006193425506353378,
-0.07498852163553238,
-0.17872096598148346,
0.003240168560296297,
0.009576505981385708,
0.039837226271629333,
0.17141658067703247,
0.12209978699684143,
0.033295199275016785,
0.008770671673119068,
-0.06389404833316803,
-0.18276847898960114,
0.058129217475652695,
-0.056212130934000015,
-0.14230976998806,
-0.052409034222364426,
-0.0728459507226944,
0.017381802201271057,
-0.0859743058681488,
-0.017379917204380035,
0.021926190704107285,
0.006908397190272808,
0.02990424446761608,
-0.026645656675100327,
-0.049561817198991776,
0.021254703402519226,
0.06490101665258408,
-0.0037617047782987356,
0.12023693323135376,
0.008277264423668385,
-0.18308481574058533,
0.07930773496627808,
0.08478537946939468,
0.09196605533361435,
0.013250201940536499,
0.02685922384262085,
-0.021522263064980507,
-0.08061408251523972,
-0.054420311003923416,
0.02957955375313759,
0.11417073011398315,
0.1317172348499298,
0.2361993044614792,
0.08753683418035507,
0.04697408527135849,
-0.02164587564766407,
-0.016415923833847046,
0.002810494042932987,
-0.06318057328462601,
-0.029935607686638832,
0.10614971816539764,
0.05865858122706413,
-0.067733034491539,
-0.04576427489519119,
0.09590928256511688,
0.02732124738395214,
0.21205885708332062,
-0.03342745825648308,
0.01286078616976738,
-0.10957037657499313,
-0.06550975888967514,
-0.031982194632291794,
0.09201868623495102,
0.09498392790555954,
0.009755023755133152,
-0.022056059911847115,
-0.04259001836180687,
0.0012916827108711004,
-0.1334889680147171,
-0.10375088453292847,
0.026475343853235245,
0.013400445692241192,
-0.11206940561532974,
0.11674030870199203,
-0.11352457851171494,
0.039504457265138626,
0.06024791672825813,
-0.13837239146232605,
0.04428480193018913,
-0.029713207855820656,
-0.07886212319135666,
0.16866780817508698,
-0.11075661331415176,
-0.094340018928051,
-0.08831550180912018,
0.004082420375198126,
0.0075836325995624065,
-0.03922267258167267,
-0.009283260442316532,
-0.19952571392059326,
-0.005375816952437162,
-0.03544965013861656,
0.013616434298455715,
-0.06988783925771713,
-0.11287739872932434,
-0.010957922786474228,
0.07084179669618607,
-0.043388739228248596,
-0.07803605496883392,
0.007967432029545307,
-0.08923084288835526,
-0.10623309016227722,
0.028189711272716522,
0.019765101373195648,
-0.022883659228682518,
0.16152891516685486,
0.01816628873348236,
0.05626589432358742,
-0.03298520669341087,
0.30665266513824463,
-0.038163769990205765,
0.08371731638908386,
-0.02993497997522354,
-0.07433546334505081,
0.06130730360746384,
-0.022327827289700508,
0.06086638569831848,
-0.020221687853336334,
-0.02362890914082527,
0.0077952733263373375,
-0.08579335361719131,
-0.18365982174873352,
-0.05417544022202492,
0.03724347800016403,
0.195254847407341,
0.031118987128138542,
0.01910330168902874,
-0.0488768145442009,
-0.010547760874032974,
0.1665220558643341,
-0.10005921125411987,
0.04030545800924301,
-0.05366240441799164,
0.11506262421607971,
-0.08640182018280029,
0.06195629760622978,
0.020486772060394287,
0.04266135022044182,
-0.04877188801765442,
0.09486009180545807,
0.0826394334435463,
0.1121082529425621,
-0.02206910029053688,
0.046257395297288895,
0.019012698903679848,
0.07383184134960175,
0.11073657125234604,
0.0368414968252182,
-0.0729052945971489,
0.001982470043003559,
-0.006313489284366369,
-0.039427030831575394,
0.11933320760726929,
0.17963355779647827,
-0.11991413682699203,
-0.05106910318136215,
0.27167606353759766,
0.0031242913100868464,
0.19481229782104492,
-0.01315275114029646,
0.043591804802417755,
-0.04484925419092178,
0.04572054371237755,
-0.05338600277900696,
-0.04086209088563919,
0.2094656229019165,
0.08045925945043564,
-0.17165091633796692,
-0.08549032360315323,
-0.05912299454212189,
0.07081323862075806,
0.10728751868009567,
0.0013539529172703624,
-0.04156802222132683,
0.0004610282776411623,
0.0014198932331055403,
0.08339415490627289,
-0.14520122110843658,
0.11816094070672989,
-0.03172019124031067,
0.05612684786319733,
0.017555562779307365,
-0.045326150953769684,
0.04264266416430473,
0.07474290579557419,
0.26618310809135437,
0.0904107540845871,
-0.040318213403224945,
-0.0892091691493988,
-0.12260187417268753,
0.010461576282978058,
0.029102616012096405,
-0.03534553572535515,
0.0037547778338193893,
-0.020087555050849915,
0.0318896509706974,
0.008264793083071709,
0.016230624169111252,
-0.08987458795309067,
-0.03175399824976921,
-0.027736429125070572,
-0.023839212954044342,
0.10733365267515182,
-0.09495144337415695,
-0.1444292515516281,
-0.15713949501514435,
0.04191131144762039,
-0.0766405463218689,
-0.056593164801597595,
-0.054507751017808914,
-0.05239389091730118,
-0.0311186034232378,
-0.03773957118391991,
0.09099467098712921,
-0.0021037792321294546,
0.14807306230068207,
-0.1920108050107956,
-0.04220759496092796,
0.051812779158353806,
-0.07607918977737427,
-0.08729588985443115,
0.03410962224006653,
0.12136995792388916,
0.05116051807999611,
0.11504370719194412,
0.013609255664050579,
0.09567681699991226,
0.0045484392903745174,
-0.06713183224201202,
0.15302421152591705,
-0.14069625735282898,
-0.27875974774360657,
-0.03836318850517273,
0.016946332529187202,
0.1615200787782669,
-0.05613167956471443,
0.031766023486852646,
0.3335736393928528,
0.27782970666885376,
-0.1428707242012024,
0.25916144251823425,
0.019178593531250954,
0.004398873541504145,
-0.19130495190620422,
-0.10125631093978882,
0.025324683636426926,
0.04740457236766815,
0.12032642960548401,
-0.14564448595046997,
-0.010732659138739109,
-0.04543145373463631,
-0.025908485054969788,
0.10386138409376144,
-0.12300799041986465,
-0.07263197749853134,
0.07765276730060577,
0.039809420704841614,
0.1808302253484726,
0.03932500258088112,
0.0014799144119024277,
0.13626977801322937,
0.06612244248390198,
0.019124457612633705,
0.05216038227081299,
0.08028066903352737,
-0.018944554030895233,
0.14207926392555237,
0.05448179319500923,
-0.02551644667983055,
0.052681710571050644,
-0.0054580713622272015,
-0.03219012916088104,
0.015605825930833817,
-0.183198019862175,
-0.10147556662559509,
-0.0561356320977211,
-0.10798973590135574,
-0.04978342354297638,
0.056853994727134705,
-0.12395523488521576,
-0.007896827533841133,
-0.03841273859143257,
0.03718273714184761,
-0.07831971347332001,
-0.09360362589359283,
-0.036494381725788116,
0.1351792961359024,
0.07210618257522583,
0.04471297934651375,
0.035655103623867035,
-0.07390819489955902,
0.07097936421632767,
0.21671734750270844,
0.08159157633781433,
0.028919655829668045,
-0.19545674324035645,
-0.024042490869760513,
-0.0803457647562027,
0.06306298077106476,
-0.08856996893882751,
-0.016788700595498085,
0.11923003196716309,
0.08616556972265244,
0.05413002520799637,
0.09640096127986908,
-0.045083072036504745,
0.021686913445591927,
0.02684609219431877,
-0.15131035447120667,
-0.18501274287700653,
-0.08534606546163559,
-0.03519878163933754,
0.11561143398284912,
-0.06398691236972809,
0.10897188633680344,
-0.13615410029888153,
0.010051886551082134,
-0.006060056854039431,
0.02693452313542366,
-0.03596206381917,
-0.11251141875982285,
0.15348562598228455,
0.11999429017305374,
-0.06767056882381439,
0.03127254918217659,
-0.09527092427015305,
-0.04423454403877258,
0.12686803936958313,
-0.013623855076730251,
-0.0371493324637413,
-0.054547641426324844,
-0.03628576174378395,
0.15247689187526703,
-0.03436964750289917,
0.008244883269071579,
-0.041229065507650375,
-0.18217355012893677,
0.0798322781920433,
0.09045056998729706,
0.019827889278531075,
-0.031874191015958786,
-0.09797266125679016,
-0.010231015272438526,
-0.0011165260802954435,
0.11730700731277466,
-0.10696814209222794,
-0.10933240503072739,
-0.15144047141075134,
0.06713984161615372,
-0.0007159380475059152,
0.18502596020698547,
-0.06394898891448975,
-0.08904669433832169,
-0.12429379671812057,
0.02344517596065998,
-0.0027384376153349876,
-0.042264558374881744,
0.01618490368127823,
0.07992301136255264,
-0.04095321521162987,
0.02075677551329136,
-0.06651144474744797,
0.06372585147619247,
-0.11786920577287674,
0.09625071287155151,
0.01063506118953228,
0.016993753612041473,
-0.0417880080640316,
-0.01618220843374729,
0.039470795542001724,
-0.057925306260585785,
0.07921463251113892,
0.011758086271584034,
0.0010938759660348296,
0.10196787863969803,
-0.0034960443153977394,
0.06409632414579391,
-0.05372481048107147,
-0.023290161043405533,
0.06578411161899567,
-0.05874887853860855,
-0.03370826691389084,
-0.1573946475982666,
-0.0709633082151413,
0.020051732659339905,
-0.04775108024477959,
0.002077929675579071,
0.03673801198601723,
0.062159497290849686,
-0.06937079131603241,
-0.12125655263662338,
-0.043812792748212814,
-0.028638383373618126,
0.021301284432411194,
0.10829301923513412,
-0.07526551932096481,
0.1547859013080597,
-0.052787959575653076,
-0.00020603960729204118,
0.07437096536159515,
0.04048224538564682,
0.01393822580575943,
-0.10422444343566895,
-0.04698587954044342,
-0.11035211384296417,
0.1502903699874878,
-0.007902312092483044,
-0.03533121198415756,
0.03719403222203255,
-0.11946307867765427,
-0.1572723090648651,
0.03418220207095146,
0.10199101269245148,
0.0448341928422451,
0.025807438418269157,
0.027079269289970398,
-0.04042419046163559,
-0.021270349621772766,
-0.07034418731927872,
0.0882953479886055,
-0.12085357308387756,
-0.09669415652751923,
0.09555385261774063,
0.12178351730108261,
-0.0036850625183433294,
-0.07441367954015732,
0.11554073542356491,
-0.021787192672491074,
0.05525410920381546,
-0.02971339225769043,
0.10308072715997696,
0.0796005055308342,
-0.12273547053337097,
0.005693064536899328,
-0.036891788244247437,
-0.0741485133767128,
-0.12975730001926422,
0.019545545801520348,
-0.061916105449199677,
-0.13383042812347412,
0.12179028987884521,
-0.09376577287912369,
0.030037038028240204,
-0.10506992787122726,
0.021338803693652153,
0.01864001713693142,
0.061665527522563934,
-0.10988292098045349,
0.08575301617383957,
0.13424484431743622,
-0.043199893087148666,
-0.07184189558029175,
-0.12455986440181732,
-0.05022053420543671,
-0.04231856390833855,
-0.13957437872886658,
-0.11600435525178909,
0.0100301094353199,
-0.023418782278895378,
-0.05818291753530502,
0.0015462689334526658,
-0.03659068048000336,
0.008594646118581295,
0.021907730028033257,
0.04032021388411522,
-0.02693161368370056,
0.05134565755724907,
-0.057569269090890884,
-0.052510857582092285,
0.11489357799291611,
0.04113486409187317,
-0.03561042994260788,
-0.052359987050294876,
0.12997733056545258,
-0.11959461867809296,
0.07662346214056015,
-0.020313527435064316,
0.017129231244325638,
-0.06435854732990265,
0.17131924629211426,
0.11673715710639954,
-0.1367570012807846,
-0.005008010193705559,
-0.08210669457912445,
0.020409544929862022,
0.023555370047688484,
0.13693512976169586,
-0.03411718085408211,
-0.0012358218664303422,
-0.1580323874950409,
0.018575575202703476,
-0.18557456135749817,
-0.03716109320521355,
0.04671547934412956,
0.09917585551738739,
0.15293832123279572,
-0.0034432117827236652,
-0.1263325810432434,
0.10424192249774933,
-0.2118520885705948,
0.0907607227563858,
0.05121984705328941,
-0.11874113976955414,
-0.06765396893024445,
-0.06795281916856766,
0.1198519766330719,
0.009196433238685131,
0.2040700763463974,
-0.013615905307233334,
-0.09132910519838333,
-0.07060808688402176,
-0.01980910450220108,
-0.030524181202054024,
0.09714830666780472,
0.041414931416511536,
0.04653804749250412,
0.12821412086486816,
0.00368314771912992,
0.07533777505159378,
0.060310911387205124,
0.02759413793683052,
-0.012300663627684116,
0.04076618701219559,
0.08261215686798096,
-0.14588621258735657,
-0.1659701019525528,
0.1326720416545868,
0.025149408727884293,
0.11792458593845367,
0.03658788278698921,
-0.1549617499113083,
0.06687124073505402,
0.2523096203804016,
-0.11147607117891312,
0.02505038119852543,
0.12737524509429932,
-0.0366884209215641,
0.0672016367316246,
0.1144871786236763,
-0.02633814327418804,
-0.05217865854501724,
-0.011363590136170387,
0.10233135521411896,
0.028660254552960396,
-0.04646271467208862,
-0.02340836264193058,
-0.03373933956027031,
-0.019070526584982872,
-0.011738128960132599,
-0.0909019410610199,
-0.1543993502855301,
-0.10471053421497345,
-0.16619662940502167,
0.04399140924215317,
-0.04626438021659851,
0.13418889045715332,
0.09469578415155411,
-0.012723101302981377,
0.04568437114357948,
0.028575526550412178,
0.07275456190109253,
0.07916246354579926,
-0.02939477376639843,
-0.036159269511699677
] |
null | null | null |
# **Q-Learning** Agent playing **Taxi-v3**
  This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="HannoRE/q-Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
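Building on the snippet above, a single episode can be watched step by step in text mode. A minimal sketch, assuming a `qtable` key in the loaded dict and gym >= 0.26 (for `render_mode="ansi"`):

```python
import numpy as np

env = gym.make(model["env_id"], render_mode="ansi")  # text rendering; requires gym >= 0.26
qtable = np.array(model["qtable"])                   # assumed key, as in the course template

state, info = env.reset()
done, total = False, 0.0
while not done:
    print(env.render())                              # print the current taxi grid as text
    action = int(np.argmax(qtable[state]))
    state, reward, terminated, truncated, info = env.step(action)
    total += reward
    done = terminated or truncated
print("episode return:", total)
```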
| {"tags": ["Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-Taxi-v3", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Taxi-v3", "type": "Taxi-v3"}, "metrics": [{"type": "mean_reward", "value": "7.50 +/- 2.79", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | HannoRE/q-Taxi-v3 | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T12:47:50+00:00 | [] | [] | TAGS
#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing Taxi-v3
 This is a trained model of a Q-Learning agent playing Taxi-v3.
## Usage
| [
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
"TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
32,
33
] | [
"passage: TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
0.048862796276807785,
-0.16549694538116455,
-0.005485367961227894,
0.02960980497300625,
0.1345081776380539,
-0.01784728653728962,
0.11895976960659027,
0.07759871333837509,
-0.07461097836494446,
-0.055395450443029404,
0.1418241262435913,
0.09088201075792313,
0.055222880095243454,
0.05699880048632622,
0.09511256217956543,
-0.27440664172172546,
0.048217080533504486,
-0.02918700873851776,
0.05621987581253052,
0.11878681182861328,
0.0670095682144165,
-0.040441032499074936,
0.061956584453582764,
0.11818158626556396,
-0.1018151044845581,
-0.007344264071434736,
0.035402704030275345,
-0.09440053254365921,
0.17413531243801117,
0.07204403728246689,
0.12337774783372879,
0.05132639780640602,
0.179361954331398,
-0.12762396037578583,
0.024310702458024025,
-0.0010275895474478602,
-0.10138072073459625,
-0.03909514099359512,
-0.012415820732712746,
-0.08349097520112991,
0.03230205550789833,
0.23522862792015076,
0.07199250161647797,
0.06632792949676514,
-0.17707863450050354,
-0.06584878265857697,
-0.04375573247671127,
0.069611094892025,
0.14951466023921967,
0.03758616745471954,
-0.033800311386585236,
0.1684885323047638,
-0.2564343810081482,
0.05066783353686333,
0.037275806069374084,
-0.42313119769096375,
0.017119819298386574,
0.1507398933172226,
0.15090937912464142,
0.06909667700529099,
-0.10573802888393402,
0.013512322679162025,
0.051325585693120956,
-0.0005318621988408267,
0.024325110018253326,
0.006554204970598221,
0.15601307153701782,
0.08537693321704865,
-0.1487821787595749,
-0.058576688170433044,
0.17441977560520172,
-0.03788546845316887,
-0.02613203600049019,
-0.039745692163705826,
0.0067160045728087425,
-0.06427708268165588,
-0.004067842848598957,
-0.1777995079755783,
0.00734262028709054,
0.06666424125432968,
-0.014348524622619152,
0.014901017770171165,
-0.035522811114788055,
-0.0966939702630043,
-0.023098144680261612,
-0.08592145889997482,
0.01677769608795643,
-0.006319406442344189,
-0.10187895596027374,
0.05002119392156601,
-0.061138734221458435,
0.0014382408699020743,
-0.05123179033398628,
-0.15047866106033325,
-0.049055423587560654,
-0.03481535613536835,
0.1474713832139969,
-0.0044205985032022,
-0.01873963139951229,
-0.03164304047822952,
0.15474793314933777,
0.049551334232091904,
-0.05370146036148071,
0.05625450983643532,
0.07605006545782089,
0.23867930471897125,
0.10401605814695358,
0.10196955502033234,
-0.06798075139522552,
0.10180158913135529,
-0.12330973148345947,
-0.08915644884109497,
-0.17508824169635773,
0.11820860952138901,
0.00015364694991149008,
0.1317785084247589,
-0.12023144960403442,
0.07898581773042679,
-0.067511186003685,
0.013453764840960503,
0.01636839471757412,
0.0820009782910347,
-0.012399360537528992,
0.10676060616970062,
-0.005061192903667688,
-0.06941985338926315,
0.014177112840116024,
0.05935845896601677,
0.03754841163754463,
-0.038601722568273544,
-0.03192409873008728,
-0.05762290954589844,
-0.05065649375319481,
-0.10128600150346756,
-0.06447898596525192,
0.018573462963104248,
-0.007677143905311823,
-0.1833900660276413,
-0.06407523155212402,
0.00897200871258974,
0.015712225809693336,
-0.03988850116729736,
-0.05148044601082802,
-0.15265507996082306,
-0.042461175471544266,
-0.015450406819581985,
-0.03500641882419586,
-0.06214277446269989,
-0.0383245050907135,
0.046435944736003876,
-0.07560601085424423,
0.013364278711378574,
0.023342855274677277,
0.05405820533633232,
-0.025881100445985794,
0.06068144738674164,
-0.08357544988393784,
0.09493788331747055,
-0.1540430635213852,
-0.03271956741809845,
-0.025445878505706787,
-0.041183918714523315,
0.1752462536096573,
0.06099751964211464,
-0.015994304791092873,
0.15260063111782074,
-0.17141541838645935,
-0.058121129870414734,
0.15596486628055573,
0.008629098534584045,
-0.09967197477817535,
-0.003560945624485612,
-0.09397093951702118,
0.1428760588169098,
0.08571921288967133,
0.2478504776954651,
0.12005335837602615,
-0.22748184204101562,
0.055358242243528366,
0.12515293061733246,
-0.14365963637828827,
0.10365243256092072,
0.07344598323106766,
0.005470725707709789,
-0.18886831402778625,
-0.06843198090791702,
-0.06121627986431122,
0.1053021252155304,
-0.08522345870733261,
-0.0776243582367897,
0.09323626756668091,
-0.05086790770292282,
0.24641476571559906,
-0.028281206265091896,
0.06174173951148987,
-0.026681531220674515,
-0.1389324963092804,
-0.01723906397819519,
0.060955192893743515,
0.05258452147245407,
-0.024835573509335518,
-0.25895482301712036,
0.13646544516086578,
0.048650871962308884,
0.025074828416109085,
0.004106190986931324,
-0.05691491439938545,
0.016934165731072426,
0.1511998474597931,
0.020012924447655678,
0.13717477023601532,
0.027723990380764008,
0.0706823319196701,
-0.006239562761038542,
-0.10560829937458038,
-0.04169593006372452,
0.061916545033454895,
-0.08518962562084198,
-0.06641357392072678,
0.011197872459888458,
-0.06935211271047592,
-0.11783787608146667,
-0.12166737765073776,
-0.026334572583436966,
-0.02980303019285202,
-0.07444227486848831,
0.02368103712797165,
0.06536602973937988,
-0.06702698022127151,
-0.0023908785078674555,
0.007125476840883493,
-0.011537045240402222,
0.16434046626091003,
0.011393417604267597,
-0.007796820718795061,
0.1328643560409546,
-0.11533161997795105,
0.12461213022470474,
0.049438029527664185,
-0.024806302040815353,
-0.04662557691335678,
0.0014137453399598598,
-0.057529181241989136,
0.029044216498732567,
-0.04390640929341316,
0.02774495631456375,
0.20111067593097687,
0.02772962674498558,
0.11389166116714478,
-0.0656520202755928,
0.04385066404938698,
-0.007961965166032314,
-0.009693224914371967,
0.018563594669103622,
0.07608018070459366,
0.07813210040330887,
-0.1324140727519989,
0.02262016013264656,
0.22455167770385742,
0.1385764330625534,
0.18313980102539062,
-0.010877152904868126,
0.06325667351484299,
-0.04875868931412697,
0.027505528181791306,
0.024100203067064285,
0.10314226150512695,
-0.10732068121433258,
-0.0322517491877079,
-0.025407759472727776,
0.023599207401275635,
-0.08197105675935745,
-0.1055799350142479,
-0.090115025639534,
0.01222382951527834,
-0.03125503659248352,
-0.15570329129695892,
0.13300658762454987,
-0.10451057553291321,
0.01802753657102585,
0.04692702740430832,
-0.22163605690002441,
0.11530312895774841,
0.014291439205408096,
-0.10303618758916855,
0.11281087249517441,
-0.12051989883184433,
-0.08699832111597061,
-0.05777236074209213,
-0.18658851087093353,
0.05280197039246559,
0.04673841595649719,
0.05166793242096901,
-0.18521739542484283,
0.024835903197526932,
0.05545609071850777,
0.13426995277404785,
-0.09743253141641617,
-0.07142634689807892,
-0.15038461983203888,
0.016068490222096443,
-0.033661190420389175,
-0.16029728949069977,
-0.005609163548797369,
-0.032781440764665604,
-0.18849676847457886,
-0.04539939761161804,
-0.15086813271045685,
-0.034627582877874374,
0.20464378595352173,
0.026907702907919884,
0.09480511397123337,
-0.07926445454359055,
0.3802889585494995,
-0.042039383202791214,
-0.06146497279405594,
-0.01321389526128769,
-0.07072482258081436,
0.02512686513364315,
0.13271741569042206,
0.0036099457647651434,
-0.017886579036712646,
-0.0037857077550143003,
0.0024592927657067776,
-0.06234965845942497,
-0.13400450348854065,
0.0028710351325571537,
0.03905198723077774,
0.1874423623085022,
0.004639793653041124,
0.06659388542175293,
0.03133883699774742,
0.057546284049749374,
0.07748064398765564,
0.030926106497645378,
0.0011591583024710417,
-0.01591806672513485,
0.06604493409395218,
-0.11684755235910416,
0.042466625571250916,
-0.030429253354668617,
-0.10143838077783585,
-0.013183288276195526,
0.07950251549482346,
0.12755028903484344,
0.17849206924438477,
-0.04790908098220825,
0.17489230632781982,
0.13580141961574554,
0.16576050221920013,
0.049315933138132095,
-0.020801831036806107,
-0.08773037046194077,
-0.06118565797805786,
0.004774159751832485,
-0.031952597200870514,
0.04869702458381653,
0.3231290578842163,
0.037619613111019135,
-0.09036035090684891,
0.11149907857179642,
0.009480619803071022,
0.05359881371259689,
0.022797370329499245,
-0.11162138730287552,
0.11170321702957153,
0.07968773692846298,
-0.06341761350631714,
-0.07602835446596146,
0.16758501529693604,
-0.1109386757016182,
-0.26646625995635986,
-0.11410990357398987,
-0.012305386364459991,
0.07903840392827988,
0.005651174578815699,
0.05498376116156578,
-0.11829282343387604,
-0.16034497320652008,
-0.034191906452178955,
0.1335442066192627,
-0.3077351450920105,
0.2065143585205078,
-0.0198091771453619,
0.06707923114299774,
-0.039657969027757645,
-0.07026876509189606,
0.09694647043943405,
0.13174086809158325,
0.29124146699905396,
0.01396956667304039,
0.04841272905468941,
-0.15176129341125488,
-0.0976925864815712,
0.0018439020495861769,
0.015482662245631218,
-0.02563396655023098,
0.028520405292510986,
-0.0540912002325058,
0.008404579944908619,
-0.018086453899741173,
0.2102297693490982,
-0.11316607892513275,
0.004344627261161804,
-0.06968966871500015,
-0.11707738786935806,
0.19409789144992828,
-0.07178345322608948,
-0.04543264955282211,
-0.14959357678890228,
-0.15512511134147644,
-0.004174166824668646,
-0.02413962036371231,
-0.019664527848362923,
-0.17603960633277893,
-0.18804074823856354,
-0.05204557999968529,
-0.005645004566758871,
-0.003464865731075406,
0.05867868289351463,
-0.07517234236001968,
-0.04805335775017738,
0.1009904220700264,
-0.07743175327777863,
-0.056063808500766754,
-0.1103200614452362,
0.1391381323337555,
0.06248528137803078,
0.16743235290050507,
0.05907081440091133,
0.0006117874872870743,
0.11471151560544968,
-0.02913086675107479,
0.11103474348783493,
-0.11291708797216415,
-0.17145049571990967,
-0.08334989100694656,
-0.018775060772895813,
0.09519003331661224,
-0.04789286106824875,
0.0028788831550627947,
0.2550160884857178,
0.14880181849002838,
-0.0897710770368576,
0.27680760622024536,
0.04414956644177437,
-0.09375058114528656,
-0.18432219326496124,
-0.15961645543575287,
0.03759992495179176,
0.060025621205568314,
0.13095876574516296,
-0.057205069810152054,
-0.08483537286520004,
-0.08492398262023926,
-0.07478608191013336,
-0.13140805065631866,
-0.24232175946235657,
-0.030598774552345276,
0.22874866425991058,
0.08656918257474899,
0.08219650387763977,
-0.012482990510761738,
-0.01186054851859808,
0.00526038184762001,
0.02680150233209133,
0.12018456310033798,
-0.13341329991817474,
0.11107480525970459,
0.022198403254151344,
0.044267985969781876,
0.009712530300021172,
0.07929777354001999,
0.03375575691461563,
-0.003218587953597307,
-0.0006439819699153304,
-0.0988350659608841,
-0.2596651017665863,
0.0816885456442833,
-0.01623627357184887,
-0.09960969537496567,
0.014988959766924381,
0.02061903104186058,
-0.2089255303144455,
0.011128270998597145,
-0.019883770495653152,
-0.03150356933474541,
-0.06483490765094757,
-0.10664787143468857,
-0.056551624089479446,
0.04928823933005333,
0.10853826254606247,
0.011660109274089336,
0.05354316532611847,
-0.0404130220413208,
0.07917837053537369,
0.0826287642121315,
0.15132710337638855,
0.06795957684516907,
-0.190711110830307,
-0.10953907668590546,
-0.0414445661008358,
0.12121522426605225,
-0.12505418062210083,
0.036917757242918015,
0.053161121904850006,
-0.016534561291337013,
0.14621229469776154,
0.1070784479379654,
-0.07452095299959183,
0.11915595084428787,
0.08904775977134705,
-0.04094788804650307,
-0.23367151618003845,
-0.07120766490697861,
0.11133213341236115,
0.07195597887039185,
-0.03961895406246185,
0.018120890483260155,
-0.04960581287741661,
-0.013980977237224579,
0.048759616911411285,
-0.0538676381111145,
-0.07230538129806519,
0.004421027842909098,
0.1247575581073761,
0.1029362753033638,
-0.04655474051833153,
0.01296416949480772,
0.037371400743722916,
0.003788623260334134,
0.04730486497282982,
0.0407949760556221,
-0.08269952982664108,
-0.04124005511403084,
0.02782733179628849,
0.37552911043167114,
-0.010165480896830559,
-0.020456433296203613,
0.018555615097284317,
-0.19949445128440857,
0.09135842323303223,
0.13205479085445404,
0.04697350412607193,
0.004247748292982578,
-0.08139242231845856,
0.026877427473664284,
-0.010625290684401989,
0.09936143457889557,
-0.07806670665740967,
-0.05493134260177612,
-0.21631066501140594,
-0.025010565295815468,
0.017490221187472343,
0.24077683687210083,
-0.08458559215068817,
-0.12801732122898102,
-0.20628872513771057,
0.13128381967544556,
-0.11333390325307846,
-0.03695881739258766,
-0.024473199620842934,
0.03926658630371094,
-0.01989821158349514,
0.06291737407445908,
-0.0710630789399147,
0.006373001262545586,
-0.11024709790945053,
0.055267609655857086,
0.04204455390572548,
0.1229788213968277,
0.014207782223820686,
0.02016810141503811,
0.05822525918483734,
-0.01837925612926483,
0.07173580676317215,
-0.06203491613268852,
-0.04550490900874138,
0.14224006235599518,
-0.020255116745829582,
-0.04152837023139,
-0.0483345128595829,
-0.036874305456876755,
0.11981741338968277,
-0.05059147998690605,
-0.007141099311411381,
-0.054929375648498535,
-0.06906463205814362,
0.03462086617946625,
-0.009175732731819153,
-0.008798843249678612,
0.06801853328943253,
0.04024988040328026,
-0.026994358748197556,
0.005263668950647116,
0.03447828069329262,
-0.10330043733119965,
-0.04955084249377251,
0.16955432295799255,
-0.0749620869755745,
0.10274054110050201,
-0.031069839373230934,
0.018015999346971512,
0.005847334861755371,
-0.022399673238396645,
-0.015360680408775806,
-0.1457086056470871,
-0.06137600541114807,
-0.09489979594945908,
0.11565322428941727,
0.08146517723798752,
0.03358805552124977,
0.04274565726518631,
0.019532648846507072,
-0.04414922371506691,
-0.038583990186452866,
0.12961317598819733,
0.08133101463317871,
0.012996876612305641,
0.01137041300535202,
0.01941833831369877,
-0.020302120596170425,
0.0028480992186814547,
-0.01250747125595808,
-0.07239153981208801,
-0.05874783173203468,
0.09400010108947754,
0.1600283533334732,
-0.06127211079001427,
-0.13325586915016174,
-0.020593497902154922,
0.04988488554954529,
0.0014717020094394684,
-0.08777432143688202,
0.04833676666021347,
0.15805292129516602,
-0.05623878911137581,
0.03216489031910896,
-0.09984751045703888,
-0.07263360917568207,
-0.16060975193977356,
-0.10029061883687973,
-0.06092562898993492,
-0.28350353240966797,
0.09752398729324341,
0.006392303854227066,
-0.014731393195688725,
0.059529416263103485,
0.051305368542671204,
-0.052508849650621414,
0.07068239152431488,
-0.18146829307079315,
-0.007054794579744339,
0.03497592359781265,
-0.13212306797504425,
0.02475893869996071,
-0.2378365397453308,
0.10198072344064713,
-0.04623803123831749,
-0.1519704908132553,
-0.04004510119557381,
0.0641569048166275,
-0.09540136158466339,
-0.01822364516556263,
-0.0475153923034668,
-0.01922670193016529,
0.01624443754553795,
-0.009348669089376926,
-0.031147832050919533,
0.13716529309749603,
0.02827494591474533,
-0.03268734738230705,
0.005254602525383234,
0.0223685409873724,
0.03955082967877388,
-0.0969657450914383,
-0.05986930429935455,
0.08311155438423157,
-0.031056145206093788,
0.14728976786136627,
0.000341245875461027,
0.04181376099586487,
-0.06758682429790497,
0.2593761384487152,
0.2023983597755432,
-0.12479214370250702,
0.008118697442114353,
-0.021801479160785675,
0.012670028023421764,
-0.041751839220523834,
0.13110700249671936,
0.013386172242462635,
0.12186761200428009,
-0.17513342201709747,
-0.01036517322063446,
-0.0818324014544487,
-0.04501292482018471,
0.06702108681201935,
0.14714950323104858,
0.15742522478103638,
0.03436789661645889,
-0.07328428328037262,
0.06722653657197952,
-0.30119743943214417,
0.20540550351142883,
-0.1346001923084259,
-0.01498429011553526,
-0.040251150727272034,
-0.058389630168676376,
0.061147745698690414,
0.11309876292943954,
0.10832664370536804,
-0.021150551736354828,
-0.0905047357082367,
-0.04486766457557678,
-0.039378076791763306,
-0.13019338250160217,
-0.02718670479953289,
0.1654091775417328,
0.06799814850091934,
0.31520840525627136,
-0.017577875405550003,
0.07702425122261047,
0.034410297870635986,
0.06451138854026794,
0.004519328009337187,
0.09537279605865479,
0.07960964739322662,
-0.06345855444669724,
-0.07373003661632538,
-0.001637450186535716,
0.05033271387219429,
0.14567798376083374,
-0.03826142102479935,
-0.18691548705101013,
0.15858715772628784,
0.07192251086235046,
-0.13762691617012024,
-0.05777517706155777,
0.08409425616264343,
-0.0739973932504654,
0.0550808347761631,
0.08115427941083908,
0.015876613557338715,
-0.017793258652091026,
-0.004664506763219833,
0.06074233725667,
0.024694660678505898,
-0.02343848906457424,
0.003570882137864828,
-0.08337053656578064,
-0.04151543974876404,
0.07267895340919495,
-0.0844460055232048,
-0.20546193420886993,
-0.0957019031047821,
-0.07551700621843338,
0.030557552352547646,
-0.0649830624461174,
0.12575586140155792,
0.1717868149280548,
0.0593598335981369,
-0.03307248651981354,
-0.10721943527460098,
-0.035562749952077866,
0.07602505385875702,
-0.044773899018764496,
-0.09409699589014053
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
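
Since this card is still a stub, what follows is only a minimal sketch of the standard Transformers feature-extraction workflow for this checkpoint. The repo id is taken from this card's metadata; loading via `AutoModel` and the mean-pooling step are assumptions, not documented behaviour of this model.

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "tommymarto/LernnaviBERT_baseline_students_answers_384_lstm_seq_len_10"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

inputs = tokenizer(["An example student answer."], padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden state into one feature vector per input (assumed pooling choice).
features = outputs.last_hidden_state.mean(dim=1)
print(features.shape)  # (batch_size, hidden_size)
```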
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | feature-extraction | tommymarto/LernnaviBERT_baseline_students_answers_384_lstm_seq_len_10 | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:14+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
39,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.052746038883924484,
0.20255789160728455,
-0.0045078229159116745,
0.0248473659157753,
0.10497838258743286,
0.00675728265196085,
0.06521498411893845,
0.11486967653036118,
-0.0023755673319101334,
0.12028469145298004,
0.027631845325231552,
0.08119397610425949,
0.12110675126314163,
0.15393014252185822,
0.005160121712833643,
-0.24253977835178375,
0.05344875901937485,
-0.09366832673549652,
0.004077504388988018,
0.11452110856771469,
0.1343945860862732,
-0.10780399292707443,
0.08976872265338898,
-0.00683097867295146,
-0.01712046191096306,
-0.015751034021377563,
-0.07134060561656952,
-0.06668227165937424,
0.05541034787893295,
0.07649129629135132,
0.0725555345416069,
0.010986946523189545,
0.07830587029457092,
-0.2806258797645569,
0.014425364322960377,
0.08005264401435852,
0.0010765197221189737,
0.06795802712440491,
0.08151742070913315,
-0.06789936870336533,
0.1251654475927353,
-0.0605485662817955,
0.14059753715991974,
0.07639917731285095,
-0.08928128331899643,
-0.19590547680854797,
-0.06669555604457855,
0.07481247186660767,
0.129872128367424,
0.05026249960064888,
-0.02990107797086239,
0.1371748298406601,
-0.09688840061426163,
0.00786701962351799,
0.12302009761333466,
-0.07360870391130447,
-0.05524582043290138,
0.031063849106431007,
0.10805318504571915,
0.09297362715005875,
-0.11762315034866333,
-0.008467874489724636,
0.029582185670733452,
0.022175652906298637,
0.08627551048994064,
0.015828849747776985,
0.1525639444589615,
0.041341137140989304,
-0.14141254127025604,
-0.0526716373860836,
0.09056255221366882,
0.03701045364141464,
-0.050960201770067215,
-0.23367193341255188,
-0.026245610788464546,
-0.012442239560186863,
-0.03079850971698761,
-0.04234880208969116,
0.053594592958688736,
-0.03630254790186882,
0.07596245408058167,
-0.007196845952421427,
-0.07732249796390533,
-0.031211229041218758,
0.05230424553155899,
0.06785056740045547,
0.018615471199154854,
-0.006994647905230522,
0.019442738965153694,
0.11387838423252106,
0.07708574831485748,
-0.13029205799102783,
-0.07214002311229706,
-0.0739525631070137,
-0.09558356553316116,
-0.04332297295331955,
0.03707554563879967,
0.07106684148311615,
0.04390906170010567,
0.20283061265945435,
-0.017690327018499374,
0.046562306582927704,
0.0476159006357193,
0.005842953454703093,
0.07147589325904846,
0.10925443470478058,
-0.06689215451478958,
-0.14432233572006226,
-0.06022803485393524,
0.08875485509634018,
-0.009834992699325085,
-0.03670760244131088,
-0.049119677394628525,
0.04676154628396034,
0.03209913894534111,
0.11318106204271317,
0.08643888682126999,
-0.003593706525862217,
-0.0628826767206192,
-0.042073074728250504,
0.22331053018569946,
-0.14625342190265656,
0.043256524950265884,
0.007445589639246464,
-0.0429743155837059,
-0.0076383077539503574,
0.005870272871106863,
0.014089803211390972,
-0.03238216042518616,
0.10351061820983887,
-0.0778173878788948,
-0.035906463861465454,
-0.1116463914513588,
-0.06868703663349152,
0.024910317733883858,
0.0025890374090522528,
-0.018393149599432945,
-0.04424213990569115,
-0.11253650486469269,
-0.051282741129398346,
0.0724339634180069,
-0.07579848170280457,
-0.05524555593729019,
0.009976830333471298,
-0.04834962263703346,
0.0031978494953364134,
0.00010397454752819613,
0.11258035898208618,
-0.03314845636487007,
0.025259260088205338,
-0.04850656911730766,
0.06803499162197113,
0.10959596186876297,
0.038730688393116,
-0.0804535374045372,
0.07286878675222397,
-0.22788093984127045,
0.10223092138767242,
-0.09346398711204529,
0.025767935439944267,
-0.14578653872013092,
-0.04199126362800598,
0.02854149229824543,
0.02887420728802681,
-0.010361229069530964,
0.1268649846315384,
-0.1982942521572113,
-0.035082314163446426,
0.15190726518630981,
-0.11336656659841537,
-0.09347330778837204,
0.065653957426548,
-0.05610617995262146,
0.11296144872903824,
0.04835578054189682,
-0.019556574523448944,
0.06953749805688858,
-0.1281629204750061,
-0.04506009817123413,
-0.021473335102200508,
-0.008493004366755486,
0.14857245981693268,
0.06750676780939102,
-0.05737153813242912,
0.07104712724685669,
0.02051553688943386,
-0.037109848111867905,
-0.03301886469125748,
-0.03470754995942116,
-0.09331934154033661,
0.009520708583295345,
-0.07244295626878738,
0.03737799823284149,
-0.02224314957857132,
-0.08870045095682144,
-0.030656753107905388,
-0.17619828879833221,
0.043274905532598495,
0.08050142228603363,
0.008233942091464996,
-0.021131468936800957,
-0.09287237375974655,
0.02556683123111725,
-0.009385489858686924,
-0.021018607541918755,
-0.1641797423362732,
-0.044834475964307785,
0.04416196420788765,
-0.1971662938594818,
0.023802341893315315,
-0.03283040598034859,
0.05093098804354668,
0.03247829154133797,
-0.04019762575626373,
-0.005096070934087038,
0.0028117431793361902,
0.01809627003967762,
-0.026984719559550285,
-0.200385183095932,
-0.031109308823943138,
-0.029154371470212936,
0.1362139731645584,
-0.22226740419864655,
0.028292208909988403,
0.07483648508787155,
0.13521188497543335,
0.0009690870065242052,
-0.04426588490605354,
0.010693409480154514,
-0.05366935580968857,
-0.053671274334192276,
-0.06512755900621414,
-0.007102466654032469,
-0.03287021815776825,
-0.04422381520271301,
0.06460095942020416,
-0.19425635039806366,
-0.03641216829419136,
0.10608077049255371,
0.10164625942707062,
-0.14719000458717346,
-0.028969714418053627,
-0.04096706584095955,
-0.06081128865480423,
-0.09094393998384476,
-0.0630471333861351,
0.14371246099472046,
0.04861542955040932,
0.048413511365652084,
-0.08624191582202911,
-0.0630124881863594,
0.00895135197788477,
0.0006565740332007408,
-0.03649118170142174,
0.08907787501811981,
0.08782777935266495,
-0.10737399011850357,
0.08881597965955734,
0.08605224639177322,
0.06605713814496994,
0.10539878904819489,
0.001256609451957047,
-0.10750970244407654,
-0.029154706746339798,
0.005644100718200207,
0.01547710970044136,
0.14092515408992767,
-0.044270921498537064,
0.04743899777531624,
0.05656488984823227,
-0.027443327009677887,
0.01715722121298313,
-0.10313762724399567,
0.02984124980866909,
0.046840768307447433,
-0.010507673025131226,
0.012429861351847649,
-0.03895113617181778,
0.025837475433945656,
0.08796556293964386,
0.03584056720137596,
0.027896199375391006,
0.0029043578542768955,
-0.03437814116477966,
-0.10392027348279953,
0.17429527640342712,
-0.0878753736615181,
-0.28357240557670593,
-0.1356295943260193,
-0.00747122336179018,
0.05167245492339134,
-0.022715993225574493,
0.013256389647722244,
-0.04903135821223259,
-0.11467588692903519,
-0.10348290205001831,
0.008818334899842739,
0.0437844917178154,
-0.07700283080339432,
-0.07256268709897995,
0.046553414314985275,
0.033613573759794235,
-0.14174877107143402,
0.022300107404589653,
0.048012908548116684,
-0.03855963796377182,
-0.015413837507367134,
0.07170835882425308,
0.10258439928293228,
0.17387451231479645,
-0.004228805657476187,
-0.01945391111075878,
0.023280048742890358,
0.24459126591682434,
-0.14296141266822815,
0.10647262632846832,
0.15432609617710114,
-0.06630013138055801,
0.1025824174284935,
0.19176462292671204,
0.02610800787806511,
-0.07571171224117279,
0.03370760753750801,
0.03715203329920769,
-0.053104497492313385,
-0.23274335265159607,
-0.060641512274742126,
0.0011178229469805956,
-0.06850682199001312,
0.09104112535715103,
0.08915619552135468,
0.11183936148881912,
0.0454646460711956,
-0.08415863662958145,
-0.06847929954528809,
0.019614145159721375,
0.10642454773187637,
-0.03275766968727112,
0.007264797575771809,
0.09054313600063324,
-0.04184457287192345,
-0.005177726969122887,
0.10835286974906921,
0.007426192983984947,
0.1962665617465973,
0.031048519536852837,
0.15333782136440277,
0.07211130857467651,
0.0342402458190918,
0.026680786162614822,
0.025636766105890274,
0.023090654984116554,
0.009547512046992779,
-0.01598707027733326,
-0.08795502036809921,
0.027014199644327164,
0.13500221073627472,
0.07871367782354355,
0.029795078560709953,
0.020392734557390213,
-0.0429922379553318,
0.062152985483407974,
0.15964233875274658,
0.006258485373109579,
-0.2136749029159546,
-0.03950631618499756,
0.08867984265089035,
-0.0793125256896019,
-0.1237078458070755,
-0.02518491819500923,
0.03823186457157135,
-0.1809074580669403,
0.04127289727330208,
-0.01795332506299019,
0.11453432589769363,
-0.11700457334518433,
-0.028958700597286224,
0.039744846522808075,
0.08327627927064896,
-0.03253408893942833,
0.07922478020191193,
-0.1647184044122696,
0.1165376752614975,
0.012328862212598324,
0.05802180990576744,
-0.11617794632911682,
0.09878876805305481,
0.012594180181622505,
-0.009003117680549622,
0.16720694303512573,
-0.0008162438753060997,
-0.07339610159397125,
-0.06517832726240158,
-0.07867198437452316,
-0.022016214206814766,
0.09116258472204208,
-0.11647430807352066,
0.08271238952875137,
-0.012302344664931297,
-0.03819865360856056,
0.002976413816213608,
-0.1073245257139206,
-0.12343364208936691,
-0.191313698887825,
0.05862122401595116,
-0.11746024340391159,
0.00024363139527849853,
-0.10003595799207687,
-0.05551697313785553,
-0.04721582680940628,
0.19990667700767517,
-0.14306047558784485,
-0.09675363451242447,
-0.1526252180337906,
-0.09468596428632736,
0.1679719239473343,
-0.04768168181180954,
0.08716544508934021,
-0.00014324963558465242,
0.22273695468902588,
0.00589721417054534,
-0.010143720544874668,
0.07824880629777908,
-0.08608578145503998,
-0.17828822135925293,
-0.07740302383899689,
0.12055730819702148,
0.12802201509475708,
0.05279289186000824,
-0.012038013897836208,
0.020934196189045906,
-0.036648161709308624,
-0.11678951978683472,
0.003050430677831173,
0.1217387318611145,
0.05949230119585991,
0.039503831416368484,
-0.002558275358751416,
-0.10200468450784683,
-0.07551230490207672,
-0.0352395698428154,
0.02261841483414173,
0.18903005123138428,
-0.08441178500652313,
0.15781226754188538,
0.13112787902355194,
-0.05333179607987404,
-0.21253353357315063,
0.030583804473280907,
0.043237145990133286,
0.004318034742027521,
0.0612679123878479,
-0.17720702290534973,
0.08167627453804016,
0.025727098807692528,
-0.05116020143032074,
0.15224720537662506,
-0.16569727659225464,
-0.15514664351940155,
0.0824643224477768,
0.05010354146361351,
-0.22108957171440125,
-0.12386278063058853,
-0.0879128947854042,
-0.06589758396148682,
-0.1396872103214264,
0.08584427833557129,
0.014041651971638203,
-0.0018043812597170472,
0.05013851076364517,
0.033740755170583725,
0.018914686515927315,
-0.048698488622903824,
0.21615906059741974,
-0.0022440196480602026,
0.03326340764760971,
-0.07553089410066605,
-0.10180798172950745,
0.06950566172599792,
-0.05141735449433327,
0.08518881350755692,
-0.03099823370575905,
0.005753061734139919,
-0.08320630341768265,
-0.057475052773952484,
-0.05255331099033356,
0.03318103775382042,
-0.08139406144618988,
-0.10520965605974197,
-0.06759276986122131,
0.09429939836263657,
0.09139011800289154,
-0.03298058733344078,
-0.04032526910305023,
-0.08896728605031967,
0.039150089025497437,
0.20617929100990295,
0.17360219359397888,
0.05333937704563141,
-0.10111589729785919,
0.002542630536481738,
-0.01915728859603405,
0.040264517068862915,
-0.21200114488601685,
0.04798245429992676,
0.04617756977677345,
0.024147402495145798,
0.12109645456075668,
-0.0176423080265522,
-0.1646004468202591,
-0.047221194952726364,
0.0562983863055706,
-0.03494611009955406,
-0.20504815876483917,
-0.01314060389995575,
0.04864202439785004,
-0.18736153841018677,
-0.06957933306694031,
0.016700902953743935,
-0.014444489032030106,
-0.027432914823293686,
0.013032985851168633,
0.06286440044641495,
0.025481918826699257,
0.10238313674926758,
0.05989401787519455,
0.1000840812921524,
-0.112981878221035,
0.0795830711722374,
0.09043775498867035,
-0.08344172686338425,
0.009394102729856968,
0.06964189559221268,
-0.05280066654086113,
-0.02294989861547947,
0.022772129625082016,
0.06757686287164688,
-0.003049787599593401,
-0.057536181062459946,
-0.02079189568758011,
-0.10809285193681717,
0.06586270034313202,
0.1269281655550003,
0.0400845967233181,
-0.006831571459770203,
0.04905473813414574,
0.02419281378388405,
-0.07880669087171555,
0.11321208626031876,
0.03362756222486496,
0.03722309693694115,
-0.05989459529519081,
-0.01674187369644642,
0.04316421225667,
0.005734616424888372,
-0.02047782577574253,
-0.025104478001594543,
-0.05658029392361641,
-0.013948953710496426,
-0.18932224810123444,
0.014544147998094559,
-0.07588981091976166,
0.005138450767844915,
0.014814606867730618,
-0.040141742676496506,
-0.018671197816729546,
0.012856033630669117,
-0.08163223415613174,
-0.05027473345398903,
-0.0038707295898348093,
0.09766460955142975,
-0.1400173306465149,
0.008230311796069145,
0.09175591170787811,
-0.11852382868528366,
0.06848865002393723,
-0.019968708977103233,
-0.014717686921358109,
0.0038272906094789505,
-0.1270400881767273,
0.04572216048836708,
-0.004586559720337391,
0.02062096633017063,
0.04444560408592224,
-0.17065683007240295,
0.004877567756921053,
-0.0423397533595562,
-0.0478336401283741,
-0.015323328785598278,
-0.08405033499002457,
-0.11406292766332626,
0.10921793431043625,
0.002206311793997884,
-0.08430022746324539,
-0.010287429206073284,
0.04696008190512657,
0.10919637978076935,
-0.03898061811923981,
0.124757781624794,
0.0047785635106265545,
0.06639395654201508,
-0.18268363177776337,
-0.024298490956425667,
-0.014514438807964325,
0.007352736312896013,
0.027192458510398865,
-0.016180848702788353,
0.04238643869757652,
-0.01372526679188013,
0.2601816952228546,
-0.021822240203619003,
0.07231466472148895,
0.0637383759021759,
0.042024899274110794,
0.016651110723614693,
0.08318763226270676,
0.06755662709474564,
0.016758481040596962,
0.004258559085428715,
0.02265608124434948,
-0.03241465613245964,
-0.016654497012495995,
-0.15768693387508392,
0.07677853107452393,
0.14623822271823883,
0.08591317385435104,
0.007676990237087011,
0.06586159020662308,
-0.10330242663621902,
-0.10554943233728409,
0.08015866577625275,
-0.03888537734746933,
-0.0009790018666535616,
-0.058588381856679916,
0.15355949103832245,
0.14971502125263214,
-0.17422176897525787,
0.08231138437986374,
-0.03791337087750435,
-0.04883022606372833,
-0.11436772346496582,
-0.15839459002017975,
-0.06608819216489792,
-0.029153592884540558,
-0.0041826991364359856,
-0.05528274551033974,
0.06748054921627045,
0.10802645981311798,
-0.0021057529374957085,
-0.00038325722562149167,
0.09545762091875076,
-0.026331622153520584,
-0.01757199876010418,
0.03465426340699196,
0.04817976430058479,
0.033562518656253815,
-0.04831063002347946,
0.020485511049628258,
0.004976877011358738,
0.03976510092616081,
0.05864322930574417,
0.023703020066022873,
-0.03892989084124565,
0.014479226432740688,
-0.01092575490474701,
-0.1049860492348671,
0.022427968680858612,
-0.029776830226182938,
-0.07360642403364182,
0.13104131817817688,
0.029177764430642128,
0.019099419936537743,
-0.03228067234158516,
0.20109383761882782,
-0.07107947021722794,
-0.06925153732299805,
-0.14109766483306885,
0.10889512300491333,
-0.03372858464717865,
0.06323269009590149,
0.058447178453207016,
-0.1133023053407669,
-0.002398417331278324,
0.1314154714345932,
0.133079394698143,
-0.033533163368701935,
0.005780258681625128,
0.03008044883608818,
0.00756559893488884,
-0.0482633113861084,
0.045497048646211624,
0.031092669814825058,
0.15440985560417175,
-0.06949599832296371,
0.07780899107456207,
0.00008295764564536512,
-0.08774317800998688,
-0.036128852516412735,
0.1405542492866516,
0.006535779219120741,
0.03079606406390667,
-0.06559351831674576,
0.10371401906013489,
-0.07252706587314606,
-0.23936228454113007,
0.045033879578113556,
-0.07753164321184158,
-0.15683837234973907,
-0.013978141359984875,
0.02726292423903942,
-0.009009851142764091,
0.02702206000685692,
0.0654432401061058,
-0.06469112634658813,
0.161378413438797,
0.03472336754202843,
-0.08781957626342773,
-0.05673113837838173,
0.07957270741462708,
-0.09192227572202682,
0.2958409786224365,
0.013188840821385384,
0.029593972489237785,
0.10327941924333572,
-0.019989576190710068,
-0.13285429775714874,
0.030561091378331184,
0.10066051781177521,
-0.09982595592737198,
0.06684590131044388,
0.18159176409244537,
-0.009470577351748943,
0.10021016746759415,
0.07437440752983093,
-0.061603669077157974,
0.05807222053408623,
-0.0826035663485527,
-0.06770919263362885,
-0.09389114379882812,
0.05970105528831482,
-0.06468918174505234,
0.14543601870536804,
0.1228262409567833,
-0.04243761673569679,
-0.004415105562657118,
-0.02816380001604557,
0.043726447969675064,
0.012194468639791012,
0.12871193885803223,
0.008576037362217903,
-0.1618158370256424,
0.026840461418032646,
0.0030557403806596994,
0.10387714207172394,
-0.21997274458408356,
-0.08367477357387543,
0.04838619381189346,
-0.029553698375821114,
-0.05334814265370369,
0.10579082369804382,
0.06295353919267654,
0.0504634715616703,
-0.04548325017094612,
-0.05543007701635361,
-0.008723298087716103,
0.14979462325572968,
-0.1187625601887703,
-0.006005466915667057
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_1
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4979
- Mean Iou: 0.4170
- Mean Accuracy: 0.6846
- Overall Accuracy: 0.9603
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3839
- Accuracy Undropoff: 0.9853
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.2914
- Iou Undropoff: 0.9597
## Model description
More information needed
## Intended uses & limitations
More information needed
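
In the absence of documented usage, the following is only a minimal sketch of semantic-segmentation inference with this checkpoint. The repo id comes from this card; the plain-RGB input is an assumption (the model name suggests RGB-D training data, so the exact preprocessing may differ), and `example.png` is a hypothetical file.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "sam1120/dropoff-utcustom-train-SF-RGBD-b0_1"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example.png").convert("RGB")  # hypothetical input frame
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax as the predicted mask.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
predicted_mask = upsampled.argmax(dim=1)[0]
```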
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
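
A hedged sketch of how the values above map onto `transformers.TrainingArguments`; the output directory, and anything not listed above, are assumptions.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is an assumed placeholder.
training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_1",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```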
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0495 | 5.0 | 10 | 1.0890 | 0.1852 | 0.3572 | 0.4990 | nan | 0.2026 | 0.5119 | 0.0 | 0.0474 | 0.5081 |
| 0.9941 | 10.0 | 20 | 1.0479 | 0.3452 | 0.8357 | 0.8479 | nan | 0.8225 | 0.8490 | 0.0 | 0.1931 | 0.8425 |
| 0.9448 | 15.0 | 30 | 0.9839 | 0.3790 | 0.8217 | 0.9010 | nan | 0.7351 | 0.9082 | 0.0 | 0.2390 | 0.8980 |
| 0.8912 | 20.0 | 40 | 0.9041 | 0.3845 | 0.7150 | 0.9247 | nan | 0.4863 | 0.9437 | 0.0 | 0.2303 | 0.9233 |
| 0.8458 | 25.0 | 50 | 0.7997 | 0.3835 | 0.6687 | 0.9326 | nan | 0.3808 | 0.9565 | 0.0 | 0.2188 | 0.9316 |
| 0.8299 | 30.0 | 60 | 0.7387 | 0.3751 | 0.6333 | 0.9326 | nan | 0.3068 | 0.9597 | 0.0 | 0.1934 | 0.9318 |
| 0.7518 | 35.0 | 70 | 0.6810 | 0.3791 | 0.6322 | 0.9404 | nan | 0.2961 | 0.9683 | 0.0 | 0.1975 | 0.9397 |
| 0.6943 | 40.0 | 80 | 0.6322 | 0.3703 | 0.6069 | 0.9422 | nan | 0.2411 | 0.9726 | 0.0 | 0.1691 | 0.9417 |
| 0.6617 | 45.0 | 90 | 0.6071 | 0.3780 | 0.6240 | 0.9454 | nan | 0.2734 | 0.9746 | 0.0 | 0.1892 | 0.9449 |
| 0.634 | 50.0 | 100 | 0.5932 | 0.3765 | 0.6106 | 0.9497 | nan | 0.2407 | 0.9805 | 0.0 | 0.1802 | 0.9494 |
| 0.6157 | 55.0 | 110 | 0.5829 | 0.3982 | 0.6538 | 0.9524 | nan | 0.3281 | 0.9795 | 0.0 | 0.2425 | 0.9520 |
| 0.5814 | 60.0 | 120 | 0.5708 | 0.4038 | 0.6699 | 0.9533 | nan | 0.3608 | 0.9790 | 0.0 | 0.2586 | 0.9528 |
| 0.5988 | 65.0 | 130 | 0.5575 | 0.3974 | 0.6456 | 0.9569 | nan | 0.3061 | 0.9851 | 0.0 | 0.2357 | 0.9564 |
| 0.5583 | 70.0 | 140 | 0.5530 | 0.4224 | 0.7075 | 0.9576 | nan | 0.4346 | 0.9803 | 0.0 | 0.3103 | 0.9570 |
| 0.5596 | 75.0 | 150 | 0.5264 | 0.4034 | 0.6522 | 0.9598 | nan | 0.3167 | 0.9877 | 0.0 | 0.2510 | 0.9593 |
| 0.5524 | 80.0 | 160 | 0.5392 | 0.4208 | 0.7109 | 0.9567 | nan | 0.4429 | 0.9790 | 0.0 | 0.3065 | 0.9560 |
| 0.5294 | 85.0 | 170 | 0.5257 | 0.4161 | 0.6913 | 0.9582 | nan | 0.4002 | 0.9824 | 0.0 | 0.2909 | 0.9576 |
| 0.5477 | 90.0 | 180 | 0.5178 | 0.4207 | 0.6962 | 0.9591 | nan | 0.4095 | 0.9829 | 0.0 | 0.3035 | 0.9584 |
| 0.528 | 95.0 | 190 | 0.5185 | 0.4183 | 0.6939 | 0.9590 | nan | 0.4047 | 0.9831 | 0.0 | 0.2965 | 0.9584 |
| 0.5144 | 100.0 | 200 | 0.5004 | 0.4153 | 0.6788 | 0.9604 | nan | 0.3716 | 0.9860 | 0.0 | 0.2859 | 0.9599 |
| 0.5313 | 105.0 | 210 | 0.5032 | 0.4199 | 0.7005 | 0.9585 | nan | 0.4191 | 0.9819 | 0.0 | 0.3020 | 0.9578 |
| 0.5172 | 110.0 | 220 | 0.4993 | 0.4188 | 0.6931 | 0.9591 | nan | 0.4030 | 0.9832 | 0.0 | 0.2978 | 0.9585 |
| 0.5124 | 115.0 | 230 | 0.4999 | 0.4167 | 0.6828 | 0.9606 | nan | 0.3799 | 0.9858 | 0.0 | 0.2901 | 0.9600 |
| 0.5025 | 120.0 | 240 | 0.4979 | 0.4170 | 0.6846 | 0.9603 | nan | 0.3839 | 0.9853 | 0.0 | 0.2914 | 0.9597 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_1", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_1 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:21+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_1
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4979
* Mean Iou: 0.4170
* Mean Accuracy: 0.6846
* Overall Accuracy: 0.9603
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3839
* Accuracy Undropoff: 0.9853
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.2914
* Iou Undropoff: 0.9597
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10591059178113937,
0.036406707018613815,
-0.002041781088337302,
0.11596930027008057,
0.1714659482240677,
0.028318241238594055,
0.11641968786716461,
0.11560021340847015,
-0.10877452790737152,
0.030912652611732483,
0.10430896282196045,
0.14163099229335785,
0.01629962958395481,
0.09532219916582108,
-0.020085278898477554,
-0.30744001269340515,
-0.02539004385471344,
0.03143095597624779,
-0.08681149780750275,
0.12519687414169312,
0.06513086706399918,
-0.16188034415245056,
0.09207220375537872,
-0.0032083075493574142,
-0.22071543335914612,
0.01589730754494667,
-0.004114775452762842,
-0.03122878633439541,
0.16029687225818634,
0.02357633225619793,
0.11608259379863739,
0.010143724270164967,
0.11489138752222061,
-0.20352774858474731,
0.017524294555187225,
0.05539601296186447,
-0.0043795364908874035,
0.06756952404975891,
0.06578890234231949,
0.0016288068145513535,
0.15191982686519623,
-0.1059940904378891,
0.06725753843784332,
0.0020551886409521103,
-0.14449739456176758,
-0.21125079691410065,
-0.07391216605901718,
0.022725099697709084,
0.07813199609518051,
0.09479884803295135,
-0.004944729618728161,
0.11241678893566132,
-0.09166383743286133,
0.1124383807182312,
0.26984864473342896,
-0.24355480074882507,
-0.0860433354973793,
0.037472907453775406,
0.0015846790047362447,
0.06687662750482559,
-0.13214673101902008,
0.008929550647735596,
0.03309425339102745,
0.04682445526123047,
0.11712583154439926,
-0.03376404941082001,
-0.09935519844293594,
0.02651998959481716,
-0.13818451762199402,
-0.03384209796786308,
0.05395243316888809,
0.052767518907785416,
-0.021198933944106102,
-0.03161989524960518,
-0.06856082379817963,
-0.18060626089572906,
-0.06563921272754669,
0.012051342986524105,
0.06658738106489182,
-0.05934448167681694,
-0.11524756997823715,
-0.01712081767618656,
-0.11102376133203506,
-0.08600760251283646,
-0.04974017292261124,
0.12833237648010254,
0.03393315151333809,
0.019822686910629272,
-0.03449401631951332,
0.12721192836761475,
-0.027143806219100952,
-0.13947705924510956,
0.016818301752209663,
0.02969534695148468,
-0.04308154061436653,
-0.03135136887431145,
-0.049116622656583786,
-0.06412902474403381,
-0.012727650813758373,
0.10901389271020889,
-0.060329437255859375,
0.06700942665338516,
0.03596173971891403,
0.0513516440987587,
-0.11418701708316803,
0.18956543505191803,
-0.06669562309980392,
-0.008179159834980965,
-0.0371287576854229,
0.057560257613658905,
0.00442540692165494,
-0.022692175582051277,
-0.10595186054706573,
0.003773114178329706,
0.07032424956560135,
-0.00843335036188364,
-0.08732932060956955,
0.06882526725530624,
-0.039810728281736374,
-0.011808953247964382,
-0.0013848122907802463,
-0.07570318877696991,
0.04621141403913498,
-0.00037272917688824236,
-0.083360955119133,
-0.029091104865074158,
0.05189743638038635,
0.015402769669890404,
0.014119260013103485,
0.16494019329547882,
-0.08581560105085373,
0.06393270194530487,
-0.11169346421957016,
-0.0995871052145958,
0.0009529309463687241,
-0.08649130165576935,
0.03577325493097305,
-0.07892858982086182,
-0.14939658343791962,
-0.008248391561210155,
0.07200909405946732,
-0.04051196202635765,
0.0025248501915484667,
-0.05283036455512047,
-0.09021950513124466,
0.002668651519343257,
-0.008688265457749367,
0.16225291788578033,
-0.06490723043680191,
0.12246591597795486,
0.03718114271759987,
0.07248258590698242,
-0.06544356048107147,
0.03855926916003227,
-0.08517038077116013,
0.02032848261296749,
-0.22071704268455505,
0.04320244863629341,
-0.04951300844550133,
0.06566619873046875,
-0.060653574764728546,
-0.12212702631950378,
0.0072395578026771545,
0.0020949707832187414,
0.09128567576408386,
0.10670078545808792,
-0.22462882101535797,
-0.0764412209391594,
0.1502048522233963,
-0.07392831146717072,
-0.09839023649692535,
0.11358591169118881,
-0.06351952999830246,
0.013043041341006756,
0.0610421858727932,
0.19783717393875122,
0.05328095331788063,
-0.1377512514591217,
0.023082606494426727,
-0.015367588959634304,
0.049398116767406464,
-0.027165455743670464,
0.052065908908843994,
0.02162730135023594,
0.08811955153942108,
0.018704550340771675,
-0.06750258058309555,
0.0677628293633461,
-0.12442849576473236,
-0.0973825752735138,
-0.02574753575026989,
-0.08652006089687347,
0.04356151819229126,
0.08868462592363358,
0.06171506643295288,
-0.10411621630191803,
-0.07881054282188416,
0.09323766827583313,
0.07615721970796585,
-0.06921514123678207,
0.040464919060468674,
-0.0656096488237381,
0.04356173798441887,
-0.01758229359984398,
-0.036990679800510406,
-0.174820214509964,
-0.024669533595442772,
-0.022326266393065453,
0.033140379935503006,
0.029954442754387856,
0.022955268621444702,
0.09186587482690811,
0.08934277296066284,
-0.07147775590419769,
-0.02469678223133087,
-0.06682298332452774,
0.0025075539015233517,
-0.12117345631122589,
-0.22780826687812805,
-0.04391269013285637,
-0.007835295982658863,
0.09110807627439499,
-0.21234571933746338,
0.024492016062140465,
0.02387630194425583,
0.08892504125833511,
0.025946561247110367,
-0.03145389258861542,
-0.053651172667741776,
0.07619374990463257,
-0.010972054675221443,
-0.06552385538816452,
0.06957029551267624,
-0.005011508706957102,
-0.06881695985794067,
-0.05457460507750511,
-0.11221367865800858,
0.1631614714860916,
0.13315021991729736,
-0.1470557600259781,
-0.09176328033208847,
-0.012604077346622944,
-0.0641164630651474,
-0.03307487815618515,
-0.04428306594491005,
0.038533080369234085,
0.17923392355442047,
0.0007990925805643201,
0.13288265466690063,
-0.060702238231897354,
-0.03450474143028259,
0.028916707262396812,
-0.028426647186279297,
0.027541212737560272,
0.12790487706661224,
0.12613020837306976,
-0.06276159733533859,
0.12355757504701614,
0.12424010783433914,
-0.08129942417144775,
0.14877618849277496,
-0.032862648367881775,
-0.08071060478687286,
-0.018813837319612503,
-0.0140802301466465,
-0.007860261015594006,
0.17658783495426178,
-0.14840620756149292,
-0.016807327046990395,
-0.004256419837474823,
0.013818185776472092,
0.014899618923664093,
-0.2512436509132385,
-0.05613352730870247,
0.03986582159996033,
-0.04548269510269165,
-0.009494428522884846,
-0.024595370516180992,
-0.0038176560774445534,
0.10482000559568405,
-0.007377767004072666,
-0.07433445006608963,
0.0012401690473780036,
-0.007497822400182486,
-0.049883194267749786,
0.2067444920539856,
-0.05796069651842117,
-0.12030388414859772,
-0.08982103317975998,
-0.07567491382360458,
-0.03568755090236664,
0.0035153538919985294,
0.05847974866628647,
-0.10734867304563522,
-0.018122749403119087,
-0.05908204987645149,
0.019605427980422974,
0.007381477393209934,
0.03708811476826668,
-0.0007159020169638097,
-0.0075367651879787445,
0.05688917264342308,
-0.09645732492208481,
-0.009906535036861897,
-0.06527683883905411,
-0.053648751229047775,
0.05491875112056732,
0.059240926057100296,
0.14802519977092743,
0.1349872648715973,
-0.02579585649073124,
0.01997743546962738,
-0.032591044902801514,
0.25795280933380127,
-0.09483391046524048,
-0.027106285095214844,
0.11867467314004898,
-0.013237598352134228,
0.05646909028291702,
0.1070772260427475,
0.08230385184288025,
-0.10863186419010162,
-0.0021534692496061325,
0.06373824924230576,
-0.05230473354458809,
-0.15508690476417542,
-0.015050246380269527,
-0.057802021503448486,
-0.02989283762872219,
0.07719714194536209,
0.027087148278951645,
-0.0038367193192243576,
0.05564122274518013,
0.04802238196134567,
0.04267096146941185,
-0.023051314055919647,
0.05042608082294464,
0.08746105432510376,
0.03245827928185463,
0.1090887263417244,
-0.04443301260471344,
-0.06665770709514618,
0.032430410385131836,
0.0034270044416189194,
0.24427002668380737,
-0.01566135697066784,
0.0969657152891159,
0.07355113327503204,
0.16159577667713165,
-0.012215204536914825,
0.04744432121515274,
-0.014521806500852108,
-0.0670519545674324,
-0.01921711303293705,
-0.04441175237298012,
-0.01764151081442833,
0.011651534587144852,
-0.05094140022993088,
0.038839858025312424,
-0.12640246748924255,
0.01144858356565237,
0.06839627027511597,
0.2500508725643158,
0.029192989692091942,
-0.3174043297767639,
-0.06564193964004517,
-0.004925642162561417,
-0.011179664172232151,
-0.008617421612143517,
0.0061102197505533695,
0.1544247567653656,
-0.08261898159980774,
0.0563044436275959,
-0.08604864776134491,
0.08553512394428253,
-0.036846838891506195,
0.05086645483970642,
0.07695168256759644,
0.07221728563308716,
-0.0032013095915317535,
0.05564851686358452,
-0.2847530245780945,
0.30179959535598755,
0.0014939256943762302,
0.08370096236467361,
-0.06397277116775513,
-0.031934574246406555,
0.0330696664750576,
0.08131279051303864,
0.08739541471004486,
-0.01500504370778799,
-0.022958464920520782,
-0.21385733783245087,
-0.022049281746149063,
0.030934831127524376,
0.12974496185779572,
-0.018540095537900925,
0.10405240207910538,
-0.010432885028421879,
-0.005348693113774061,
0.07394920289516449,
-0.0008029399323277175,
-0.03410647064447403,
-0.09015288203954697,
-0.026033926755189896,
-0.02589907869696617,
-0.049240998923778534,
-0.059444017708301544,
-0.10635793209075928,
-0.112869031727314,
0.11381607502698898,
0.016569698229432106,
-0.014265110716223717,
-0.1209530159831047,
0.09916826337575912,
0.07826872169971466,
-0.07615713775157928,
0.04145707190036774,
0.032202497124671936,
0.05965138599276543,
0.03166025131940842,
-0.05790642276406288,
0.11721611022949219,
-0.05998785048723221,
-0.16017937660217285,
-0.056180190294981,
0.09191469103097916,
0.05136469379067421,
0.05751107633113861,
-0.023572638630867004,
0.015342534519731998,
-0.016443448141217232,
-0.0919826477766037,
0.05438494682312012,
-0.04167114198207855,
0.062491871416568756,
0.009315837174654007,
-0.01950502209365368,
0.052434101700782776,
-0.05579143017530441,
-0.012027722783386707,
0.1481226235628128,
0.2866939604282379,
-0.0898970365524292,
0.014696088619530201,
0.014996358193457127,
-0.0657133013010025,
-0.1908927857875824,
0.07903271913528442,
0.059024542570114136,
0.0003236344491597265,
0.08764078468084335,
-0.16696979105472565,
0.09721887856721878,
0.10313831269741058,
-0.00018088877550326288,
0.11356144398450851,
-0.3685600757598877,
-0.1283271759748459,
0.07952839881181717,
0.19072848558425903,
0.07684752345085144,
-0.15424127876758575,
0.000239272034377791,
-0.0026929150335490704,
-0.1495710015296936,
0.09243884682655334,
-0.07779091596603394,
0.134884774684906,
-0.01935102604329586,
0.0875285193324089,
0.01665026694536209,
-0.0608099028468132,
0.12345021218061447,
-0.00443003186956048,
0.1392066925764084,
-0.06876465678215027,
-0.04000568017363548,
0.053715817630290985,
-0.03778083249926567,
-0.013550779782235622,
-0.04685073345899582,
0.02681855857372284,
-0.060428131371736526,
-0.012398848310112953,
-0.10420776903629303,
0.013147079385817051,
-0.03861471638083458,
-0.06746852397918701,
-0.045386649668216705,
0.043276671320199966,
0.04539574310183525,
-0.0031612433958798647,
0.15219777822494507,
-0.010296742431819439,
0.11267101764678955,
0.05012974143028259,
0.059254422783851624,
-0.06078752502799034,
-0.10647626966238022,
-0.01841866783797741,
0.008730622008442879,
0.04805048182606697,
-0.13453665375709534,
0.016422748565673828,
0.1524914801120758,
0.050543252378702164,
0.12226856499910355,
0.08626432716846466,
-0.03241485357284546,
0.0326315201818943,
0.06950843334197998,
-0.15724165737628937,
-0.1142343059182167,
0.0016000923933461308,
-0.06595996022224426,
-0.07277495414018631,
0.05207930505275726,
0.07878997176885605,
-0.07562381774187088,
0.011810862459242344,
-0.006615807767957449,
0.006621858105063438,
-0.06877296417951584,
0.204387828707695,
0.05550229921936989,
0.04115636274218559,
-0.10286250710487366,
0.07344283163547516,
0.018016142770648003,
-0.08592265099287033,
-0.001509714056737721,
0.09055889397859573,
-0.06869915872812271,
-0.02459602802991867,
0.08121182024478912,
0.19019687175750732,
-0.07842886447906494,
-0.023111004382371902,
-0.1500396728515625,
-0.10630752891302109,
0.06907854229211807,
0.18417076766490936,
0.10038203746080399,
-0.006512294057756662,
-0.05241300165653229,
0.04744500666856766,
-0.1165413036942482,
0.07704318314790726,
0.02548481896519661,
0.08105357736349106,
-0.14998933672904968,
0.18408077955245972,
0.011139607056975365,
0.05580996349453926,
-0.026211272925138474,
0.03276738151907921,
-0.11876556277275085,
0.04053877294063568,
-0.11572045087814331,
-0.03540515899658203,
-0.015525621362030506,
0.0042603593319654465,
-0.012991220690310001,
-0.06306900829076767,
-0.06357935816049576,
0.004990594461560249,
-0.12769334018230438,
-0.022463973611593246,
0.04468223452568054,
0.0224167387932539,
-0.12709692120552063,
-0.039472274482250214,
0.028656914830207825,
-0.06417880207300186,
0.05620552971959114,
0.036458492279052734,
0.014389386400580406,
0.06632015854120255,
-0.17220503091812134,
-0.022627338767051697,
0.07081697136163712,
-0.006781075149774551,
0.06253356486558914,
-0.03669579699635506,
-0.02614782750606537,
-0.02933921478688717,
0.08741414546966553,
0.012411856092512608,
0.06344663351774216,
-0.13639067113399506,
0.005803209729492664,
-0.03352953493595123,
-0.09334559738636017,
-0.05869465693831444,
0.052820511162281036,
0.061405763030052185,
0.037500809878110886,
0.16263720393180847,
-0.0828312337398529,
0.04512277618050575,
-0.21882659196853638,
-0.016123522073030472,
0.0019690608605742455,
-0.10597095638513565,
-0.08253838866949081,
-0.07247085124254227,
0.08213773369789124,
-0.07532702386379242,
0.10943756252527237,
0.03624960035085678,
0.06528622657060623,
0.031573258340358734,
-0.0334358774125576,
-0.003328854450955987,
0.03439907357096672,
0.21135494112968445,
0.011939233168959618,
-0.03267694637179375,
0.0883578434586525,
0.07814532518386841,
0.10031159967184067,
0.1352544128894806,
0.22853423655033112,
0.1549084633588791,
-0.02621779963374138,
0.08982422947883606,
0.052107393741607666,
-0.06428554654121399,
-0.17406204342842102,
0.03619799762964249,
-0.05177341774106026,
0.09734605252742767,
-0.061150599271059036,
0.20368969440460205,
0.08749991655349731,
-0.1844257265329361,
0.0654185488820076,
-0.04458949714899063,
-0.10134459286928177,
-0.0817386656999588,
-0.038064587861299515,
-0.07051079720258713,
-0.14805366098880768,
0.025765161961317062,
-0.10210359841585159,
0.04328684136271477,
0.14954687654972076,
0.009621428325772285,
-0.012140410020947456,
0.21240055561065674,
0.0323977954685688,
0.03548222407698631,
0.05726966634392738,
0.01483248732984066,
-0.02861449122428894,
-0.0914607122540474,
-0.060975588858127594,
0.018010711297392845,
-0.028469564393162727,
0.018588028848171234,
-0.06901375949382782,
-0.07651720941066742,
0.026613211259245872,
0.00447197025641799,
-0.09357792139053345,
0.023254575207829475,
0.020589983090758324,
0.09027770161628723,
0.02563934214413166,
0.006544429808855057,
0.016347095370292664,
-0.028143852949142456,
0.24694928526878357,
-0.09354884177446365,
-0.08238384127616882,
-0.08273855596780777,
0.21722160279750824,
0.03111916407942772,
0.001720828702673316,
0.009150993078947067,
-0.08181167393922806,
0.008721832185983658,
0.22852452099323273,
0.17236484587192535,
-0.13280828297138214,
-0.010138200595974922,
-0.00008552828512620181,
0.0014475751668214798,
-0.029257183894515038,
0.1178811565041542,
0.12015199661254883,
0.05330536887049675,
-0.11473056674003601,
-0.05287366360425949,
-0.05217143893241882,
-0.01830500364303589,
-0.026758121326565742,
0.05119778960943222,
0.0652909055352211,
0.022434907034039497,
-0.07022004574537277,
0.07567879557609558,
-0.058515407145023346,
-0.14318467676639557,
0.10563810169696808,
-0.2276010364294052,
-0.15647411346435547,
-0.006186052691191435,
0.12145841866731644,
0.003695683553814888,
0.0602986142039299,
-0.04246499389410019,
0.0008722473867237568,
0.05150967091321945,
-0.004567287862300873,
-0.079880490899086,
-0.10231195390224457,
0.08472739160060883,
-0.11505527794361115,
0.21724817156791687,
-0.05875490605831146,
0.035077739506959915,
0.1125645637512207,
0.06407725065946579,
-0.049887288361787796,
0.055466264486312866,
0.041938669979572296,
-0.12384229898452759,
-0.004967229440808296,
0.12469584494829178,
-0.039480820298194885,
0.05403485894203186,
0.0337943434715271,
-0.13283227384090424,
0.031666163355112076,
-0.054909028112888336,
-0.041300851851701736,
-0.027253326028585434,
-0.05152781680226326,
-0.06417661905288696,
0.1148044764995575,
0.2075551152229309,
-0.008365480229258537,
0.024621890857815742,
-0.08690034598112106,
0.01690840907394886,
0.06606423854827881,
0.04895693063735962,
-0.07765200734138489,
-0.21532177925109863,
0.006690898910164833,
0.0703587755560875,
-0.04111383110284805,
-0.2040543109178543,
-0.1128106415271759,
0.036610137671232224,
-0.0538211353123188,
-0.07152272015810013,
0.09118351340293884,
0.09030050039291382,
0.055649980902671814,
-0.05476338416337967,
-0.10570818930864334,
-0.05848672240972519,
0.1701793223619461,
-0.14630326628684998,
-0.07738175243139267
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_2
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a short sanity check of how the mean metrics are derived follows the list):
- Loss: 0.4274
- Mean Iou: 0.6102
- Mean Accuracy: 0.6603
- Overall Accuracy: 0.9607
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3326
- Accuracy Undropoff: 0.9879
- Iou Unlabeled: nan
- Iou Dropoff: 0.2602
- Iou Undropoff: 0.9601
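
With the unlabeled class ignored (reported as `nan` above), the summary metrics are simply unweighted averages over the two remaining classes. The snippet below is a quick sanity check of that relationship, not part of the original training script; it only reuses the numbers reported above.

```python
# Sanity check (assumption: the mean metrics are unweighted averages over the
# two evaluated classes, with the unlabeled class excluded as nan).
iou_dropoff, iou_undropoff = 0.2602, 0.9601
acc_dropoff, acc_undropoff = 0.3326, 0.9879

mean_iou = (iou_dropoff + iou_undropoff) / 2       # ~0.61015 -> reported 0.6102
mean_accuracy = (acc_dropoff + acc_undropoff) / 2  # ~0.66025 -> reported 0.6603

print(mean_iou, mean_accuracy)
```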
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged sketch of an equivalent `TrainingArguments` setup follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
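
These settings correspond one-to-one to fields of `transformers.TrainingArguments`. The sketch below is a hedged reconstruction of an equivalent setup, not the exact script used here: dataset loading, preprocessing, and metric computation are omitted, the output directory name is made up, and `num_labels=3` is an assumption based on the three classes reported above.

```python
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

# Hedged reconstruction of the listed configuration; the real training
# script for this checkpoint may differ in preprocessing and metrics.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=3,  # assumption: unlabeled, dropoff, undropoff
)

args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_2",  # hypothetical path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not shown
# trainer.train()
```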
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0555 | 5.0 | 10 | 1.0734 | 0.2254 | 0.4211 | 0.6018 | nan | 0.2240 | 0.6182 | 0.0 | 0.0622 | 0.6140 |
| 0.9825 | 10.0 | 20 | 1.0261 | 0.2992 | 0.6380 | 0.7780 | nan | 0.4852 | 0.7907 | 0.0 | 0.1170 | 0.7807 |
| 0.8991 | 15.0 | 30 | 0.8985 | 0.3231 | 0.5517 | 0.8892 | nan | 0.1836 | 0.9198 | 0.0 | 0.0776 | 0.8917 |
| 0.8191 | 20.0 | 40 | 0.7413 | 0.3270 | 0.5262 | 0.9299 | nan | 0.0858 | 0.9665 | 0.0 | 0.0513 | 0.9296 |
| 0.7562 | 25.0 | 50 | 0.6268 | 0.3259 | 0.5130 | 0.9436 | nan | 0.0433 | 0.9826 | 0.0 | 0.0343 | 0.9435 |
| 0.7395 | 30.0 | 60 | 0.5872 | 0.3235 | 0.5073 | 0.9498 | nan | 0.0246 | 0.9900 | 0.0 | 0.0206 | 0.9498 |
| 0.7272 | 35.0 | 70 | 0.5820 | 0.3379 | 0.5415 | 0.9411 | nan | 0.1055 | 0.9774 | 0.0 | 0.0729 | 0.9409 |
| 0.6525 | 40.0 | 80 | 0.5571 | 0.3445 | 0.5451 | 0.9498 | nan | 0.1036 | 0.9865 | 0.0 | 0.0839 | 0.9496 |
| 0.6161 | 45.0 | 90 | 0.5465 | 0.3480 | 0.5480 | 0.9528 | nan | 0.1064 | 0.9895 | 0.0 | 0.0914 | 0.9526 |
| 0.6131 | 50.0 | 100 | 0.5379 | 0.3712 | 0.5917 | 0.9555 | nan | 0.1949 | 0.9885 | 0.0 | 0.1584 | 0.9551 |
| 0.579 | 55.0 | 110 | 0.5229 | 0.3892 | 0.6411 | 0.9536 | nan | 0.3002 | 0.9819 | 0.0 | 0.2146 | 0.9530 |
| 0.5133 | 60.0 | 120 | 0.5113 | 0.3962 | 0.6596 | 0.9541 | nan | 0.3384 | 0.9808 | 0.0 | 0.2352 | 0.9535 |
| 0.535 | 65.0 | 130 | 0.4925 | 0.3981 | 0.6566 | 0.9561 | nan | 0.3299 | 0.9833 | 0.0 | 0.2386 | 0.9555 |
| 0.4866 | 70.0 | 140 | 0.4717 | 0.5993 | 0.6516 | 0.9584 | nan | 0.3169 | 0.9863 | nan | 0.2407 | 0.9579 |
| 0.5119 | 75.0 | 150 | 0.4712 | 0.5976 | 0.6513 | 0.9578 | nan | 0.3171 | 0.9856 | nan | 0.2380 | 0.9572 |
| 0.5034 | 80.0 | 160 | 0.4737 | 0.6120 | 0.6840 | 0.9562 | nan | 0.3872 | 0.9808 | nan | 0.2686 | 0.9554 |
| 0.4503 | 85.0 | 170 | 0.4496 | 0.6103 | 0.6618 | 0.9604 | nan | 0.3361 | 0.9875 | nan | 0.2607 | 0.9598 |
| 0.4653 | 90.0 | 180 | 0.4617 | 0.6201 | 0.6907 | 0.9580 | nan | 0.3992 | 0.9822 | nan | 0.2830 | 0.9572 |
| 0.4375 | 95.0 | 190 | 0.4412 | 0.6090 | 0.6592 | 0.9605 | nan | 0.3305 | 0.9878 | nan | 0.2580 | 0.9599 |
| 0.4306 | 100.0 | 200 | 0.4355 | 0.6120 | 0.6653 | 0.9602 | nan | 0.3436 | 0.9870 | nan | 0.2643 | 0.9597 |
| 0.4456 | 105.0 | 210 | 0.4414 | 0.6178 | 0.6756 | 0.9601 | nan | 0.3653 | 0.9860 | nan | 0.2760 | 0.9595 |
| 0.4435 | 110.0 | 220 | 0.4387 | 0.6150 | 0.6681 | 0.9608 | nan | 0.3489 | 0.9873 | nan | 0.2699 | 0.9602 |
| 0.4263 | 115.0 | 230 | 0.4348 | 0.6156 | 0.6692 | 0.9607 | nan | 0.3512 | 0.9872 | nan | 0.2711 | 0.9602 |
| 0.4123 | 120.0 | 240 | 0.4274 | 0.6102 | 0.6603 | 0.9607 | nan | 0.3326 | 0.9879 | nan | 0.2602 | 0.9601 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_2", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_2 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:41+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_2
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4274
* Mean Iou: 0.6102
* Mean Accuracy: 0.6603
* Overall Accuracy: 0.9607
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3326
* Accuracy Undropoff: 0.9879
* Iou Unlabeled: nan
* Iou Dropoff: 0.2602
* Iou Undropoff: 0.9601
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10568203032016754,
0.034310873597860336,
-0.0020052888430655003,
0.11610046774148941,
0.17228668928146362,
0.028363551944494247,
0.11617675423622131,
0.11523397266864777,
-0.11099763214588165,
0.03049360029399395,
0.10458611696958542,
0.14283451437950134,
0.01627075858414173,
0.09541396796703339,
-0.01991966739296913,
-0.3067193627357483,
-0.025057120248675346,
0.03138473629951477,
-0.08510126173496246,
0.1253262758255005,
0.06550346314907074,
-0.1619582623243332,
0.09239703416824341,
-0.003596579423174262,
-0.21964815258979797,
0.015999557450413704,
-0.005329017993062735,
-0.03146868944168091,
0.15970826148986816,
0.02341117337346077,
0.11567100137472153,
0.009936717338860035,
0.11415968835353851,
-0.20344634354114532,
0.017679238691926003,
0.055597614496946335,
-0.004258429165929556,
0.06707211583852768,
0.06586253643035889,
0.0014985718298703432,
0.15110868215560913,
-0.10579033195972443,
0.06835782527923584,
0.0022776732221245766,
-0.14473670721054077,
-0.2100660800933838,
-0.07446174323558807,
0.022574272006750107,
0.07840698212385178,
0.09581136703491211,
-0.004728428088128567,
0.1130012720823288,
-0.09175640344619751,
0.1131497174501419,
0.2711831331253052,
-0.2422180026769638,
-0.0866069495677948,
0.03972857445478439,
0.002479933900758624,
0.06742198020219803,
-0.13131248950958252,
0.009007683023810387,
0.032906271517276764,
0.0477973110973835,
0.11796174943447113,
-0.03357322886586189,
-0.09860288351774216,
0.02659316547214985,
-0.13903304934501648,
-0.033298641443252563,
0.05523455888032913,
0.05271794646978378,
-0.02079673297703266,
-0.033362798392772675,
-0.0671728253364563,
-0.18194013833999634,
-0.06600288301706314,
0.01221657358109951,
0.06640360504388809,
-0.060101468116045,
-0.11538924276828766,
-0.016313115134835243,
-0.11030862480401993,
-0.08534715324640274,
-0.049041688442230225,
0.12745332717895508,
0.033965498208999634,
0.019251281395554543,
-0.03417006507515907,
0.12710566818714142,
-0.02634599432349205,
-0.1389843374490738,
0.016804032027721405,
0.02873995341360569,
-0.04225322604179382,
-0.03238746523857117,
-0.04932522401213646,
-0.06459038704633713,
-0.012606538832187653,
0.109328493475914,
-0.060348931699991226,
0.06769931316375732,
0.036480240523815155,
0.050864461809396744,
-0.11365550011396408,
0.18960613012313843,
-0.06753421574831009,
-0.008469657972455025,
-0.03723716363310814,
0.05861883610486984,
0.004194814711809158,
-0.02243168279528618,
-0.10548785328865051,
0.0049391635693609715,
0.07021109014749527,
-0.007454104721546173,
-0.08733302354812622,
0.06983554363250732,
-0.03953494131565094,
-0.012364836409687996,
0.0004808762460015714,
-0.07560261338949203,
0.045141857117414474,
-0.0008761744247749448,
-0.083174929022789,
-0.028285665437579155,
0.051485173404216766,
0.01520170271396637,
0.014083442278206348,
0.16533713042736053,
-0.08578535169363022,
0.06360255926847458,
-0.11210166662931442,
-0.09957769513130188,
0.00037678383523598313,
-0.08606544882059097,
0.035591594874858856,
-0.07912884652614594,
-0.1499486118555069,
-0.008778676390647888,
0.07213620096445084,
-0.04063490405678749,
0.003026821417734027,
-0.05252555385231972,
-0.0903196781873703,
0.0024763643741607666,
-0.008908649906516075,
0.16290920972824097,
-0.0651860460639,
0.12217126041650772,
0.03796898201107979,
0.07190880924463272,
-0.06495226919651031,
0.03906301409006119,
-0.08485566824674606,
0.02004924975335598,
-0.22168710827827454,
0.043134234845638275,
-0.05033760517835617,
0.06720215827226639,
-0.06040476635098457,
-0.12282755225896835,
0.007988572120666504,
0.0022983329836279154,
0.09129443764686584,
0.10614853352308273,
-0.22375546395778656,
-0.07584104686975479,
0.14787186682224274,
-0.07313519716262817,
-0.09966561943292618,
0.11313436180353165,
-0.06325682252645493,
0.011858033947646618,
0.06089091673493385,
0.19724565744400024,
0.052754130214452744,
-0.13853949308395386,
0.02326320856809616,
-0.015579626895487309,
0.0495004840195179,
-0.027750838547945023,
0.051581937819719315,
0.02157057262957096,
0.08754453808069229,
0.019136443734169006,
-0.06767737865447998,
0.06801313161849976,
-0.1243351399898529,
-0.09735546261072159,
-0.025388270616531372,
-0.08679729700088501,
0.04176992550492287,
0.08891592174768448,
0.061011720448732376,
-0.10475653409957886,
-0.0782274454832077,
0.09197782725095749,
0.07598263025283813,
-0.06920557469129562,
0.040173232555389404,
-0.0652095377445221,
0.044158581644296646,
-0.017552589997649193,
-0.03718860074877739,
-0.174536794424057,
-0.02554505690932274,
-0.022517019882798195,
0.03456355258822441,
0.029413586482405663,
0.02168627828359604,
0.09174060821533203,
0.08913848549127579,
-0.07175818085670471,
-0.024819908663630486,
-0.06567797064781189,
0.002686374355107546,
-0.12057187408208847,
-0.22752492129802704,
-0.04412039741873741,
-0.007968965917825699,
0.08866548538208008,
-0.2109946310520172,
0.02440466918051243,
0.02240639366209507,
0.08802163600921631,
0.025336291640996933,
-0.03154107555747032,
-0.05266375467181206,
0.07719859480857849,
-0.01098770834505558,
-0.0658455342054367,
0.07010015100240707,
-0.005256453063338995,
-0.06792617589235306,
-0.05507153645157814,
-0.11310558766126633,
0.16384564340114594,
0.13299763202667236,
-0.1474342942237854,
-0.09256324917078018,
-0.01349611859768629,
-0.06420727074146271,
-0.03276904299855232,
-0.04335766285657883,
0.03838436305522919,
0.17910966277122498,
0.0004090390575584024,
0.13274246454238892,
-0.06086982786655426,
-0.03500383347272873,
0.028675096109509468,
-0.027712905779480934,
0.026376595720648766,
0.127682164311409,
0.12660253047943115,
-0.06166522204875946,
0.12367600202560425,
0.12418309599161148,
-0.08067891746759415,
0.1488448977470398,
-0.03283599391579628,
-0.0800417810678482,
-0.017992252483963966,
-0.014102495275437832,
-0.007623208686709404,
0.1762656271457672,
-0.14839740097522736,
-0.016887729987502098,
-0.004641381558030844,
0.014189463108778,
0.015359614044427872,
-0.24998725950717926,
-0.05676104873418808,
0.039820894598960876,
-0.045143842697143555,
-0.007877825759351254,
-0.023854508996009827,
-0.004334282595664263,
0.10426000505685806,
-0.006898446008563042,
-0.07499462366104126,
0.0014738719910383224,
-0.007587512955069542,
-0.049762170761823654,
0.2071406990289688,
-0.05837604030966759,
-0.12102153897285461,
-0.08949463069438934,
-0.07839836925268173,
-0.037190522998571396,
0.002755607943981886,
0.05840720236301422,
-0.10830553621053696,
-0.01873757690191269,
-0.059783291071653366,
0.019399868324398994,
0.006756237708032131,
0.03639897704124451,
-0.0002487284364178777,
-0.008273708634078503,
0.056638363748788834,
-0.09683648496866226,
-0.010092961601912975,
-0.06549257040023804,
-0.05354372039437294,
0.05472349748015404,
0.05884351581335068,
0.14876136183738708,
0.1346484124660492,
-0.024849001318216324,
0.020248960703611374,
-0.033107731491327286,
0.2594600319862366,
-0.09442488104104996,
-0.027865400537848473,
0.11897386610507965,
-0.01188881229609251,
0.056266117841005325,
0.10684240609407425,
0.08272673189640045,
-0.10943450033664703,
-0.0014664501650258899,
0.06393436342477798,
-0.052600644528865814,
-0.15494459867477417,
-0.015240147709846497,
-0.0580441951751709,
-0.029980240389704704,
0.07653060555458069,
0.027349097654223442,
-0.0027386874426156282,
0.055826976895332336,
0.048215556889772415,
0.0418405756354332,
-0.0229563657194376,
0.05065071955323219,
0.08783916383981705,
0.032351333647966385,
0.10961855947971344,
-0.04406305402517319,
-0.06647936999797821,
0.03213553875684738,
0.003989717457443476,
0.243775874376297,
-0.015748774632811546,
0.09612256288528442,
0.07445143908262253,
0.16092947125434875,
-0.012775292620062828,
0.048031628131866455,
-0.01546204462647438,
-0.06777671724557877,
-0.01905171200633049,
-0.044707585126161575,
-0.01688350923359394,
0.009901583194732666,
-0.05110535770654678,
0.03950786590576172,
-0.12625817954540253,
0.010610254481434822,
0.0685291662812233,
0.25067001581192017,
0.027425672858953476,
-0.3183855414390564,
-0.06481905281543732,
-0.005749670788645744,
-0.01072122436016798,
-0.008127817884087563,
0.006525708362460136,
0.15171122550964355,
-0.08263228833675385,
0.05701193958520889,
-0.08584713190793991,
0.08545848727226257,
-0.03671751916408539,
0.05084918439388275,
0.07855608314275742,
0.07366984337568283,
-0.003862930228933692,
0.05514295771718025,
-0.28678765892982483,
0.30101078748703003,
0.0014081423869356513,
0.08434140682220459,
-0.06383278220891953,
-0.031907349824905396,
0.0332881435751915,
0.08187414705753326,
0.08728303760290146,
-0.015654537826776505,
-0.02024191804230213,
-0.21498098969459534,
-0.02206515520811081,
0.0307297445833683,
0.12941695749759674,
-0.017684750258922577,
0.10396631062030792,
-0.010228889063000679,
-0.005236118566244841,
0.07403544336557388,
0.000029118979000486434,
-0.03504656255245209,
-0.09080992639064789,
-0.025998810306191444,
-0.025739219039678574,
-0.05002981051802635,
-0.05900079756975174,
-0.10674583911895752,
-0.11585163325071335,
0.11343960464000702,
0.01860172115266323,
-0.01447988860309124,
-0.12113673985004425,
0.09700119495391846,
0.07841750234365463,
-0.07535451650619507,
0.041327331215143204,
0.03151172772049904,
0.05825412645936012,
0.03162510320544243,
-0.05837370082736015,
0.1170579344034195,
-0.06111983209848404,
-0.15937337279319763,
-0.05613251402974129,
0.09158256649971008,
0.051379863172769547,
0.05715322121977806,
-0.024234652519226074,
0.015338847413659096,
-0.01662769541144371,
-0.09233999252319336,
0.05401405692100525,
-0.04180057719349861,
0.06286649405956268,
0.009956397116184235,
-0.019821783527731895,
0.05227202549576759,
-0.056310418993234634,
-0.01219782792031765,
0.14759410917758942,
0.2859400510787964,
-0.08906932175159454,
0.014190033078193665,
0.015104818157851696,
-0.06532087177038193,
-0.1903555691242218,
0.07927858829498291,
0.058426305651664734,
-0.00036695797462016344,
0.08805356174707413,
-0.16665244102478027,
0.09746026247739792,
0.10335470736026764,
0.00010310610377928242,
0.11318225413560867,
-0.3682861626148224,
-0.12823761999607086,
0.07982020080089569,
0.19185775518417358,
0.07535739243030548,
-0.15506905317306519,
0.001190862967632711,
-0.002242852933704853,
-0.14933304488658905,
0.09208282828330994,
-0.07634279131889343,
0.13554194569587708,
-0.019897058606147766,
0.08622775971889496,
0.01612669788300991,
-0.06148972734808922,
0.12291643023490906,
-0.004599647130817175,
0.1398308277130127,
-0.06937699019908905,
-0.03888072073459625,
0.05356992781162262,
-0.037859052419662476,
-0.012833229266107082,
-0.047298576682806015,
0.02697151154279709,
-0.06029711663722992,
-0.012618509121239185,
-0.10466840863227844,
0.012930364347994328,
-0.0388936810195446,
-0.06769615411758423,
-0.04555997997522354,
0.04402066767215729,
0.04539979249238968,
-0.00353108998388052,
0.15231339633464813,
-0.010838954709470272,
0.1136644184589386,
0.048744384199380875,
0.06046447157859802,
-0.06204691156744957,
-0.10752249509096146,
-0.01850900426506996,
0.00833151489496231,
0.04758154973387718,
-0.13260920345783234,
0.015375776216387749,
0.15228407084941864,
0.04943443462252617,
0.12221904844045639,
0.08648478984832764,
-0.03246122971177101,
0.03208085149526596,
0.06972602009773254,
-0.15693266689777374,
-0.11227903515100479,
0.0013418900780379772,
-0.06658440828323364,
-0.07313991338014603,
0.052583206444978714,
0.07773560285568237,
-0.07507479190826416,
0.01205501053482294,
-0.0066259209997951984,
0.006797967478632927,
-0.06821276992559433,
0.20487575232982635,
0.05566054582595825,
0.041133441030979156,
-0.10329809039831161,
0.07347003370523453,
0.017363468185067177,
-0.08510752022266388,
-0.0014960605185478926,
0.09201300144195557,
-0.06920777261257172,
-0.02483738213777542,
0.08131858706474304,
0.19032466411590576,
-0.07863075286149979,
-0.02270977571606636,
-0.15031938254833221,
-0.10656477510929108,
0.0695284977555275,
0.1841556876897812,
0.1004321277141571,
-0.007318377494812012,
-0.05236385762691498,
0.047397125512361526,
-0.11662990599870682,
0.07693072408437729,
0.023970171809196472,
0.08136561512947083,
-0.14991366863250732,
0.18174730241298676,
0.011687264777719975,
0.055347736924886703,
-0.026145366951823235,
0.03271676227450371,
-0.1190951019525528,
0.040742430835962296,
-0.11505230516195297,
-0.036935433745384216,
-0.01618773490190506,
0.004669161979109049,
-0.013356480747461319,
-0.062012411653995514,
-0.06344255059957504,
0.005434367805719376,
-0.1271473914384842,
-0.02260776422917843,
0.0452495701611042,
0.02267582342028618,
-0.12671732902526855,
-0.0390266589820385,
0.027462095022201538,
-0.0639568567276001,
0.05608043447136879,
0.035452939569950104,
0.013840378262102604,
0.06580358743667603,
-0.1730583906173706,
-0.02282106690108776,
0.0703514814376831,
-0.007203318178653717,
0.06232687085866928,
-0.0366305448114872,
-0.026281604543328285,
-0.02910427749156952,
0.08718588948249817,
0.01298639364540577,
0.06237444281578064,
-0.13668407499790192,
0.006213736720383167,
-0.03299032524228096,
-0.09234659373760223,
-0.05839794874191284,
0.053035344928503036,
0.06233586370944977,
0.03737616166472435,
0.16254039108753204,
-0.08333157747983932,
0.044984035193920135,
-0.21841612458229065,
-0.0164665337651968,
0.0018833264475688338,
-0.10650001466274261,
-0.08360176533460617,
-0.07185343652963638,
0.0824684351682663,
-0.07519587874412537,
0.11156070232391357,
0.03669772669672966,
0.06490853428840637,
0.031698670238256454,
-0.03261697664856911,
-0.0020481105893850327,
0.03388416767120361,
0.210927352309227,
0.011350899003446102,
-0.032606806606054306,
0.08931900560855865,
0.07884462922811508,
0.1002785712480545,
0.13491471111774445,
0.22851504385471344,
0.15499670803546906,
-0.026361871510744095,
0.08939691632986069,
0.05229821056127548,
-0.06479450315237045,
-0.17212559282779694,
0.03481404483318329,
-0.05090286582708359,
0.09775900095701218,
-0.061160873621702194,
0.20184721052646637,
0.08728577196598053,
-0.18398678302764893,
0.06550253927707672,
-0.0451340451836586,
-0.10112285614013672,
-0.08130311965942383,
-0.03782977536320686,
-0.06964492052793503,
-0.1480770856142044,
0.025917263701558113,
-0.10287830978631973,
0.042506083846092224,
0.15076063573360443,
0.010468785651028156,
-0.012213587760925293,
0.2135465294122696,
0.033625487238168716,
0.035995982587337494,
0.057313088327646255,
0.014271584339439869,
-0.028803816065192223,
-0.09212145209312439,
-0.06047889217734337,
0.017424648627638817,
-0.029513197019696236,
0.018649844452738762,
-0.06916222721338272,
-0.07753527164459229,
0.027022233232855797,
0.0048261722549796104,
-0.09392049908638,
0.02354523167014122,
0.02049095183610916,
0.09011025726795197,
0.02770555019378662,
0.006251844577491283,
0.01624777726829052,
-0.028084497898817062,
0.24562418460845947,
-0.09299247711896896,
-0.08212646096944809,
-0.08260203152894974,
0.21761731803417206,
0.031158512458205223,
0.0013461606577038765,
0.008507671765983105,
-0.08096644282341003,
0.008408990688621998,
0.22978870570659637,
0.1737034022808075,
-0.13272763788700104,
-0.010249785147607327,
-0.00010265110176987946,
0.0016836047871038318,
-0.02940971404314041,
0.11851155757904053,
0.12115024030208588,
0.052382346242666245,
-0.11461067944765091,
-0.052260637283325195,
-0.052412550896406174,
-0.018258582800626755,
-0.026400191709399223,
0.04908032342791557,
0.06722354888916016,
0.022531850263476372,
-0.06953738629817963,
0.0760277658700943,
-0.05932505056262016,
-0.14101992547512054,
0.1048654168844223,
-0.22703136503696442,
-0.15631406009197235,
-0.00711339246481657,
0.12067399173974991,
0.003429584437981248,
0.06026557832956314,
-0.041714973747730255,
0.0010934170568361878,
0.05072740465402603,
-0.004878721199929714,
-0.079104945063591,
-0.10343799740076065,
0.08524784445762634,
-0.11356052756309509,
0.21715715527534485,
-0.05884687602519989,
0.03447098657488823,
0.11297964304685593,
0.0634879395365715,
-0.04997463896870613,
0.05654465779662132,
0.04185411334037781,
-0.12334147840738297,
-0.0044664726592600346,
0.12353415787220001,
-0.03904144465923309,
0.05490783974528313,
0.03297015652060509,
-0.13199153542518616,
0.032621726393699646,
-0.05498421564698219,
-0.0407620370388031,
-0.027256276458501816,
-0.050684232264757156,
-0.0640455111861229,
0.11558375507593155,
0.20812341570854187,
-0.008313491009175777,
0.024143340066075325,
-0.0869632437825203,
0.017221445217728615,
0.06573953479528427,
0.04842563346028328,
-0.07829015702009201,
-0.2159627079963684,
0.006738430354744196,
0.06964126974344254,
-0.04179685190320015,
-0.2051277607679367,
-0.11257582902908325,
0.03696665167808533,
-0.05386553704738617,
-0.07216659933328629,
0.09070315957069397,
0.08999166637659073,
0.05633538216352463,
-0.054740212857723236,
-0.10217511653900146,
-0.059030719101428986,
0.17010535299777985,
-0.14732299745082855,
-0.07690544426441193
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_3
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a note on how the mean and overall metrics relate follows the list):
- Loss: 0.3666
- Mean Iou: 0.6400
- Mean Accuracy: 0.7120
- Overall Accuracy: 0.9610
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4404
- Accuracy Undropoff: 0.9836
- Iou Unlabeled: nan
- Iou Dropoff: 0.3196
- Iou Undropoff: 0.9603
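
The gap between Overall Accuracy (0.9610) and Mean Accuracy (0.7120) is a class-imbalance effect: overall accuracy weights each class by its pixel count, while mean accuracy averages the per-class recalls. Treating the reported values as exact, the dropoff pixel fraction can even be estimated from them; this is a rough back-of-the-envelope figure, not a statistic from the dataset card.

```python
# Rough estimate of the dropoff pixel fraction p from the reported numbers,
# using: overall = p * acc_dropoff + (1 - p) * acc_undropoff  (assumption).
acc_dropoff, acc_undropoff, overall = 0.4404, 0.9836, 0.9610

p = (acc_undropoff - overall) / (acc_undropoff - acc_dropoff)
print(f"estimated dropoff pixel fraction: {p:.3f}")  # roughly 0.04

# Mean accuracy ignores the imbalance entirely:
mean_accuracy = (acc_dropoff + acc_undropoff) / 2  # ~0.712
```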
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged optimizer/scheduler sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
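
With `warmup_ratio: 0.05` and the roughly 240 optimizer steps implied by the results table below (2 steps per epoch over 120 epochs), the linear schedule warms up for about 12 steps and then decays to zero. The sketch below is a hedged equivalent built from PyTorch and the `get_linear_schedule_with_warmup` helper; the stand-in model and the step count are assumptions, not taken from the original script.

```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Conv2d(4, 3, kernel_size=1)  # stand-in for the SegFormer model

# Assumption: 240 total optimizer steps (2 per epoch x 120 epochs, as in the table).
num_training_steps = 240
num_warmup_steps = int(0.05 * num_training_steps)  # 12 warmup steps

optimizer = torch.optim.Adam(model.parameters(), lr=4e-5,
                             betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=num_warmup_steps,
    num_training_steps=num_training_steps,
)
# Inside the training loop: optimizer.step(); scheduler.step()
```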
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0352 | 5.0 | 10 | 1.0676 | 0.2560 | 0.5776 | 0.7142 | nan | 0.4286 | 0.7266 | 0.0 | 0.0589 | 0.7090 |
| 0.9564 | 10.0 | 20 | 0.9743 | 0.3355 | 0.5576 | 0.9248 | nan | 0.1571 | 0.9581 | 0.0 | 0.0822 | 0.9243 |
| 0.8577 | 15.0 | 30 | 0.8504 | 0.3318 | 0.5283 | 0.9409 | nan | 0.0782 | 0.9784 | 0.0 | 0.0545 | 0.9407 |
| 0.7512 | 20.0 | 40 | 0.6972 | 0.3270 | 0.5122 | 0.9527 | nan | 0.0318 | 0.9926 | 0.0 | 0.0283 | 0.9526 |
| 0.6955 | 25.0 | 50 | 0.5761 | 0.3259 | 0.5099 | 0.9545 | nan | 0.0250 | 0.9948 | 0.0 | 0.0234 | 0.9544 |
| 0.6691 | 30.0 | 60 | 0.5209 | 0.3360 | 0.5271 | 0.9525 | nan | 0.0632 | 0.9911 | 0.0 | 0.0557 | 0.9524 |
| 0.626 | 35.0 | 70 | 0.5297 | 0.3408 | 0.5362 | 0.9505 | nan | 0.0844 | 0.9881 | 0.0 | 0.0719 | 0.9503 |
| 0.5544 | 40.0 | 80 | 0.5263 | 0.3616 | 0.5757 | 0.9521 | nan | 0.1652 | 0.9862 | 0.0 | 0.1330 | 0.9518 |
| 0.5316 | 45.0 | 90 | 0.4825 | 0.3836 | 0.6353 | 0.9506 | nan | 0.2915 | 0.9792 | 0.0 | 0.2009 | 0.9500 |
| 0.4929 | 50.0 | 100 | 0.4763 | 0.3958 | 0.6588 | 0.9530 | nan | 0.3378 | 0.9797 | 0.0 | 0.2352 | 0.9524 |
| 0.468 | 55.0 | 110 | 0.4583 | 0.4077 | 0.6974 | 0.9528 | nan | 0.4188 | 0.9759 | 0.0 | 0.2713 | 0.9519 |
| 0.429 | 60.0 | 120 | 0.4268 | 0.3985 | 0.6526 | 0.9575 | nan | 0.3199 | 0.9852 | 0.0 | 0.2386 | 0.9569 |
| 0.4211 | 65.0 | 130 | 0.3988 | 0.3951 | 0.6406 | 0.9584 | nan | 0.2939 | 0.9872 | 0.0 | 0.2275 | 0.9578 |
| 0.3926 | 70.0 | 140 | 0.4085 | 0.4102 | 0.6780 | 0.9587 | nan | 0.3718 | 0.9842 | 0.0 | 0.2726 | 0.9581 |
| 0.4006 | 75.0 | 150 | 0.3944 | 0.6077 | 0.6574 | 0.9604 | nan | 0.3269 | 0.9879 | nan | 0.2555 | 0.9599 |
| 0.3978 | 80.0 | 160 | 0.3881 | 0.6216 | 0.6875 | 0.9591 | nan | 0.3912 | 0.9838 | nan | 0.2848 | 0.9585 |
| 0.3553 | 85.0 | 170 | 0.3877 | 0.6333 | 0.7077 | 0.9595 | nan | 0.4329 | 0.9824 | nan | 0.3079 | 0.9588 |
| 0.3637 | 90.0 | 180 | 0.4004 | 0.6428 | 0.7273 | 0.9594 | nan | 0.4741 | 0.9805 | nan | 0.3270 | 0.9586 |
| 0.3416 | 95.0 | 190 | 0.3835 | 0.6403 | 0.7166 | 0.9604 | nan | 0.4507 | 0.9825 | nan | 0.3210 | 0.9596 |
| 0.342 | 100.0 | 200 | 0.3634 | 0.6371 | 0.7061 | 0.9611 | nan | 0.4279 | 0.9842 | nan | 0.3137 | 0.9604 |
| 0.3393 | 105.0 | 210 | 0.3740 | 0.6429 | 0.7217 | 0.9604 | nan | 0.4614 | 0.9820 | nan | 0.3262 | 0.9596 |
| 0.3535 | 110.0 | 220 | 0.3771 | 0.6423 | 0.7199 | 0.9605 | nan | 0.4575 | 0.9823 | nan | 0.3249 | 0.9597 |
| 0.3159 | 115.0 | 230 | 0.3710 | 0.6423 | 0.7167 | 0.9610 | nan | 0.4502 | 0.9832 | nan | 0.3243 | 0.9603 |
| 0.3278 | 120.0 | 240 | 0.3666 | 0.6400 | 0.7120 | 0.9610 | nan | 0.4404 | 0.9836 | nan | 0.3196 | 0.9603 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_3", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_3 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:47+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_3
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3666
* Mean Iou: 0.6400
* Mean Accuracy: 0.7120
* Overall Accuracy: 0.9610
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.4404
* Accuracy Undropoff: 0.9836
* Iou Unlabeled: nan
* Iou Dropoff: 0.3196
* Iou Undropoff: 0.9603
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 4e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10658146440982819,
0.035669438540935516,
-0.0020039125811308622,
0.11685964465141296,
0.17215201258659363,
0.02884974703192711,
0.11566212028265,
0.11502735316753387,
-0.10987032204866409,
0.03101789392530918,
0.10427012294530869,
0.1415839046239853,
0.016186604276299477,
0.09467508643865585,
-0.019878871738910675,
-0.3065880835056305,
-0.025919364765286446,
0.03165772557258606,
-0.0858822911977768,
0.12480808794498444,
0.06456383317708969,
-0.1625397652387619,
0.0926043912768364,
-0.003477038349956274,
-0.220951646566391,
0.015969978645443916,
-0.004929928574711084,
-0.031147846952080727,
0.1602516621351242,
0.023593798279762268,
0.11601641774177551,
0.009395374916493893,
0.11411872506141663,
-0.20192860066890717,
0.017695069313049316,
0.05553748458623886,
-0.004535375628620386,
0.06757364422082901,
0.06559345871210098,
0.0016771841328591108,
0.15083062648773193,
-0.10641025006771088,
0.06750890612602234,
0.0019922738429158926,
-0.14488519728183746,
-0.21351677179336548,
-0.07423392683267593,
0.0215129517018795,
0.07753857225179672,
0.09613748639822006,
-0.005231843795627356,
0.11164797842502594,
-0.0913282036781311,
0.11296182870864868,
0.2684515714645386,
-0.24460168182849884,
-0.08599363267421722,
0.03779495880007744,
0.0020909712184220552,
0.06767726689577103,
-0.13271833956241608,
0.009146027266979218,
0.033558476716279984,
0.04750565066933632,
0.11729356646537781,
-0.03352457284927368,
-0.0981115847826004,
0.02685215137898922,
-0.13827095925807953,
-0.03282301500439644,
0.05632453411817551,
0.05309121683239937,
-0.020698362961411476,
-0.032167769968509674,
-0.06772546470165253,
-0.18218281865119934,
-0.06635315716266632,
0.013187347911298275,
0.06682165712118149,
-0.06015392020344734,
-0.1151098906993866,
-0.015434838831424713,
-0.11033114045858383,
-0.08448340743780136,
-0.049377571791410446,
0.126863494515419,
0.034225862473249435,
0.019409719854593277,
-0.03406473994255066,
0.126541405916214,
-0.02682335488498211,
-0.13948729634284973,
0.017700234428048134,
0.029604820534586906,
-0.04233107343316078,
-0.031533133238554,
-0.049502238631248474,
-0.061885084956884384,
-0.012341060675680637,
0.10923565179109573,
-0.060915376991033554,
0.06760382652282715,
0.036113087087869644,
0.05050193890929222,
-0.11417386680841446,
0.1892005354166031,
-0.0673631951212883,
-0.009652645327150822,
-0.036654870957136154,
0.0578601211309433,
0.003771218005567789,
-0.022506538778543472,
-0.10580193996429443,
0.004559902939945459,
0.06964147090911865,
-0.008087985217571259,
-0.08784227073192596,
0.07040823996067047,
-0.03877219557762146,
-0.011323514394462109,
-0.00025712186470627785,
-0.07608281075954437,
0.045854054391384125,
-0.000280067470157519,
-0.08331041038036346,
-0.028731079772114754,
0.05140325427055359,
0.01446702890098095,
0.013788807205855846,
0.16469363868236542,
-0.08633553981781006,
0.06313348561525345,
-0.11189283430576324,
-0.10043271631002426,
0.00015224078379105777,
-0.08550585806369781,
0.03631260246038437,
-0.0787276178598404,
-0.14874786138534546,
-0.009284095838665962,
0.07184597849845886,
-0.04054278880357742,
0.002561682602390647,
-0.05245020240545273,
-0.09069453179836273,
0.002652528462931514,
-0.008491038344800472,
0.16328178346157074,
-0.06532365828752518,
0.12222342193126678,
0.038390107452869415,
0.0728745236992836,
-0.0649229884147644,
0.038883939385414124,
-0.0844988077878952,
0.01991860754787922,
-0.22143474221229553,
0.04263328015804291,
-0.05011102557182312,
0.06758804619312286,
-0.06022815778851509,
-0.12278299033641815,
0.008911032229661942,
0.002434871159493923,
0.09089520573616028,
0.10604644566774368,
-0.22339415550231934,
-0.07625037431716919,
0.1487499475479126,
-0.0727376863360405,
-0.09903831034898758,
0.1135200634598732,
-0.0636979341506958,
0.011958872899413109,
0.06152302771806717,
0.19833484292030334,
0.05317815765738487,
-0.13834945857524872,
0.022782571613788605,
-0.01584707200527191,
0.04916553571820259,
-0.026877213269472122,
0.05065161734819412,
0.022223301231861115,
0.08666347712278366,
0.019282536581158638,
-0.06597211956977844,
0.06781138479709625,
-0.1242222711443901,
-0.0972430631518364,
-0.02568771317601204,
-0.08649498969316483,
0.0426776260137558,
0.08892949670553207,
0.06140424683690071,
-0.10501088947057724,
-0.07866375148296356,
0.09079887717962265,
0.0760178491473198,
-0.06931324303150177,
0.03959835693240166,
-0.06546251475811005,
0.04437203332781792,
-0.017896618694067,
-0.03670214116573334,
-0.1744215190410614,
-0.025590581819415092,
-0.022514987736940384,
0.03472093865275383,
0.029349898919463158,
0.022920891642570496,
0.09203581511974335,
0.08950868248939514,
-0.0718412846326828,
-0.024757595732808113,
-0.06489964574575424,
0.0026228763163089752,
-0.1216009184718132,
-0.22827133536338806,
-0.04366660490632057,
-0.0074590472504496574,
0.08991631120443344,
-0.21361461281776428,
0.024146143347024918,
0.023443084210157394,
0.08791916817426682,
0.025406556203961372,
-0.030748547986149788,
-0.05308956280350685,
0.07669319957494736,
-0.010511049069464207,
-0.06548381596803665,
0.07005736231803894,
-0.004932784009724855,
-0.06763020902872086,
-0.05526914820075035,
-0.11265762150287628,
0.16437232494354248,
0.13382136821746826,
-0.14777810871601105,
-0.09259908646345139,
-0.012305393815040588,
-0.06415138393640518,
-0.03322165459394455,
-0.04286801069974899,
0.03845025599002838,
0.17977656424045563,
-0.00043587497202679515,
0.13320328295230865,
-0.060287002474069595,
-0.034524351358413696,
0.02935914881527424,
-0.027206216007471085,
0.028186596930027008,
0.1284797191619873,
0.12472444772720337,
-0.059950388967990875,
0.12348302453756332,
0.12353941053152084,
-0.08054402470588684,
0.148437038064003,
-0.033421535044908524,
-0.0801716074347496,
-0.018381161615252495,
-0.014644211158156395,
-0.008099651895463467,
0.17704647779464722,
-0.14947202801704407,
-0.017223678529262543,
-0.004145270679146051,
0.013554297387599945,
0.01501391176134348,
-0.25035515427589417,
-0.05597740039229393,
0.03943096101284027,
-0.044959377497434616,
-0.009295334108173847,
-0.024228107184171677,
-0.004166843369603157,
0.10450568795204163,
-0.006965349894016981,
-0.07491064816713333,
0.0014012844767421484,
-0.0073641203343868256,
-0.049170590937137604,
0.20663437247276306,
-0.05856961011886597,
-0.11987311393022537,
-0.08959413319826126,
-0.0782533586025238,
-0.03616559877991676,
0.0030046047177165747,
0.05865433067083359,
-0.10874352604150772,
-0.018855294212698936,
-0.05884005129337311,
0.019010337069630623,
0.0065961251966655254,
0.03687474876642227,
-0.0008830351871438324,
-0.008259141817688942,
0.056908681988716125,
-0.09632179886102676,
-0.009728285484015942,
-0.06558690220117569,
-0.05342559143900871,
0.05503804609179497,
0.05987169221043587,
0.14856834709644318,
0.13420040905475616,
-0.025031477212905884,
0.019986145198345184,
-0.03257414698600769,
0.25969305634498596,
-0.09545700252056122,
-0.026813829317688942,
0.1183260902762413,
-0.012326685711741447,
0.056062474846839905,
0.10712651908397675,
0.08308054506778717,
-0.10940790176391602,
-0.0022422904148697853,
0.06392207741737366,
-0.05178222432732582,
-0.15515881776809692,
-0.015016636811196804,
-0.0580604113638401,
-0.029967674985527992,
0.07712752372026443,
0.027168894186615944,
-0.0029335583094507456,
0.055631570518016815,
0.04859987273812294,
0.043913744390010834,
-0.02408657781779766,
0.0500451922416687,
0.08968091756105423,
0.031970661133527756,
0.10920767486095428,
-0.044749826192855835,
-0.06672646850347519,
0.03203762695193291,
0.003513665869832039,
0.2455667406320572,
-0.01568421721458435,
0.0977911651134491,
0.0738426074385643,
0.16219797730445862,
-0.011944914236664772,
0.047561705112457275,
-0.015501835383474827,
-0.06761094927787781,
-0.018721280619502068,
-0.04444349184632301,
-0.0157382320612669,
0.010553033091127872,
-0.05199524015188217,
0.038839492946863174,
-0.1264546513557434,
0.00944901816546917,
0.06809503585100174,
0.24973371624946594,
0.028157508000731468,
-0.31892719864845276,
-0.06589832901954651,
-0.005964553914964199,
-0.010706400498747826,
-0.008853519335389137,
0.006663851905614138,
0.15287598967552185,
-0.08215257525444031,
0.056425463408231735,
-0.08573628216981888,
0.08541277050971985,
-0.037609733641147614,
0.051372211426496506,
0.07843992859125137,
0.0735916718840599,
-0.0041280267760157585,
0.05546289682388306,
-0.28513994812965393,
0.3006385266780853,
0.001929833902977407,
0.08436381816864014,
-0.0638033077120781,
-0.03201325610280037,
0.03342399746179581,
0.08278179913759232,
0.0871378555893898,
-0.015363037586212158,
-0.022138895466923714,
-0.21433746814727783,
-0.02165895886719227,
0.03140702843666077,
0.12909568846225739,
-0.017799507826566696,
0.10436473041772842,
-0.009726504795253277,
-0.005925692152231932,
0.07379692792892456,
-0.002254476537927985,
-0.03432253748178482,
-0.09086120128631592,
-0.026098381727933884,
-0.024717407301068306,
-0.04887542873620987,
-0.05863429978489876,
-0.10661199688911438,
-0.11553742736577988,
0.1132572740316391,
0.01793304644525051,
-0.01467480044811964,
-0.12101283669471741,
0.09746942669153214,
0.07876796275377274,
-0.07539597153663635,
0.04172446206212044,
0.03201085329055786,
0.058279067277908325,
0.032394517213106155,
-0.057919081300497055,
0.11758577823638916,
-0.06038132682442665,
-0.15965056419372559,
-0.05620008334517479,
0.09187466651201248,
0.051011454313993454,
0.056726641952991486,
-0.024248244240880013,
0.015896758064627647,
-0.01787342131137848,
-0.09210817515850067,
0.05500771105289459,
-0.04363706707954407,
0.06390745937824249,
0.008770408108830452,
-0.020376836881041527,
0.05279978737235069,
-0.05568188428878784,
-0.012032723985612392,
0.14663517475128174,
0.28503867983818054,
-0.08939952403306961,
0.013615978881716728,
0.01535139698535204,
-0.0654783770442009,
-0.18999657034873962,
0.07933976501226425,
0.0579763762652874,
0.00002422380930511281,
0.08615142852067947,
-0.16788651049137115,
0.09845058619976044,
0.10247883945703506,
0.0004526855773292482,
0.11708319187164307,
-0.36628398299217224,
-0.12810713052749634,
0.07948003709316254,
0.19083461165428162,
0.07731244713068008,
-0.15516233444213867,
0.0005531265633180737,
-0.002273940946906805,
-0.14934523403644562,
0.09085600078105927,
-0.0786498486995697,
0.1353216916322708,
-0.019727256149053574,
0.08751556277275085,
0.016125718131661415,
-0.06108447536826134,
0.12314268946647644,
-0.0033595424611121416,
0.13994896411895752,
-0.06920597702264786,
-0.03951369598507881,
0.05337483808398247,
-0.037535686045885086,
-0.013509167358279228,
-0.04688244313001633,
0.026739850640296936,
-0.06138315051794052,
-0.012129154987633228,
-0.10458101332187653,
0.012913920916616917,
-0.03887468948960304,
-0.06692971289157867,
-0.045523807406425476,
0.04340113699436188,
0.04486192390322685,
-0.0033528239000588655,
0.1511673480272293,
-0.010346625000238419,
0.11297435313463211,
0.04863670840859413,
0.058815788477659225,
-0.0614674836397171,
-0.10833916068077087,
-0.017572570592164993,
0.008455571718513966,
0.04835299029946327,
-0.134717658162117,
0.015401676297187805,
0.1527244746685028,
0.04964559152722359,
0.12203241139650345,
0.08634509146213531,
-0.03190410137176514,
0.032097794115543365,
0.06967717409133911,
-0.15787102282047272,
-0.11307202279567719,
0.0015649113338440657,
-0.06960689276456833,
-0.07253875583410263,
0.052326902747154236,
0.07823316007852554,
-0.07522716373205185,
0.012293062172830105,
-0.006738957017660141,
0.006383191328495741,
-0.06803685426712036,
0.2048560082912445,
0.05601721256971359,
0.0411628820002079,
-0.10404356569051743,
0.07304946333169937,
0.01864413172006607,
-0.08673650026321411,
-0.0016025063814595342,
0.09162192046642303,
-0.06982506811618805,
-0.025159498676657677,
0.08047877252101898,
0.19217449426651,
-0.07691860944032669,
-0.023474272340536118,
-0.15054762363433838,
-0.10662856698036194,
0.06964197754859924,
0.18315599858760834,
0.09984281659126282,
-0.007120213005691767,
-0.05231857672333717,
0.04733284190297127,
-0.11738499253988266,
0.07766234874725342,
0.024250589311122894,
0.08143247663974762,
-0.1496332287788391,
0.18258419632911682,
0.011344808153808117,
0.055744413286447525,
-0.02617131546139717,
0.03271980956196785,
-0.1190931499004364,
0.04017581790685654,
-0.11559398472309113,
-0.03597123548388481,
-0.015160761773586273,
0.0048886863514781,
-0.013553815893828869,
-0.06198248267173767,
-0.06339488178491592,
0.005495542194694281,
-0.1274556666612625,
-0.022396022453904152,
0.04604620113968849,
0.023082537576556206,
-0.12620069086551666,
-0.03910963982343674,
0.027426553890109062,
-0.0642177015542984,
0.05631328746676445,
0.036205653101205826,
0.014128121547400951,
0.06566578894853592,
-0.17037418484687805,
-0.022609803825616837,
0.0699119120836258,
-0.00740792928263545,
0.06315690279006958,
-0.036673806607723236,
-0.026248754933476448,
-0.029956303536891937,
0.08741764724254608,
0.012508689425885677,
0.06275051087141037,
-0.1368410289287567,
0.006079390179365873,
-0.03256414830684662,
-0.09205051511526108,
-0.058765944093465805,
0.05273941159248352,
0.06175496429204941,
0.036587923765182495,
0.1622702032327652,
-0.08313591033220291,
0.04461469501256943,
-0.2187720090150833,
-0.01667710393667221,
0.00166800944134593,
-0.10752339661121368,
-0.08155331015586853,
-0.07241737842559814,
0.08283813297748566,
-0.07580270618200302,
0.11159209907054901,
0.036595314741134644,
0.0650658831000328,
0.031223811209201813,
-0.03290214762091637,
-0.0044968463480472565,
0.03419122099876404,
0.21078522503376007,
0.011388846673071384,
-0.033021923154592514,
0.08832283318042755,
0.07899310439825058,
0.1000763401389122,
0.1362074464559555,
0.22669437527656555,
0.15417030453681946,
-0.024831246584653854,
0.08922412991523743,
0.05188487470149994,
-0.06527955830097198,
-0.17315673828125,
0.037143219262361526,
-0.05201854556798935,
0.09902139753103256,
-0.061322543770074844,
0.20158584415912628,
0.08701613545417786,
-0.1831216663122177,
0.06587304919958115,
-0.04561259225010872,
-0.1014954224228859,
-0.08040817826986313,
-0.03648768737912178,
-0.07010755687952042,
-0.1484937220811844,
0.025740694254636765,
-0.10247522592544556,
0.04295794293284416,
0.15045328438282013,
0.009956915862858295,
-0.01276398915797472,
0.2138761281967163,
0.03407645970582962,
0.035998281091451645,
0.057694751769304276,
0.014688142575323582,
-0.0293678380548954,
-0.09120018035173416,
-0.06057210639119148,
0.017557155340909958,
-0.03006577491760254,
0.01803506352007389,
-0.06905519217252731,
-0.07702520489692688,
0.0273517444729805,
0.005127053242176771,
-0.09353002160787582,
0.023289497941732407,
0.021120117977261543,
0.09051118791103363,
0.02638629823923111,
0.005916237365454435,
0.016057213768363,
-0.028504427522420883,
0.24606244266033173,
-0.09304593503475189,
-0.08128537982702255,
-0.08253912627696991,
0.21659664809703827,
0.030767302960157394,
0.0007105515105649829,
0.00831757765263319,
-0.08113142102956772,
0.00970269925892353,
0.2310786098241806,
0.17343589663505554,
-0.13225895166397095,
-0.010611243546009064,
0.0002560832945164293,
0.001492669922299683,
-0.028985802084207535,
0.11866889894008636,
0.12088412046432495,
0.052077341824769974,
-0.11458968371152878,
-0.05332330986857414,
-0.052946239709854126,
-0.018273336812853813,
-0.027334434911608696,
0.0507839098572731,
0.0665673092007637,
0.022280029952526093,
-0.06943082064390182,
0.07579857110977173,
-0.058149948716163635,
-0.1435275822877884,
0.10652757436037064,
-0.22657154500484467,
-0.15662455558776855,
-0.00713339913636446,
0.1215621829032898,
0.0040944707579910755,
0.06001145392656326,
-0.04197164624929428,
0.0006273824255913496,
0.05277416855096817,
-0.005199987906962633,
-0.0790577381849289,
-0.10326632112264633,
0.08467258512973785,
-0.11619342863559723,
0.21695835888385773,
-0.05888678878545761,
0.034678004682064056,
0.11300665885210037,
0.06376772373914719,
-0.05027712136507034,
0.05593922734260559,
0.04156490042805672,
-0.12435542792081833,
-0.00418570451438427,
0.12413641065359116,
-0.03889516368508339,
0.05342726781964302,
0.03280099108815193,
-0.1310064047574997,
0.032207198441028595,
-0.05495760589838028,
-0.041245948523283005,
-0.027092520147562027,
-0.051688309758901596,
-0.06395041197538376,
0.11518609523773193,
0.208700031042099,
-0.008517570793628693,
0.024571888148784637,
-0.08686932176351547,
0.016529284417629242,
0.06649183481931686,
0.04755489155650139,
-0.07823097705841064,
-0.21562783420085907,
0.006792236119508743,
0.07148145139217377,
-0.041370224207639694,
-0.2050217092037201,
-0.11188836395740509,
0.03653312474489212,
-0.05365889146924019,
-0.07168535888195038,
0.09085071831941605,
0.09053772687911987,
0.05624939501285553,
-0.05486525222659111,
-0.10296052694320679,
-0.05846158042550087,
0.17006079852581024,
-0.14712511003017426,
-0.07775804400444031
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_4
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a short sanity-check sketch follows this list):
- Loss: 0.3688
- Mean Iou: 0.3485
- Mean Accuracy: 0.5433
- Overall Accuracy: 0.9606
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0881
- Accuracy Undropoff: 0.9984
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.0851
- Iou Undropoff: 0.9604
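As a quick sanity check (not part of the original card), the headline numbers above appear to be plain means of the per-class values, with the unlabeled class contributing an IoU of 0.0 and its accuracy (nan) being skipped:
```python
# Illustrative check only; the values are copied from the list above.
iou = {"unlabeled": 0.0, "dropoff": 0.0851, "undropoff": 0.9604}
acc = {"dropoff": 0.0881, "undropoff": 0.9984}   # accuracy for "unlabeled" is nan, so it is skipped

mean_iou = sum(iou.values()) / len(iou)          # ≈ 0.3485, matching "Mean Iou"
mean_accuracy = sum(acc.values()) / len(acc)     # ≈ 0.5433, matching "Mean Accuracy"
print(mean_iou, mean_accuracy)
```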
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
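The card does not include the training script, but the settings above map naturally onto `transformers.TrainingArguments`. The sketch below is only an illustration under that assumption: `output_dir` and anything not listed above are placeholders, not taken from the card.
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_4",  # placeholder, not from the card
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```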
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.2008 | 5.0 | 10 | 1.0960 | 0.1205 | 0.4461 | 0.2825 | nan | 0.6246 | 0.2677 | 0.0 | 0.0943 | 0.2671 |
| 1.0485 | 10.0 | 20 | 1.0952 | 0.1603 | 0.6272 | 0.4049 | nan | 0.8696 | 0.3848 | 0.0 | 0.0965 | 0.3843 |
| 0.9156 | 15.0 | 30 | 1.0312 | 0.3080 | 0.5963 | 0.8333 | nan | 0.3377 | 0.8548 | 0.0 | 0.0924 | 0.8317 |
| 0.7435 | 20.0 | 40 | 0.9448 | 0.3221 | 0.5508 | 0.8937 | nan | 0.1769 | 0.9248 | 0.0 | 0.0733 | 0.8930 |
| 0.7336 | 25.0 | 50 | 0.7446 | 0.3191 | 0.4998 | 0.9461 | nan | 0.0129 | 0.9866 | 0.0 | 0.0113 | 0.9461 |
| 0.6585 | 30.0 | 60 | 0.6397 | 0.3183 | 0.4981 | 0.9534 | nan | 0.0014 | 0.9948 | 0.0 | 0.0013 | 0.9534 |
| 0.583 | 35.0 | 70 | 0.5785 | 0.3181 | 0.4978 | 0.9537 | nan | 0.0006 | 0.9951 | 0.0 | 0.0005 | 0.9537 |
| 0.5324 | 40.0 | 80 | 0.5458 | 0.3182 | 0.4980 | 0.9545 | nan | 0.0002 | 0.9958 | 0.0 | 0.0002 | 0.9545 |
| 0.5155 | 45.0 | 90 | 0.5347 | 0.3186 | 0.4987 | 0.9558 | nan | 0.0001 | 0.9973 | 0.0 | 0.0001 | 0.9558 |
| 0.4874 | 50.0 | 100 | 0.4954 | 0.3179 | 0.4976 | 0.9537 | nan | 0.0 | 0.9951 | 0.0 | 0.0 | 0.9537 |
| 0.4716 | 55.0 | 110 | 0.4646 | 0.3185 | 0.4985 | 0.9555 | nan | 0.0 | 0.9969 | 0.0 | 0.0 | 0.9555 |
| 0.4441 | 60.0 | 120 | 0.4426 | 0.3185 | 0.4985 | 0.9555 | nan | 0.0 | 0.9970 | 0.0 | 0.0 | 0.9555 |
| 0.4659 | 65.0 | 130 | 0.4345 | 0.3189 | 0.4991 | 0.9567 | nan | 0.0 | 0.9982 | 0.0 | 0.0 | 0.9567 |
| 0.4758 | 70.0 | 140 | 0.4221 | 0.3181 | 0.4978 | 0.9543 | nan | 0.0 | 0.9957 | 0.0 | 0.0 | 0.9543 |
| 0.4208 | 75.0 | 150 | 0.4029 | 0.3190 | 0.4993 | 0.9571 | nan | 0.0 | 0.9987 | 0.0 | 0.0 | 0.9571 |
| 0.4395 | 80.0 | 160 | 0.4170 | 0.3207 | 0.5016 | 0.9559 | nan | 0.0062 | 0.9971 | 0.0 | 0.0062 | 0.9559 |
| 0.3981 | 85.0 | 170 | 0.3992 | 0.3214 | 0.5027 | 0.9574 | nan | 0.0067 | 0.9987 | 0.0 | 0.0066 | 0.9574 |
| 0.3983 | 90.0 | 180 | 0.3965 | 0.3282 | 0.5125 | 0.9560 | nan | 0.0288 | 0.9963 | 0.0 | 0.0285 | 0.9560 |
| 0.398 | 95.0 | 190 | 0.3747 | 0.3272 | 0.5112 | 0.9569 | nan | 0.0251 | 0.9973 | 0.0 | 0.0249 | 0.9568 |
| 0.3767 | 100.0 | 200 | 0.3722 | 0.3301 | 0.5155 | 0.9574 | nan | 0.0336 | 0.9975 | 0.0 | 0.0330 | 0.9573 |
| 0.3797 | 105.0 | 210 | 0.3781 | 0.3334 | 0.5204 | 0.9583 | nan | 0.0429 | 0.9980 | 0.0 | 0.0420 | 0.9582 |
| 0.373 | 110.0 | 220 | 0.3744 | 0.3409 | 0.5317 | 0.9593 | nan | 0.0654 | 0.9980 | 0.0 | 0.0636 | 0.9591 |
| 0.372 | 115.0 | 230 | 0.3700 | 0.3440 | 0.5364 | 0.9599 | nan | 0.0746 | 0.9983 | 0.0 | 0.0723 | 0.9598 |
| 0.3629 | 120.0 | 240 | 0.3688 | 0.3485 | 0.5433 | 0.9606 | nan | 0.0881 | 0.9984 | 0.0 | 0.0851 | 0.9604 |
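The card does not show how the per-class IoU and accuracy columns in the table above were computed. A common way to obtain them for semantic segmentation is from a pixel-level confusion matrix; the sketch below is a generic illustration with made-up counts, not the project's actual evaluation code.
```python
import numpy as np

def per_class_metrics(conf):
    """conf[i, j] counts pixels of true class i predicted as class j."""
    tp = np.diag(conf).astype(float)
    fp = conf.sum(axis=0) - tp
    fn = conf.sum(axis=1) - tp
    iou = tp / (tp + fp + fn)           # per-class intersection over union
    accuracy = tp / conf.sum(axis=1)    # per-class (recall-style) accuracy
    return iou, accuracy

# Toy 2-class example: rows/cols ordered as [dropoff, undropoff].
conf = np.array([[   90,   910],
                 [  160, 98840]])
iou, acc = per_class_metrics(conf)
overall_acc = np.diag(conf).sum() / conf.sum()
print(iou.mean(), acc.mean(), overall_acc)
```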
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_4", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_4 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:49+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_4
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3688
* Mean Iou: 0.3485
* Mean Accuracy: 0.5433
* Overall Accuracy: 0.9606
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.0881
* Accuracy Undropoff: 0.9984
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.0851
* Iou Undropoff: 0.9604
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10649233311414719,
0.03541902080178261,
-0.002008110284805298,
0.11587068438529968,
0.17249611020088196,
0.02851674146950245,
0.11544682830572128,
0.11457392573356628,
-0.10972921550273895,
0.031442414969205856,
0.10455724596977234,
0.1422630250453949,
0.01651071570813656,
0.09646012634038925,
-0.020534632727503777,
-0.30654653906822205,
-0.025566929951310158,
0.031897399574518204,
-0.085641048848629,
0.12503349781036377,
0.0653185248374939,
-0.1630534678697586,
0.09219326823949814,
-0.003469406859949231,
-0.22116395831108093,
0.016427848488092422,
-0.004900987725704908,
-0.031342100352048874,
0.1595771759748459,
0.023765306919813156,
0.11590364575386047,
0.009380001574754715,
0.11358951032161713,
-0.20307078957557678,
0.017782283946871758,
0.05622512102127075,
-0.0043406435288488865,
0.06756623834371567,
0.06620211154222488,
0.0016262467252090573,
0.15138760209083557,
-0.10710099339485168,
0.06757436692714691,
0.00219079852104187,
-0.14455656707286835,
-0.21195274591445923,
-0.07354521006345749,
0.022106915712356567,
0.0785149559378624,
0.09556274861097336,
-0.005519969388842583,
0.11230214685201645,
-0.09189144521951675,
0.11340049654245377,
0.2702406644821167,
-0.24366013705730438,
-0.08655690401792526,
0.03894307091832161,
0.002010022522881627,
0.06756281107664108,
-0.13231836259365082,
0.008765911683440208,
0.03365390747785568,
0.046433039009571075,
0.11845893412828445,
-0.03324741870164871,
-0.09865421056747437,
0.02689301408827305,
-0.13883599638938904,
-0.03352012485265732,
0.05586788058280945,
0.053482215851545334,
-0.020159399136900902,
-0.03282769396901131,
-0.06759633123874664,
-0.18260620534420013,
-0.06595168262720108,
0.011947227641940117,
0.06682480126619339,
-0.060095276683568954,
-0.11392708122730255,
-0.0157875157892704,
-0.11059167236089706,
-0.08550330996513367,
-0.04916469380259514,
0.12676334381103516,
0.03432220220565796,
0.01899923011660576,
-0.03564015403389931,
0.12657660245895386,
-0.028372230008244514,
-0.1392209231853485,
0.01717406138777733,
0.028990289196372032,
-0.043101582676172256,
-0.03224371373653412,
-0.04957793653011322,
-0.06523070484399796,
-0.01213329192250967,
0.10851341485977173,
-0.059460222721099854,
0.06760929524898529,
0.03599978983402252,
0.05042739585042,
-0.11337345093488693,
0.18941862881183624,
-0.06706184148788452,
-0.009321819990873337,
-0.03720396012067795,
0.05813972279429436,
0.0031261146068573,
-0.022109270095825195,
-0.10478325188159943,
0.003962255083024502,
0.0705074816942215,
-0.008451290428638458,
-0.0875946432352066,
0.06957915425300598,
-0.039526890963315964,
-0.0115347383543849,
-0.0019373028771951795,
-0.07579585909843445,
0.045837875455617905,
-0.0002824731345754117,
-0.08342599123716354,
-0.028811132535338402,
0.05099841579794884,
0.014591277576982975,
0.01335973385721445,
0.16602031886577606,
-0.08670076727867126,
0.06328269839286804,
-0.11252683401107788,
-0.1004817932844162,
0.00011045288556488231,
-0.08778220415115356,
0.0363481231033802,
-0.0785403698682785,
-0.14912283420562744,
-0.009210201911628246,
0.07156404107809067,
-0.04026268422603607,
0.002490660175681114,
-0.053450807929039,
-0.09049218147993088,
0.0022155873011797667,
-0.008602695539593697,
0.16406433284282684,
-0.06495238840579987,
0.12248161435127258,
0.03826385736465454,
0.07250885665416718,
-0.0641457810997963,
0.039288248866796494,
-0.08494531363248825,
0.01955431140959263,
-0.2205471694469452,
0.04310363903641701,
-0.050358060747385025,
0.06745324283838272,
-0.061215128749608994,
-0.12230591475963593,
0.007535097189247608,
0.0020529634784907103,
0.09138645976781845,
0.10658048838376999,
-0.22340381145477295,
-0.07632577419281006,
0.15001466870307922,
-0.07304205000400543,
-0.09915706515312195,
0.11305509507656097,
-0.06369689851999283,
0.012146028690040112,
0.06172249838709831,
0.19822841882705688,
0.05336225405335426,
-0.13852377235889435,
0.022840937599539757,
-0.015900328755378723,
0.049486055970191956,
-0.027449751272797585,
0.05092339962720871,
0.022035008296370506,
0.08634582906961441,
0.01932832971215248,
-0.06749307364225388,
0.06813844293355942,
-0.12453159689903259,
-0.0973159596323967,
-0.026223300024867058,
-0.08707922697067261,
0.0425996370613575,
0.08921016752719879,
0.06201290711760521,
-0.1042734906077385,
-0.07783126085996628,
0.09162496030330658,
0.07593740522861481,
-0.06878883391618729,
0.03940066322684288,
-0.06503570079803467,
0.04400739446282387,
-0.018228968605399132,
-0.03635057434439659,
-0.17525562644004822,
-0.024230515584349632,
-0.022172527387738228,
0.032932575792074203,
0.029138708487153053,
0.022093607112765312,
0.0918286070227623,
0.08869968354701996,
-0.07115217298269272,
-0.024756498634815216,
-0.06573759019374847,
0.002546109724789858,
-0.12175681442022324,
-0.22863373160362244,
-0.04354200139641762,
-0.007867901585996151,
0.08889780193567276,
-0.21311669051647186,
0.024510059505701065,
0.022799061611294746,
0.08842430263757706,
0.02574062906205654,
-0.03138883039355278,
-0.05283556878566742,
0.07688137143850327,
-0.010428636334836483,
-0.06497251987457275,
0.06946990638971329,
-0.0056456285528838634,
-0.06745541095733643,
-0.05654985457658768,
-0.11247335374355316,
0.16469940543174744,
0.13317930698394775,
-0.14714597165584564,
-0.09214778244495392,
-0.013312920928001404,
-0.063844233751297,
-0.033247288316488266,
-0.04398446902632713,
0.039430055767297745,
0.1780266910791397,
-0.00010981995001202449,
0.13309699296951294,
-0.060851842164993286,
-0.03459689021110535,
0.029519541189074516,
-0.027206161990761757,
0.027374764904379845,
0.12912613153457642,
0.12653884291648865,
-0.062028925865888596,
0.12387561798095703,
0.12418481707572937,
-0.08114030212163925,
0.1494152545928955,
-0.033264271914958954,
-0.0810229554772377,
-0.018308553844690323,
-0.013985042460262775,
-0.008388935588300228,
0.17675906419754028,
-0.15065397322177887,
-0.017390701919794083,
-0.00424934271723032,
0.013155661523342133,
0.014830991625785828,
-0.24995087087154388,
-0.0558709055185318,
0.039869796484708786,
-0.04461740329861641,
-0.009433131664991379,
-0.02365971729159355,
-0.004057092592120171,
0.10431108623743057,
-0.007682909723371267,
-0.07424376159906387,
0.0007452150457538664,
-0.0073659587651491165,
-0.049111928790807724,
0.2073720246553421,
-0.057999152690172195,
-0.1201261654496193,
-0.08830592781305313,
-0.07677625864744186,
-0.03632787987589836,
0.002308193128556013,
0.0591593012213707,
-0.1085822731256485,
-0.018768738955259323,
-0.05880347266793251,
0.019458457827568054,
0.006906068418174982,
0.037035223096609116,
-0.0010389474919065833,
-0.008258688263595104,
0.05622628331184387,
-0.09682733565568924,
-0.009275443851947784,
-0.06575994193553925,
-0.05308527871966362,
0.054321642965078354,
0.05974826216697693,
0.1487133949995041,
0.13398493826389313,
-0.02516005001962185,
0.02025916799902916,
-0.033261314034461975,
0.2603031396865845,
-0.09539790451526642,
-0.028128890320658684,
0.11783468723297119,
-0.01214310247451067,
0.055856361985206604,
0.10701325535774231,
0.08245833963155746,
-0.10904908925294876,
-0.0019291220232844353,
0.06410636752843857,
-0.05143284797668457,
-0.15570802986621857,
-0.015027196146547794,
-0.058327436447143555,
-0.029958199709653854,
0.077450692653656,
0.02701939083635807,
-0.001855272683314979,
0.05557077005505562,
0.048519883304834366,
0.04288593307137489,
-0.02351878397166729,
0.05035394802689552,
0.08817767351865768,
0.03225383907556534,
0.10927974432706833,
-0.04484448581933975,
-0.06649953126907349,
0.03129003196954727,
0.003169270697981119,
0.2457313984632492,
-0.015184384770691395,
0.09709212183952332,
0.07371558248996735,
0.16226935386657715,
-0.012239483185112476,
0.048111625015735626,
-0.015381221659481525,
-0.06757867336273193,
-0.018642326816916466,
-0.044339071959257126,
-0.016227567568421364,
0.010756530798971653,
-0.051272425800561905,
0.039304398000240326,
-0.1268744319677353,
0.009760280139744282,
0.06842625141143799,
0.2505629360675812,
0.027572879567742348,
-0.3173650801181793,
-0.06491094827651978,
-0.006207557395100594,
-0.011117108166217804,
-0.008987529203295708,
0.006322643253952265,
0.15259242057800293,
-0.0820283591747284,
0.0560346357524395,
-0.08593686670064926,
0.08597972989082336,
-0.03631705045700073,
0.05112973973155022,
0.07715161889791489,
0.0740961879491806,
-0.004238784778863192,
0.055085957050323486,
-0.28788360953330994,
0.30207714438438416,
0.001856112270615995,
0.0847378820180893,
-0.06413950026035309,
-0.032106366008520126,
0.03255250304937363,
0.08181426674127579,
0.0877198874950409,
-0.015706293284893036,
-0.02197972871363163,
-0.21383488178253174,
-0.021518880501389503,
0.03125238046050072,
0.1295892745256424,
-0.018455512821674347,
0.10351228713989258,
-0.009902393445372581,
-0.00588485412299633,
0.0742294192314148,
-0.0014182537561282516,
-0.03541947901248932,
-0.0902821347117424,
-0.025833291932940483,
-0.024863464757800102,
-0.04891938716173172,
-0.05869458615779877,
-0.10644339770078659,
-0.11488485336303711,
0.11102840304374695,
0.015258767642080784,
-0.013820716179907322,
-0.12087500095367432,
0.09911880642175674,
0.07827001810073853,
-0.075639508664608,
0.042421214282512665,
0.031842127442359924,
0.05792653188109398,
0.0320403166115284,
-0.05805044621229172,
0.11795914173126221,
-0.060278356075286865,
-0.15919680893421173,
-0.05573449283838272,
0.09106617420911789,
0.05156053602695465,
0.056767117232084274,
-0.02431696094572544,
0.015305006876587868,
-0.017362257465720177,
-0.09200239181518555,
0.05444863811135292,
-0.04200691729784012,
0.06232433393597603,
0.00978512316942215,
-0.019913334399461746,
0.049309540539979935,
-0.056224022060632706,
-0.012404541485011578,
0.14732937514781952,
0.2859904170036316,
-0.08951869606971741,
0.01367634255439043,
0.01556307915598154,
-0.06635761260986328,
-0.1894458681344986,
0.08042430877685547,
0.05870437249541283,
0.0004593605117406696,
0.08709000796079636,
-0.1676849126815796,
0.09808558970689774,
0.10309728235006332,
0.0002890884061343968,
0.1149357259273529,
-0.368745893239975,
-0.12818658351898193,
0.0806475505232811,
0.19140946865081787,
0.07766779512166977,
-0.15497079491615295,
0.0007545585976913571,
-0.0016887550009414554,
-0.1488846242427826,
0.091683529317379,
-0.07718772441148758,
0.13508258759975433,
-0.019662674516439438,
0.08572512865066528,
0.01652027852833271,
-0.061377961188554764,
0.12247602641582489,
-0.004532550927251577,
0.14073969423770905,
-0.06842464208602905,
-0.03942161798477173,
0.053188927471637726,
-0.03752102702856064,
-0.01285445224493742,
-0.04775197058916092,
0.0267304927110672,
-0.06012487784028053,
-0.012296222150325775,
-0.10488420724868774,
0.013460368849337101,
-0.038980379700660706,
-0.06695487350225449,
-0.04539008066058159,
0.043324876576662064,
0.04547496885061264,
-0.003100699046626687,
0.15248781442642212,
-0.010343621484935284,
0.11362266540527344,
0.04826241731643677,
0.059205636382102966,
-0.060791101306676865,
-0.10804982483386993,
-0.017937231808900833,
0.008780719712376595,
0.04870529845356941,
-0.13476704061031342,
0.015425100922584534,
0.15282666683197021,
0.05006728321313858,
0.12231310456991196,
0.08622467517852783,
-0.03185645863413811,
0.031450528651475906,
0.06939312070608139,
-0.15781576931476593,
-0.1146925538778305,
0.0013899669284000993,
-0.06701695919036865,
-0.07267320156097412,
0.052834637463092804,
0.07759876549243927,
-0.07596214860677719,
0.011788897216320038,
-0.007180044427514076,
0.0064629726111888885,
-0.0678086206316948,
0.20557783544063568,
0.05521059408783913,
0.041427500545978546,
-0.10367696732282639,
0.07370376586914062,
0.018692485988140106,
-0.08784474432468414,
-0.0012133740819990635,
0.0907752588391304,
-0.06913053244352341,
-0.025034740567207336,
0.08155308663845062,
0.19176793098449707,
-0.0766492486000061,
-0.0228650514036417,
-0.14995284378528595,
-0.10607686638832092,
0.06983733922243118,
0.18615254759788513,
0.10015472769737244,
-0.007434334140270948,
-0.051916833966970444,
0.0478600412607193,
-0.11728440970182419,
0.07808661460876465,
0.025074606761336327,
0.08142903447151184,
-0.14993512630462646,
0.18285270035266876,
0.011471637524664402,
0.05453678220510483,
-0.025952065363526344,
0.032693106681108475,
-0.11964283883571625,
0.04019041731953621,
-0.11360186338424683,
-0.03626396507024765,
-0.015379550866782665,
0.004765302408486605,
-0.013333975337445736,
-0.062404002994298935,
-0.06320908665657043,
0.005723265465348959,
-0.12754005193710327,
-0.02226276695728302,
0.04579556733369827,
0.023444000631570816,
-0.12672723829746246,
-0.03899329900741577,
0.027743156999349594,
-0.06368311494588852,
0.055810023099184036,
0.03595525026321411,
0.013894312083721161,
0.06595242023468018,
-0.17330007255077362,
-0.021810321137309074,
0.070060595870018,
-0.007457200437784195,
0.06273166090250015,
-0.03653194010257721,
-0.02568737417459488,
-0.02941904403269291,
0.08768700808286667,
0.012275002896785736,
0.06199142336845398,
-0.13691334426403046,
0.0052282540127635,
-0.032837286591529846,
-0.09186450392007828,
-0.05883370339870453,
0.052420951426029205,
0.06178171560168266,
0.03659208491444588,
0.16250573098659515,
-0.08272150158882141,
0.044772256165742874,
-0.2185482531785965,
-0.016092568635940552,
0.0017781276255846024,
-0.1076044961810112,
-0.08283082395792007,
-0.07246813923120499,
0.08282878249883652,
-0.07605460286140442,
0.1108672097325325,
0.03712504357099533,
0.06563857197761536,
0.03164868801832199,
-0.03307740390300751,
-0.004263680428266525,
0.03464328125119209,
0.2116430103778839,
0.011542430147528648,
-0.03280042111873627,
0.08914298564195633,
0.07961855828762054,
0.10031109303236008,
0.13723793625831604,
0.22827985882759094,
0.1556602418422699,
-0.02679344266653061,
0.08985250443220139,
0.051500122994184494,
-0.06530258804559708,
-0.1727873831987381,
0.03688646852970123,
-0.05145660415291786,
0.09927286952733994,
-0.061768654733896255,
0.20273716747760773,
0.08735809475183487,
-0.18363389372825623,
0.0660974308848381,
-0.04502302035689354,
-0.10125258564949036,
-0.08116330206394196,
-0.03795985132455826,
-0.07032150030136108,
-0.14877022802829742,
0.025798222050070763,
-0.1027073934674263,
0.04333026334643364,
0.14922219514846802,
0.010018367320299149,
-0.012742683291435242,
0.21410177648067474,
0.03483295440673828,
0.036013517528772354,
0.05892600119113922,
0.014170661568641663,
-0.029570305719971657,
-0.09281884878873825,
-0.060259848833084106,
0.017781438305974007,
-0.030329212546348572,
0.018029576167464256,
-0.06978259235620499,
-0.077887162566185,
0.027117611840367317,
0.005052410531789064,
-0.09354209899902344,
0.023705361410975456,
0.020700691267848015,
0.08957210183143616,
0.026962894946336746,
0.00625406252220273,
0.01595999114215374,
-0.028466112911701202,
0.24721521139144897,
-0.09312139451503754,
-0.08100591599941254,
-0.08266246318817139,
0.2153213918209076,
0.03227752074599266,
0.0012870734790340066,
0.008854847401380539,
-0.08128305524587631,
0.009572489187121391,
0.22953803837299347,
0.1718563735485077,
-0.1308571994304657,
-0.010271341539919376,
-0.00013335027324501425,
0.0014996831305325031,
-0.029510721564292908,
0.11788038164377213,
0.12110237777233124,
0.052471183240413666,
-0.11455448716878891,
-0.05134640634059906,
-0.05223130062222481,
-0.0185924731194973,
-0.026401309296488762,
0.049513909965753555,
0.06708652526140213,
0.02181844599545002,
-0.0690699890255928,
0.07623188942670822,
-0.05878157168626785,
-0.1415841430425644,
0.1051250696182251,
-0.22685500979423523,
-0.15652738511562347,
-0.0076664225198328495,
0.1219167709350586,
0.004140776582062244,
0.05953097715973854,
-0.0418783500790596,
0.0007616900838911533,
0.05116748437285423,
-0.004854890983551741,
-0.07880108803510666,
-0.10559022426605225,
0.08460206538438797,
-0.1158004179596901,
0.2171562761068344,
-0.05846324563026428,
0.033730924129486084,
0.11284830421209335,
0.06351487338542938,
-0.05023018643260002,
0.05544724315404892,
0.04167674481868744,
-0.12405151128768921,
-0.004302829038351774,
0.12474225461483002,
-0.039072390645742416,
0.055527038872241974,
0.032598868012428284,
-0.13145694136619568,
0.03233962506055832,
-0.05618817359209061,
-0.040532518178224564,
-0.027373507618904114,
-0.05006802827119827,
-0.06371132284402847,
0.11478167772293091,
0.20842719078063965,
-0.00825675018131733,
0.02441330999135971,
-0.08749818056821823,
0.017284953966736794,
0.06594919413328171,
0.04984837397933006,
-0.07829046994447708,
-0.21566852927207947,
0.007306637242436409,
0.07119728624820709,
-0.04119804874062538,
-0.20360709726810455,
-0.11198395490646362,
0.036078546196222305,
-0.05377151817083359,
-0.0720614343881607,
0.09022299200296402,
0.092189721763134,
0.056102801114320755,
-0.05502096191048622,
-0.10624532401561737,
-0.05849136412143707,
0.1705469787120819,
-0.1468687802553177,
-0.07732440531253815
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_5
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2608
- Mean Iou: 0.6161
- Mean Accuracy: 0.6630
- Overall Accuracy: 0.9623
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3365
- Accuracy Undropoff: 0.9894
- Iou Unlabeled: nan
- Iou Dropoff: 0.2705
- Iou Undropoff: 0.9617
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a scheduler sketch follows this list):
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
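To make the schedule settings concrete, the sketch below shows one way a linear schedule with a 0.05 warmup ratio could be built with `get_linear_schedule_with_warmup`. The total step count (240) is read from the results table further down; the parameter group is a placeholder, and this is not the project's actual training code.
```python
import torch
from transformers import get_linear_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]   # placeholder parameter group
optimizer = torch.optim.Adam(params, lr=6e-05, betas=(0.9, 0.999), eps=1e-08)

total_steps = 240                        # final "Step" value in the results table below
warmup_steps = int(0.05 * total_steps)   # warmup_ratio 0.05 -> 12 warmup steps
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)
```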
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.9263 | 5.0 | 10 | 1.0370 | 0.2869 | 0.7147 | 0.7632 | nan | 0.6618 | 0.7675 | 0.0 | 0.1042 | 0.7565 |
| 0.8069 | 10.0 | 20 | 0.8622 | 0.4857 | 0.5062 | 0.9589 | nan | 0.0125 | 0.9999 | nan | 0.0124 | 0.9589 |
| 0.6851 | 15.0 | 30 | 0.6490 | 0.4876 | 0.5081 | 0.9586 | nan | 0.0167 | 0.9995 | nan | 0.0165 | 0.9586 |
| 0.5882 | 20.0 | 40 | 0.4739 | 0.3253 | 0.5085 | 0.9586 | nan | 0.0177 | 0.9994 | 0.0 | 0.0174 | 0.9585 |
| 0.53 | 25.0 | 50 | 0.4153 | 0.3375 | 0.5274 | 0.9584 | nan | 0.0573 | 0.9975 | 0.0 | 0.0542 | 0.9583 |
| 0.5009 | 30.0 | 60 | 0.4275 | 0.3835 | 0.6488 | 0.9475 | nan | 0.3230 | 0.9746 | 0.0 | 0.2037 | 0.9468 |
| 0.4699 | 35.0 | 70 | 0.3819 | 0.4158 | 0.6985 | 0.9578 | nan | 0.4157 | 0.9813 | 0.0 | 0.2904 | 0.9570 |
| 0.3946 | 40.0 | 80 | 0.3563 | 0.6183 | 0.6844 | 0.9585 | nan | 0.3854 | 0.9834 | nan | 0.2787 | 0.9579 |
| 0.3788 | 45.0 | 90 | 0.3259 | 0.6292 | 0.7011 | 0.9593 | nan | 0.4196 | 0.9827 | nan | 0.2998 | 0.9585 |
| 0.3412 | 50.0 | 100 | 0.3392 | 0.6170 | 0.6933 | 0.9562 | nan | 0.4066 | 0.9801 | nan | 0.2785 | 0.9555 |
| 0.3326 | 55.0 | 110 | 0.3214 | 0.6279 | 0.6914 | 0.9606 | nan | 0.3977 | 0.9851 | nan | 0.2958 | 0.9600 |
| 0.2954 | 60.0 | 120 | 0.3119 | 0.6261 | 0.6847 | 0.9613 | nan | 0.3831 | 0.9864 | nan | 0.2915 | 0.9607 |
| 0.3006 | 65.0 | 130 | 0.2853 | 0.5900 | 0.6223 | 0.9625 | nan | 0.2513 | 0.9934 | nan | 0.2180 | 0.9621 |
| 0.2715 | 70.0 | 140 | 0.3021 | 0.6314 | 0.6903 | 0.9620 | nan | 0.3938 | 0.9867 | nan | 0.3014 | 0.9614 |
| 0.276 | 75.0 | 150 | 0.2950 | 0.6243 | 0.6783 | 0.9619 | nan | 0.3690 | 0.9877 | nan | 0.2873 | 0.9613 |
| 0.2622 | 80.0 | 160 | 0.2843 | 0.6134 | 0.6651 | 0.9608 | nan | 0.3426 | 0.9876 | nan | 0.2665 | 0.9602 |
| 0.2395 | 85.0 | 170 | 0.2752 | 0.6050 | 0.6495 | 0.9613 | nan | 0.3094 | 0.9895 | nan | 0.2493 | 0.9608 |
| 0.2597 | 90.0 | 180 | 0.2813 | 0.6296 | 0.6874 | 0.9620 | nan | 0.3879 | 0.9869 | nan | 0.2979 | 0.9614 |
| 0.2294 | 95.0 | 190 | 0.2747 | 0.6106 | 0.6575 | 0.9615 | nan | 0.3259 | 0.9890 | nan | 0.2602 | 0.9609 |
| 0.2303 | 100.0 | 200 | 0.2606 | 0.6040 | 0.6462 | 0.9616 | nan | 0.3023 | 0.9902 | nan | 0.2468 | 0.9611 |
| 0.2335 | 105.0 | 210 | 0.2606 | 0.6080 | 0.6515 | 0.9619 | nan | 0.3130 | 0.9901 | nan | 0.2547 | 0.9614 |
| 0.2322 | 110.0 | 220 | 0.2619 | 0.6167 | 0.6631 | 0.9624 | nan | 0.3366 | 0.9896 | nan | 0.2715 | 0.9619 |
| 0.2116 | 115.0 | 230 | 0.2618 | 0.6183 | 0.6660 | 0.9624 | nan | 0.3427 | 0.9893 | nan | 0.2747 | 0.9618 |
| 0.2099 | 120.0 | 240 | 0.2608 | 0.6161 | 0.6630 | 0.9623 | nan | 0.3365 | 0.9894 | nan | 0.2705 | 0.9617 |
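A small back-of-the-envelope reading of the table above (not stated in the card): 240 optimisation steps over 120 epochs means two steps per epoch, which at a batch size of 16 bounds the training split at roughly 17–32 images.
```python
# Illustrative bookkeeping derived from the table above.
total_steps, num_epochs, train_batch_size = 240, 120, 16
steps_per_epoch = total_steps // num_epochs                       # 2 steps per epoch
max_train_images = steps_per_epoch * train_batch_size             # at most 32 images
min_train_images = (steps_per_epoch - 1) * train_batch_size + 1   # at least 17 images
print(steps_per_epoch, min_train_images, max_train_images)
```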
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_5", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_5 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:51+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_5
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2608
* Mean Iou: 0.6161
* Mean Accuracy: 0.6630
* Overall Accuracy: 0.9623
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3365
* Accuracy Undropoff: 0.9894
* Iou Unlabeled: nan
* Iou Dropoff: 0.2705
* Iou Undropoff: 0.9617
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 6e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.1064649447798729,
0.03527533635497093,
-0.001978802727535367,
0.1163235679268837,
0.17267335951328278,
0.028241146355867386,
0.11516391485929489,
0.11503798514604568,
-0.1090955063700676,
0.031093817204236984,
0.10412268340587616,
0.14279337227344513,
0.016291961073875427,
0.09399866312742233,
-0.020326532423496246,
-0.30666714906692505,
-0.025546981021761894,
0.03195459768176079,
-0.08546410501003265,
0.12509174644947052,
0.06475786119699478,
-0.16288155317306519,
0.09201489388942719,
-0.003706889459863305,
-0.2213490754365921,
0.016290757805109024,
-0.005156167782843113,
-0.03060692735016346,
0.15995922684669495,
0.023070329800248146,
0.11631980538368225,
0.009887006133794785,
0.11387231200933456,
-0.20182643830776215,
0.017866060137748718,
0.056141529232263565,
-0.004115610383450985,
0.06756260991096497,
0.0663045346736908,
0.0018802008125931025,
0.15235234797000885,
-0.10559076070785522,
0.06718938052654266,
0.0023934023920446634,
-0.14473532140254974,
-0.21103587746620178,
-0.07418119162321091,
0.020351940765976906,
0.07812579721212387,
0.09654969722032547,
-0.005547147244215012,
0.11293575912714005,
-0.09201215207576752,
0.11309708654880524,
0.2693063020706177,
-0.24295690655708313,
-0.08625418692827225,
0.03926653787493706,
0.0019650955218821764,
0.06686149537563324,
-0.1330672651529312,
0.008118880912661552,
0.03308410570025444,
0.047054633498191833,
0.11665087938308716,
-0.03288080543279648,
-0.0999564528465271,
0.02654857002198696,
-0.13867045938968658,
-0.03310524299740791,
0.05463303625583649,
0.05319058522582054,
-0.020025067031383514,
-0.031660787761211395,
-0.06826108694076538,
-0.1812070906162262,
-0.06600423157215118,
0.013151603750884533,
0.0666898638010025,
-0.05996681749820709,
-0.11456635594367981,
-0.015523867681622505,
-0.11033326387405396,
-0.085902139544487,
-0.04981669411063194,
0.12703174352645874,
0.034321535378694534,
0.019266672432422638,
-0.03493347391486168,
0.12672658264636993,
-0.0267372764647007,
-0.13983391225337982,
0.01688292622566223,
0.02988419495522976,
-0.04271238297224045,
-0.031574152410030365,
-0.04941422492265701,
-0.06422734260559082,
-0.012615879066288471,
0.10639902204275131,
-0.06079825386404991,
0.06794705241918564,
0.03559742122888565,
0.05071644112467766,
-0.1144825741648674,
0.18951334059238434,
-0.06651600450277328,
-0.00700727291405201,
-0.03757239878177643,
0.05810604989528656,
0.003286729333922267,
-0.021952684968709946,
-0.10534967482089996,
0.00437202537432313,
0.06992820650339127,
-0.008664870634675026,
-0.08774257451295853,
0.07020960003137589,
-0.039589207619428635,
-0.011640090495347977,
0.0000861301741679199,
-0.07591013610363007,
0.04544635862112045,
-0.0008271023980341852,
-0.08387154340744019,
-0.02983172796666622,
0.05105491727590561,
0.014502637088298798,
0.013674533925950527,
0.16577741503715515,
-0.0869055762887001,
0.06318511068820953,
-0.11316604167222977,
-0.09988278150558472,
0.00025648600421845913,
-0.08773462474346161,
0.036244332790374756,
-0.07785050570964813,
-0.14981769025325775,
-0.009341959841549397,
0.07159749418497086,
-0.040524471551179886,
0.002868998097255826,
-0.05311252921819687,
-0.09111931174993515,
0.002676870673894882,
-0.00837119109928608,
0.16435563564300537,
-0.06520853191614151,
0.12313096970319748,
0.038128480315208435,
0.07243192940950394,
-0.06486436724662781,
0.03951846435666084,
-0.08564808964729309,
0.019691815599799156,
-0.22095708549022675,
0.043041352182626724,
-0.05076908692717552,
0.06738118827342987,
-0.06093771010637283,
-0.12219693511724472,
0.0070835622027516365,
0.002179069211706519,
0.0915072113275528,
0.10644596815109253,
-0.2239188253879547,
-0.07573185116052628,
0.14910180866718292,
-0.07307816296815872,
-0.0989585593342781,
0.11255866289138794,
-0.06373613327741623,
0.012102761305868626,
0.06067235767841339,
0.19910748302936554,
0.053066398948431015,
-0.1374119371175766,
0.02197415940463543,
-0.016015037894248962,
0.04915572330355644,
-0.027699846774339676,
0.050508011132478714,
0.022157229483127594,
0.08769498020410538,
0.01907079853117466,
-0.06582938879728317,
0.0674828290939331,
-0.1243562325835228,
-0.09669681638479233,
-0.026051459833979607,
-0.08699985593557358,
0.042251452803611755,
0.0901937484741211,
0.06148989126086235,
-0.10449226200580597,
-0.07791747897863388,
0.09068726748228073,
0.07611079514026642,
-0.0686340257525444,
0.039376262575387955,
-0.06532116234302521,
0.04341335967183113,
-0.01825035735964775,
-0.036670997738838196,
-0.17582599818706512,
-0.025955114513635635,
-0.022006535902619362,
0.03475963696837425,
0.029744137078523636,
0.022421488538384438,
0.09191976487636566,
0.08835021406412125,
-0.07171767950057983,
-0.025062119588255882,
-0.06542296707630157,
0.002060960978269577,
-0.12156122177839279,
-0.22796687483787537,
-0.043371010571718216,
-0.008076803758740425,
0.08797289431095123,
-0.21272419393062592,
0.02396128699183464,
0.02351280488073826,
0.08829234540462494,
0.02552235871553421,
-0.031308453530073166,
-0.05253014340996742,
0.07647893577814102,
-0.010654580779373646,
-0.06567797809839249,
0.06995689868927002,
-0.005619138013571501,
-0.06729645282030106,
-0.055201489478349686,
-0.11261012405157089,
0.1630818396806717,
0.13355562090873718,
-0.14656201004981995,
-0.09229909628629684,
-0.012429691851139069,
-0.06409095227718353,
-0.033306557685136795,
-0.04346994683146477,
0.03977256268262863,
0.17981605231761932,
-0.00012548863014671952,
0.1325346976518631,
-0.06096376106142998,
-0.03445388376712799,
0.02944287657737732,
-0.027220742776989937,
0.027601120993494987,
0.12958312034606934,
0.12659059464931488,
-0.06283826380968094,
0.12436183542013168,
0.12415429949760437,
-0.08024793863296509,
0.14925888180732727,
-0.033713940531015396,
-0.08047649264335632,
-0.0171473678201437,
-0.013914289884269238,
-0.007896257564425468,
0.17660079896450043,
-0.1493210792541504,
-0.016459345817565918,
-0.004435533192008734,
0.013969046995043755,
0.015143651515245438,
-0.25068867206573486,
-0.05626754090189934,
0.03898099809885025,
-0.04464585706591606,
-0.009617979638278484,
-0.023667067289352417,
-0.004023591056466103,
0.10480663180351257,
-0.007583467289805412,
-0.07508882880210876,
0.0007786003989167511,
-0.007541450671851635,
-0.04874269664287567,
0.20765981078147888,
-0.05809421092271805,
-0.11849109083414078,
-0.08860902488231659,
-0.07761213183403015,
-0.0362376943230629,
0.0030461472924798727,
0.058542437851428986,
-0.1096472293138504,
-0.018467986956238747,
-0.05873699113726616,
0.019377076998353004,
0.00686908233910799,
0.036027196794748306,
-0.0013489086413756013,
-0.008154553361237049,
0.05611136928200722,
-0.09698041528463364,
-0.009715559892356396,
-0.06630834192037582,
-0.05398743599653244,
0.05478256940841675,
0.05913064628839493,
0.14861217141151428,
0.13478030264377594,
-0.025563279166817665,
0.019811242818832397,
-0.03278306871652603,
0.2586057782173157,
-0.09569961577653885,
-0.027569126337766647,
0.11873909831047058,
-0.011659405194222927,
0.05646179988980293,
0.10722813010215759,
0.08224759250879288,
-0.10944513231515884,
-0.0015863969456404448,
0.06423529982566833,
-0.05221214145421982,
-0.15609857439994812,
-0.01550737302750349,
-0.05823744460940361,
-0.03057030774652958,
0.07690554857254028,
0.027464190497994423,
-0.0036008988972753286,
0.055760085582733154,
0.04811287298798561,
0.042149096727371216,
-0.024342363700270653,
0.05016907677054405,
0.08759834617376328,
0.03272708132863045,
0.10928235203027725,
-0.04492725059390068,
-0.0663089007139206,
0.0311879962682724,
0.004023005720227957,
0.24633149802684784,
-0.016040263697504997,
0.09701643884181976,
0.07377535849809647,
0.16173778474330902,
-0.012071697972714901,
0.048101961612701416,
-0.016136739403009415,
-0.06779490411281586,
-0.019128547981381416,
-0.04395376890897751,
-0.016475114971399307,
0.010128410533070564,
-0.05268013849854469,
0.0391085110604763,
-0.12618909776210785,
0.010384008288383484,
0.06767336279153824,
0.24995078146457672,
0.028045376762747765,
-0.3180948495864868,
-0.06532508134841919,
-0.006256752647459507,
-0.010765226557850838,
-0.009385369718074799,
0.006407259032130241,
0.15225107967853546,
-0.08147666603326797,
0.055756863206624985,
-0.08586252480745316,
0.08601263165473938,
-0.03634632006287575,
0.05099329724907875,
0.07755950838327408,
0.07387584447860718,
-0.0042363847605884075,
0.05599009618163109,
-0.28617605566978455,
0.3027760088443756,
0.0018010791391134262,
0.08433263748884201,
-0.06409712135791779,
-0.03242374584078789,
0.032846152782440186,
0.08135969191789627,
0.08748125284910202,
-0.015547315590083599,
-0.02204763889312744,
-0.2133188098669052,
-0.020954590290784836,
0.03126795217394829,
0.1298801749944687,
-0.017911894246935844,
0.10366040468215942,
-0.009661566466093063,
-0.005430443212389946,
0.07438895851373672,
-0.0022737944964319468,
-0.033762380480766296,
-0.09058567136526108,
-0.02634373866021633,
-0.025147825479507446,
-0.04968058317899704,
-0.058459848165512085,
-0.10620462149381638,
-0.11375129967927933,
0.11213123798370361,
0.01782996952533722,
-0.014228565618395805,
-0.12060946226119995,
0.0980197861790657,
0.07856705039739609,
-0.0759081318974495,
0.04123838618397713,
0.0314280204474926,
0.05764174461364746,
0.03214576095342636,
-0.058248355984687805,
0.11837700009346008,
-0.060117706656455994,
-0.15967418253421783,
-0.056031662970781326,
0.09098325669765472,
0.05123685672879219,
0.057183887809515,
-0.02459125593304634,
0.01592181622982025,
-0.017681729048490524,
-0.09185586124658585,
0.05507479980587959,
-0.043758898973464966,
0.06366279721260071,
0.009771661832928658,
-0.020034994930028915,
0.05170690268278122,
-0.05626894533634186,
-0.012695123441517353,
0.14649665355682373,
0.2852877378463745,
-0.08899112790822983,
0.01338556595146656,
0.016166094690561295,
-0.06576605886220932,
-0.19075703620910645,
0.08007404953241348,
0.05862906947731972,
-0.0002682217163965106,
0.08668289333581924,
-0.1679680198431015,
0.09831501543521881,
0.10265666246414185,
0.00007738395652268082,
0.11561519652605057,
-0.36724144220352173,
-0.12785452604293823,
0.07975310832262039,
0.19122269749641418,
0.0768192782998085,
-0.15495578944683075,
0.0010400613537058234,
-0.0017586967442184687,
-0.14846964180469513,
0.0918133482336998,
-0.0764014944434166,
0.13566866517066956,
-0.01997825689613819,
0.08777736872434616,
0.01636609062552452,
-0.061533764004707336,
0.12190458178520203,
-0.0037856593262404203,
0.14050471782684326,
-0.06922630220651627,
-0.03958258777856827,
0.05324834957718849,
-0.037490032613277435,
-0.013398632407188416,
-0.04734349623322487,
0.02741350792348385,
-0.06068992242217064,
-0.011847357265651226,
-0.10540856420993805,
0.01313182432204485,
-0.0386306494474411,
-0.06701672077178955,
-0.04565482959151268,
0.0436239056289196,
0.04511984810233116,
-0.0034747819881886244,
0.1523420214653015,
-0.01036117598414421,
0.11424470692873001,
0.0485420785844326,
0.05981876328587532,
-0.06143272668123245,
-0.10705015063285828,
-0.0182176623493433,
0.00938886497169733,
0.04833870753645897,
-0.13539785146713257,
0.014984792098402977,
0.15288427472114563,
0.05017663910984993,
0.12158697843551636,
0.08688200265169144,
-0.03242550045251846,
0.03225881606340408,
0.06896401196718216,
-0.1572675108909607,
-0.11426348984241486,
0.001970690907910466,
-0.06495974957942963,
-0.07277394831180573,
0.053067829459905624,
0.07671596854925156,
-0.07567569613456726,
0.012378854677081108,
-0.0066458736546337605,
0.0064870379865169525,
-0.06723130494356155,
0.20519258081912994,
0.055626582354307175,
0.04165906831622124,
-0.10377777367830276,
0.07377950102090836,
0.018563872203230858,
-0.08771028369665146,
-0.0015163094503805041,
0.09134689718484879,
-0.06926306337118149,
-0.025384286418557167,
0.08161552250385284,
0.1909385770559311,
-0.07701335847377777,
-0.022326646372675896,
-0.14957816898822784,
-0.10618148744106293,
0.06986729055643082,
0.18539445102214813,
0.10010907053947449,
-0.007362076546996832,
-0.0520189143717289,
0.04726676270365715,
-0.11780749261379242,
0.07737337052822113,
0.023646822199225426,
0.08177544921636581,
-0.1495436429977417,
0.180617555975914,
0.011253352276980877,
0.05564451217651367,
-0.02602614462375641,
0.032979607582092285,
-0.11879242956638336,
0.0403553768992424,
-0.11327093094587326,
-0.03607723116874695,
-0.01482543908059597,
0.004480064380913973,
-0.013874433003365993,
-0.06252473592758179,
-0.06281433999538422,
0.005570410285145044,
-0.1278647929430008,
-0.022578440606594086,
0.04574063420295715,
0.02281276322901249,
-0.1264515370130539,
-0.03909725323319435,
0.027942046523094177,
-0.06324952095746994,
0.05568137392401695,
0.03585425019264221,
0.01412218902260065,
0.06630904972553253,
-0.172810897231102,
-0.022625451907515526,
0.0701264962553978,
-0.007172618061304092,
0.06300551444292068,
-0.03526623174548149,
-0.026233436539769173,
-0.02981232851743698,
0.08761729300022125,
0.012157831341028214,
0.062332265079021454,
-0.13706378638744354,
0.006085552275180817,
-0.033205557614564896,
-0.09258726239204407,
-0.05878211185336113,
0.053135547786951065,
0.06189845874905586,
0.036986932158470154,
0.16260986030101776,
-0.08272802829742432,
0.04484657570719719,
-0.21882472932338715,
-0.01628575287759304,
0.001987974625080824,
-0.10762587934732437,
-0.08230839669704437,
-0.07207721471786499,
0.08292986452579498,
-0.07573594152927399,
0.10918044298887253,
0.037184447050094604,
0.06530965119600296,
0.031509824097156525,
-0.03226585313677788,
-0.004310287069529295,
0.03402640298008919,
0.21053366363048553,
0.01141163520514965,
-0.03258558362722397,
0.08997436612844467,
0.07928993552923203,
0.09969432651996613,
0.13751739263534546,
0.2284683883190155,
0.15583281219005585,
-0.026213033124804497,
0.08913949131965637,
0.052004165947437286,
-0.06521408259868622,
-0.17312687635421753,
0.03661579638719559,
-0.05159622058272362,
0.09810591489076614,
-0.06126207113265991,
0.20282860100269318,
0.08685790747404099,
-0.18274620175361633,
0.06637920439243317,
-0.04609967768192291,
-0.10119998455047607,
-0.08042948693037033,
-0.036956753581762314,
-0.07009366899728775,
-0.1484755277633667,
0.026177890598773956,
-0.1025751456618309,
0.043455637991428375,
0.1513829380273819,
0.009848561137914658,
-0.012765137478709221,
0.21490661799907684,
0.0344172939658165,
0.03600281849503517,
0.05791601166129112,
0.014431260526180267,
-0.0295450109988451,
-0.0916864350438118,
-0.060308787971735,
0.018595119938254356,
-0.029993915930390358,
0.018103213980793953,
-0.06901443749666214,
-0.07793047279119492,
0.026576567441225052,
0.004978507291525602,
-0.09362301975488663,
0.023393843322992325,
0.021342402324080467,
0.0900571420788765,
0.026652397587895393,
0.0059236520901322365,
0.01674160733819008,
-0.028422784060239792,
0.24784211814403534,
-0.09310388565063477,
-0.08115433901548386,
-0.0813380628824234,
0.21793881058692932,
0.03168031945824623,
0.0009857736295089126,
0.008602683432400227,
-0.08155202865600586,
0.009304902516305447,
0.22909609973430634,
0.17173312604427338,
-0.13230612874031067,
-0.01044740155339241,
-0.00033869571052491665,
0.001624990371055901,
-0.02953352965414524,
0.1185554787516594,
0.12174198031425476,
0.052880603820085526,
-0.11488738656044006,
-0.05294225737452507,
-0.05272667109966278,
-0.018404606729745865,
-0.026273977011442184,
0.05005387216806412,
0.0670456737279892,
0.021993400529026985,
-0.06982455402612686,
0.07582568377256393,
-0.05894998088479042,
-0.1424732506275177,
0.10490449517965317,
-0.22768689692020416,
-0.1563224196434021,
-0.007417834363877773,
0.12165668606758118,
0.003909383434802294,
0.0601835623383522,
-0.042281754314899445,
0.0011347003746777773,
0.049397774040699005,
-0.005019341129809618,
-0.07907281816005707,
-0.10384263843297958,
0.08413011580705643,
-0.11650773882865906,
0.21724362671375275,
-0.0589740127325058,
0.033721596002578735,
0.11292649060487747,
0.06350234150886536,
-0.05074506253004074,
0.05617281794548035,
0.041699837893247604,
-0.12463388592004776,
-0.004178004804998636,
0.1248442530632019,
-0.03887335583567619,
0.05394022539258003,
0.03279820457100868,
-0.13262827694416046,
0.03248761221766472,
-0.05571531131863594,
-0.040658388286828995,
-0.027148038148880005,
-0.05002817139029503,
-0.06401900947093964,
0.11532068252563477,
0.20886008441448212,
-0.008116583339869976,
0.024410145357251167,
-0.08734676986932755,
0.016507580876350403,
0.06551630049943924,
0.04856019839644432,
-0.07830152660608292,
-0.21543224155902863,
0.006827480625361204,
0.07152926921844482,
-0.04189023748040199,
-0.20540539920330048,
-0.11126438528299332,
0.03656040132045746,
-0.054289672523736954,
-0.07179268449544907,
0.09031591564416885,
0.09077358990907669,
0.055838968604803085,
-0.05498761683702469,
-0.10473368316888809,
-0.058450717478990555,
0.17047332227230072,
-0.14718279242515564,
-0.0773560181260109
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b0_6
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a rough class-balance estimate follows this list):
- Loss: 0.2353
- Mean Iou: 0.6539
- Mean Accuracy: 0.7065
- Overall Accuracy: 0.9662
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4233
- Accuracy Undropoff: 0.9897
- Iou Unlabeled: nan
- Iou Dropoff: 0.3423
- Iou Undropoff: 0.9656
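A rough class-balance estimate implied by the numbers above (an inference, not something stated in the card): since overall accuracy is pixel-weighted, the two per-class accuracies pin down the fraction of labeled pixels that belong to the dropoff class.
```python
# Solve overall = w * acc_dropoff + (1 - w) * acc_undropoff for the dropoff fraction w.
acc_dropoff, acc_undropoff, overall = 0.4233, 0.9897, 0.9662
w = (acc_undropoff - overall) / (acc_undropoff - acc_dropoff)
print(w)   # ≈ 0.04, i.e. roughly 4% of labeled pixels are "dropoff"
```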
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.9975 | 5.0 | 10 | 1.0470 | 0.2819 | 0.6747 | 0.7186 | nan | 0.6267 | 0.7226 | 0.0 | 0.1290 | 0.7167 |
| 0.8329 | 10.0 | 20 | 0.8435 | 0.3211 | 0.5026 | 0.9526 | nan | 0.0117 | 0.9934 | 0.0 | 0.0106 | 0.9526 |
| 0.6857 | 15.0 | 30 | 0.6184 | 0.3191 | 0.4994 | 0.9567 | nan | 0.0006 | 0.9981 | 0.0 | 0.0006 | 0.9567 |
| 0.5913 | 20.0 | 40 | 0.4793 | 0.3193 | 0.4997 | 0.9573 | nan | 0.0005 | 0.9988 | 0.0 | 0.0005 | 0.9573 |
| 0.5299 | 25.0 | 50 | 0.4529 | 0.3488 | 0.5442 | 0.9596 | nan | 0.0911 | 0.9973 | 0.0 | 0.0869 | 0.9595 |
| 0.4922 | 30.0 | 60 | 0.4037 | 0.4352 | 0.6983 | 0.9671 | nan | 0.4051 | 0.9915 | 0.0 | 0.3390 | 0.9666 |
| 0.4769 | 35.0 | 70 | 0.4161 | 0.4090 | 0.7560 | 0.9426 | nan | 0.5524 | 0.9595 | 0.0 | 0.2858 | 0.9412 |
| 0.3916 | 40.0 | 80 | 0.3343 | 0.6320 | 0.6946 | 0.9614 | nan | 0.4036 | 0.9856 | nan | 0.3033 | 0.9608 |
| 0.3567 | 45.0 | 90 | 0.3372 | 0.6374 | 0.7140 | 0.9598 | nan | 0.4458 | 0.9821 | nan | 0.3157 | 0.9591 |
| 0.3234 | 50.0 | 100 | 0.3074 | 0.6402 | 0.6883 | 0.9652 | nan | 0.3863 | 0.9903 | nan | 0.3157 | 0.9646 |
| 0.3181 | 55.0 | 110 | 0.3043 | 0.6396 | 0.7138 | 0.9606 | nan | 0.4446 | 0.9830 | nan | 0.3194 | 0.9599 |
| 0.2584 | 60.0 | 120 | 0.3069 | 0.6450 | 0.7204 | 0.9613 | nan | 0.4576 | 0.9831 | nan | 0.3294 | 0.9605 |
| 0.2566 | 65.0 | 130 | 0.2824 | 0.6431 | 0.7063 | 0.9630 | nan | 0.4263 | 0.9863 | nan | 0.3239 | 0.9623 |
| 0.2353 | 70.0 | 140 | 0.2763 | 0.6470 | 0.7046 | 0.9645 | nan | 0.4212 | 0.9880 | nan | 0.3301 | 0.9638 |
| 0.2368 | 75.0 | 150 | 0.2644 | 0.6474 | 0.6973 | 0.9658 | nan | 0.4044 | 0.9902 | nan | 0.3296 | 0.9652 |
| 0.2225 | 80.0 | 160 | 0.2673 | 0.6462 | 0.7089 | 0.9635 | nan | 0.4313 | 0.9866 | nan | 0.3296 | 0.9629 |
| 0.1976 | 85.0 | 170 | 0.2568 | 0.6449 | 0.7057 | 0.9637 | nan | 0.4244 | 0.9870 | nan | 0.3268 | 0.9630 |
| 0.1981 | 90.0 | 180 | 0.2572 | 0.6444 | 0.7110 | 0.9626 | nan | 0.4365 | 0.9855 | nan | 0.3269 | 0.9619 |
| 0.1857 | 95.0 | 190 | 0.2503 | 0.6504 | 0.7027 | 0.9658 | nan | 0.4157 | 0.9897 | nan | 0.3356 | 0.9652 |
| 0.1826 | 100.0 | 200 | 0.2345 | 0.6509 | 0.6984 | 0.9666 | nan | 0.4059 | 0.9909 | nan | 0.3357 | 0.9660 |
| 0.1818 | 105.0 | 210 | 0.2484 | 0.6506 | 0.7160 | 0.9637 | nan | 0.4458 | 0.9862 | nan | 0.3381 | 0.9630 |
| 0.1919 | 110.0 | 220 | 0.2343 | 0.6526 | 0.6996 | 0.9669 | nan | 0.4080 | 0.9912 | nan | 0.3389 | 0.9663 |
| 0.17 | 115.0 | 230 | 0.2377 | 0.6535 | 0.7065 | 0.9661 | nan | 0.4235 | 0.9896 | nan | 0.3416 | 0.9655 |
| 0.1739 | 120.0 | 240 | 0.2353 | 0.6539 | 0.7065 | 0.9662 | nan | 0.4233 | 0.9897 | nan | 0.3423 | 0.9656 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_6", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_6 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:52:53+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_6
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2353
* Mean Iou: 0.6539
* Mean Accuracy: 0.7065
* Overall Accuracy: 0.9662
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.4233
* Accuracy Undropoff: 0.9897
* Iou Unlabeled: nan
* Iou Dropoff: 0.3423
* Iou Undropoff: 0.9656
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10668216645717621,
0.03452228009700775,
-0.001955996034666896,
0.11583433300256729,
0.17118558287620544,
0.028295427560806274,
0.1152672991156578,
0.1157500222325325,
-0.10835208743810654,
0.030481625348329544,
0.10422509908676147,
0.14300428330898285,
0.01632128469645977,
0.09492900222539902,
-0.020201684907078743,
-0.30762186646461487,
-0.025416793301701546,
0.032419878989458084,
-0.08495583385229111,
0.1249079778790474,
0.06513309478759766,
-0.1630345731973648,
0.0914556011557579,
-0.0034688466694206,
-0.2195017784833908,
0.01592184044420719,
-0.004851622506976128,
-0.03168648108839989,
0.16014404594898224,
0.023912712931632996,
0.11566649377346039,
0.00990807730704546,
0.11406362056732178,
-0.20257675647735596,
0.01772981323301792,
0.055963896214962006,
-0.004478502087295055,
0.0677139163017273,
0.0661034882068634,
0.002566681709140539,
0.15228892862796783,
-0.10584034025669098,
0.06737075001001358,
0.002023936016485095,
-0.14480088651180267,
-0.21224600076675415,
-0.07455022633075714,
0.02200588583946228,
0.07823474705219269,
0.09618998318910599,
-0.005171929951757193,
0.11348935961723328,
-0.09242809563875198,
0.11379505693912506,
0.2691955268383026,
-0.24320051074028015,
-0.08647218346595764,
0.039555616676807404,
0.0024925812613219023,
0.06708166003227234,
-0.13343501091003418,
0.008916690945625305,
0.032949063926935196,
0.04611779376864433,
0.1170024573802948,
-0.0330200120806694,
-0.09815603494644165,
0.026776010170578957,
-0.1384267956018448,
-0.033329449594020844,
0.054709844291210175,
0.05374779924750328,
-0.019952058792114258,
-0.033103495836257935,
-0.0683293342590332,
-0.1822303831577301,
-0.06617806106805801,
0.012883052229881287,
0.06644143909215927,
-0.0598788782954216,
-0.11478424817323685,
-0.016088983044028282,
-0.11010082811117172,
-0.08548025786876678,
-0.049414630979299545,
0.1266322284936905,
0.033910397440195084,
0.019352365285158157,
-0.0343242846429348,
0.12678514420986176,
-0.028107071295380592,
-0.13987018167972565,
0.016231566667556763,
0.029924705624580383,
-0.04298516735434532,
-0.03211383894085884,
-0.04920271039009094,
-0.06429684907197952,
-0.012736261822283268,
0.10928399860858917,
-0.06043541058897972,
0.06855661422014236,
0.03562033548951149,
0.050958555191755295,
-0.11495985835790634,
0.19116653501987457,
-0.06670752167701721,
-0.009020998142659664,
-0.03657402843236923,
0.058287110179662704,
0.0038287658244371414,
-0.022437362000346184,
-0.1049780398607254,
0.0038731968961656094,
0.07038122415542603,
-0.008917720057070255,
-0.08775182068347931,
0.06960032880306244,
-0.03893955796957016,
-0.011626145802438259,
0.002019679406657815,
-0.07545624673366547,
0.0458497516810894,
-0.000853220175486058,
-0.0840488150715828,
-0.02933313138782978,
0.0505036786198616,
0.015026706270873547,
0.014244863763451576,
0.1650863140821457,
-0.08640444278717041,
0.06335426867008209,
-0.11297277361154556,
-0.09956681728363037,
0.0006875023245811462,
-0.0864410474896431,
0.03629220649600029,
-0.07794832438230515,
-0.15083736181259155,
-0.009255703538656235,
0.07139384001493454,
-0.040475111454725266,
0.0031775925308465958,
-0.052167125046253204,
-0.09037140011787415,
0.0027780921664088964,
-0.008584096096456051,
0.16279540956020355,
-0.06506035476922989,
0.12273271381855011,
0.03824048116803169,
0.07206552475690842,
-0.0660700872540474,
0.03880484029650688,
-0.08548253774642944,
0.019758224487304688,
-0.2227982133626938,
0.04275309294462204,
-0.05086992308497429,
0.067560113966465,
-0.06008364260196686,
-0.1226377859711647,
0.0068888296373188496,
0.001753312535583973,
0.09199848026037216,
0.10535812377929688,
-0.22365745902061462,
-0.07566362619400024,
0.14855562150478363,
-0.07291552424430847,
-0.09930848330259323,
0.11356339603662491,
-0.06325499713420868,
0.011372840031981468,
0.060934897512197495,
0.19969454407691956,
0.053332164883613586,
-0.1365780234336853,
0.023041291162371635,
-0.01556539535522461,
0.049858953803777695,
-0.027456358075141907,
0.05094395950436592,
0.022096246480941772,
0.08760178089141846,
0.01927143521606922,
-0.06491079181432724,
0.06751275062561035,
-0.124356709420681,
-0.09698860347270966,
-0.025668298825621605,
-0.08645659685134888,
0.04229143261909485,
0.09025189280509949,
0.06171484664082527,
-0.10474032163619995,
-0.07818260788917542,
0.09154386818408966,
0.07635575532913208,
-0.06922173500061035,
0.039830777794122696,
-0.06529497355222702,
0.044472988694906235,
-0.016702212393283844,
-0.03679250180721283,
-0.17506472766399384,
-0.024978457018733025,
-0.021771812811493874,
0.034686796367168427,
0.029212797060608864,
0.021840563043951988,
0.0918044000864029,
0.08847380429506302,
-0.07230006903409958,
-0.024881964549422264,
-0.06549447774887085,
0.0022977262269705534,
-0.12209254503250122,
-0.22743944823741913,
-0.04419347271323204,
-0.007545523811131716,
0.09069961309432983,
-0.21444758772850037,
0.024340828880667686,
0.022555870935320854,
0.08911862969398499,
0.025102872401475906,
-0.031155860051512718,
-0.05332503095269203,
0.07699013501405716,
-0.010134764015674591,
-0.06550844758749008,
0.06988680362701416,
-0.0056714341044425964,
-0.06817638128995895,
-0.056383468210697174,
-0.11230379343032837,
0.16403992474079132,
0.1331801563501358,
-0.14674654603004456,
-0.0918625071644783,
-0.013985667377710342,
-0.0640120878815651,
-0.0333138071000576,
-0.04305218160152435,
0.03874405845999718,
0.1788344532251358,
-0.0002677168231457472,
0.13266460597515106,
-0.061546459794044495,
-0.0350022092461586,
0.02906111441552639,
-0.02722392976284027,
0.028192197903990746,
0.12883315980434418,
0.12493351101875305,
-0.0642624944448471,
0.12368166446685791,
0.12396900355815887,
-0.08083552122116089,
0.15006554126739502,
-0.033385615795850754,
-0.08015941828489304,
-0.017216235399246216,
-0.014320533722639084,
-0.008187088184058666,
0.17701038718223572,
-0.15018023550510406,
-0.016390357166528702,
-0.004461516160517931,
0.013593021780252457,
0.015381401404738426,
-0.25050392746925354,
-0.056337445974349976,
0.03881708160042763,
-0.044651735574007034,
-0.00926507730036974,
-0.023117568343877792,
-0.003986579366028309,
0.10449608415365219,
-0.007289177272468805,
-0.0753358006477356,
0.0013512653531506658,
-0.0073913466185331345,
-0.04895378276705742,
0.2074557989835739,
-0.05837719142436981,
-0.11970783025026321,
-0.0895998626947403,
-0.07626514136791229,
-0.03666219487786293,
0.0025776957627385855,
0.05860929936170578,
-0.10751813650131226,
-0.018485598266124725,
-0.059262312948703766,
0.017765309661626816,
0.007086309604346752,
0.03582917898893356,
-0.0010089564602822065,
-0.008640148676931858,
0.05668450519442558,
-0.0971149355173111,
-0.009685374796390533,
-0.06608551740646362,
-0.053390663117170334,
0.05445463955402374,
0.05849944055080414,
0.14778108894824982,
0.1347694993019104,
-0.02583364024758339,
0.019826622679829597,
-0.03356350585818291,
0.25934672355651855,
-0.09626150876283646,
-0.026832615956664085,
0.1185842901468277,
-0.01127053052186966,
0.05623108148574829,
0.1067696362733841,
0.0827367901802063,
-0.10919398069381714,
-0.0018074919935315847,
0.063935786485672,
-0.052085477858781815,
-0.1553507000207901,
-0.01570393145084381,
-0.058498919010162354,
-0.02952396869659424,
0.07645037770271301,
0.027644522488117218,
-0.0005509888869710267,
0.055298950523138046,
0.04738091304898262,
0.04222806170582771,
-0.022977497428655624,
0.050678979605436325,
0.08906951546669006,
0.03248363360762596,
0.11002167314291,
-0.04455726593732834,
-0.06593197584152222,
0.031019434332847595,
0.0027367109432816505,
0.2437392920255661,
-0.0158351082354784,
0.0965699777007103,
0.07310114800930023,
0.1629893034696579,
-0.011679907329380512,
0.048217449337244034,
-0.015909338369965553,
-0.06845609843730927,
-0.018871212378144264,
-0.04426874965429306,
-0.017078138887882233,
0.010171747766435146,
-0.051949501037597656,
0.03927450627088547,
-0.12558074295520782,
0.009040136821568012,
0.06802196800708771,
0.24910080432891846,
0.028212234377861023,
-0.3177371621131897,
-0.06550715863704681,
-0.006419237703084946,
-0.011601245030760765,
-0.009368089959025383,
0.006290399003773928,
0.15154822170734406,
-0.08207935094833374,
0.056600648909807205,
-0.0854916200041771,
0.08616773039102554,
-0.03596021234989166,
0.05108534172177315,
0.07746446132659912,
0.07374673336744308,
-0.003997485619038343,
0.05639810487627983,
-0.2854348123073578,
0.3021954894065857,
0.0018290458247065544,
0.0844842866063118,
-0.06451708823442459,
-0.03245364874601364,
0.03255452960729599,
0.08218684792518616,
0.0862865075469017,
-0.015558782033622265,
-0.020373491570353508,
-0.2145521491765976,
-0.02151002548635006,
0.030844321474432945,
0.12958677113056183,
-0.017364023253321648,
0.10302693396806717,
-0.009590191766619682,
-0.006017274688929319,
0.07404810935258865,
-0.00078987778397277,
-0.03486090898513794,
-0.0904926061630249,
-0.026160476729273796,
-0.024492409080266953,
-0.047999307513237,
-0.058972690254449844,
-0.10624150931835175,
-0.11552219837903976,
0.11203731596469879,
0.018825067207217216,
-0.01317568589001894,
-0.12102724611759186,
0.09884462505578995,
0.07937050610780716,
-0.07561321556568146,
0.04199516028165817,
0.03172530233860016,
0.05749772489070892,
0.032229747623205185,
-0.057488393038511276,
0.11801213026046753,
-0.05990540608763695,
-0.16055525839328766,
-0.055658452212810516,
0.09156057238578796,
0.05139395594596863,
0.057335928082466125,
-0.023601356893777847,
0.01574804075062275,
-0.01702164299786091,
-0.09211086481809616,
0.05349208042025566,
-0.04241287335753441,
0.06254646927118301,
0.009264731779694557,
-0.018936846405267715,
0.053364790976047516,
-0.05597256124019623,
-0.012699726037681103,
0.14638306200504303,
0.28513506054878235,
-0.08905017375946045,
0.01342394296079874,
0.015964359045028687,
-0.06538612395524979,
-0.19000448286533356,
0.07955155521631241,
0.05859869718551636,
-0.00005751884600613266,
0.08637966960668564,
-0.1663971096277237,
0.09852157533168793,
0.10411889851093292,
-0.000057413955801166594,
0.11399456858634949,
-0.3669925630092621,
-0.12837199866771698,
0.07960257679224014,
0.19012223184108734,
0.07559841871261597,
-0.15421649813652039,
0.0009688400314189494,
-0.002033200114965439,
-0.14825552701950073,
0.09130217880010605,
-0.0780077576637268,
0.1355607956647873,
-0.020395882427692413,
0.08504746109247208,
0.01619894802570343,
-0.061725325882434845,
0.12291354686021805,
-0.00456562265753746,
0.1403384506702423,
-0.06864003837108612,
-0.03862601891160011,
0.05441322922706604,
-0.03759187087416649,
-0.01317577250301838,
-0.04640738293528557,
0.02732822485268116,
-0.06128140911459923,
-0.012215357273817062,
-0.10521700978279114,
0.013581590726971626,
-0.039001017808914185,
-0.06668572127819061,
-0.046039871871471405,
0.04357583820819855,
0.04460809752345085,
-0.0033130552619695663,
0.15479449927806854,
-0.010639186948537827,
0.11508552730083466,
0.05015096440911293,
0.05995767191052437,
-0.06382566690444946,
-0.10589136928319931,
-0.01838368922472,
0.009097598493099213,
0.04811302199959755,
-0.13342462480068207,
0.015326726250350475,
0.1531793177127838,
0.05039472132921219,
0.12185995280742645,
0.0865125060081482,
-0.032498639076948166,
0.03175705671310425,
0.06932219117879868,
-0.1568727344274521,
-0.11310319602489471,
0.0011172330705448985,
-0.06492513418197632,
-0.07257074117660522,
0.05333353206515312,
0.07690292596817017,
-0.07568442821502686,
0.01218703668564558,
-0.006318105850368738,
0.0053465478122234344,
-0.06801267713308334,
0.20541970431804657,
0.05531667172908783,
0.041622113436460495,
-0.1036776453256607,
0.07279457151889801,
0.018411224707961082,
-0.08750921487808228,
-0.002308572642505169,
0.09208808839321136,
-0.06925085932016373,
-0.025191674008965492,
0.08091219514608383,
0.19003531336784363,
-0.07804623991250992,
-0.02309727482497692,
-0.14961956441402435,
-0.105857715010643,
0.06931725889444351,
0.186893031001091,
0.10050079226493835,
-0.00768991420045495,
-0.05220440402626991,
0.047207582741975784,
-0.11697019636631012,
0.0768924430012703,
0.024076569825410843,
0.08142545819282532,
-0.14971458911895752,
0.1811119168996811,
0.011487024836242199,
0.05469726026058197,
-0.02626340091228485,
0.03217598795890808,
-0.11936581134796143,
0.04047144949436188,
-0.11329479515552521,
-0.036771953105926514,
-0.015711015090346336,
0.004845113959163427,
-0.013893096707761288,
-0.06239005923271179,
-0.06276945769786835,
0.005082677584141493,
-0.12740683555603027,
-0.022364625707268715,
0.045550212264060974,
0.022588133811950684,
-0.12629267573356628,
-0.03886046260595322,
0.02745879255235195,
-0.06322000175714493,
0.05577126145362854,
0.035765569657087326,
0.014542865566909313,
0.06629293411970139,
-0.17224864661693573,
-0.02202991209924221,
0.06969162821769714,
-0.006774276029318571,
0.06285475939512253,
-0.035233449190855026,
-0.026241319254040718,
-0.0293502826243639,
0.08732601255178452,
0.01266384869813919,
0.061062224209308624,
-0.13707832992076874,
0.005039793439209461,
-0.03328782320022583,
-0.09327297657728195,
-0.05868352949619293,
0.05333911255002022,
0.061394982039928436,
0.03646638244390488,
0.16184620559215546,
-0.08299855887889862,
0.044320475310087204,
-0.21923847496509552,
-0.0163169763982296,
0.0020274538546800613,
-0.10760428756475449,
-0.08197945356369019,
-0.07211071252822876,
0.08307691663503647,
-0.07504528015851974,
0.11072805523872375,
0.036703549325466156,
0.06599593907594681,
0.031265392899513245,
-0.03308309614658356,
-0.0029485945124179125,
0.03387054428458214,
0.21230904757976532,
0.012027048505842686,
-0.032397668808698654,
0.09026296436786652,
0.07964222878217697,
0.09965947270393372,
0.1381053477525711,
0.2267380952835083,
0.1549333930015564,
-0.02653668262064457,
0.089117132127285,
0.05123714357614517,
-0.06471620500087738,
-0.17233321070671082,
0.03578842058777809,
-0.05129331722855568,
0.09767045825719833,
-0.06195820868015289,
0.20171283185482025,
0.0875607579946518,
-0.18375727534294128,
0.06577827036380768,
-0.04527178779244423,
-0.10189516097307205,
-0.08075357973575592,
-0.0383853018283844,
-0.07011755555868149,
-0.14938205480575562,
0.026120152324438095,
-0.10287898033857346,
0.04356784000992775,
0.15021295845508575,
0.010228297673165798,
-0.012655959464609623,
0.2140616625547409,
0.033855780959129333,
0.03627445548772812,
0.05768121778964996,
0.014813564717769623,
-0.028669752180576324,
-0.09245344251394272,
-0.06057759001851082,
0.017766166478395462,
-0.030833978205919266,
0.018237173557281494,
-0.06945143640041351,
-0.07720036059617996,
0.02684912085533142,
0.005890588741749525,
-0.09370170533657074,
0.024110222235322,
0.02115786261856556,
0.09061791002750397,
0.02681778185069561,
0.005865046754479408,
0.01642012968659401,
-0.028914494439959526,
0.2478397786617279,
-0.09309648722410202,
-0.08174651116132736,
-0.0815243124961853,
0.2189643383026123,
0.03225523233413696,
0.0009209680138155818,
0.008748773485422134,
-0.08171188831329346,
0.009967570193111897,
0.22951556742191315,
0.17165836691856384,
-0.1316705048084259,
-0.010039259679615498,
0.00017556168313603848,
0.0011848146095871925,
-0.030105551704764366,
0.11803902685642242,
0.1211584061384201,
0.05255655571818352,
-0.11442896723747253,
-0.052648089826107025,
-0.052683550864458084,
-0.019061902537941933,
-0.02703198418021202,
0.049321483820676804,
0.06710880994796753,
0.021521955728530884,
-0.06972035020589828,
0.07587417960166931,
-0.05898117646574974,
-0.1417282521724701,
0.10520590841770172,
-0.227431520819664,
-0.1562754362821579,
-0.0075513338670134544,
0.12149787694215775,
0.0036339014768600464,
0.05950748547911644,
-0.0416368767619133,
0.0016176262870430946,
0.049279309809207916,
-0.004903750494122505,
-0.07828832417726517,
-0.10440375655889511,
0.08430149406194687,
-0.11540862917900085,
0.2169649302959442,
-0.05928150936961174,
0.035032033920288086,
0.11344976723194122,
0.06377062946557999,
-0.04988444223999977,
0.056165602058172226,
0.04202229902148247,
-0.12509214878082275,
-0.004481189418584108,
0.12499622255563736,
-0.03820042684674263,
0.05529504641890526,
0.03262198343873024,
-0.13330787420272827,
0.03282380849123001,
-0.05601627379655838,
-0.04050467535853386,
-0.02716980315744877,
-0.050051067024469376,
-0.06377048790454865,
0.11575490981340408,
0.20849032700061798,
-0.008310714736580849,
0.02412879280745983,
-0.08693430572748184,
0.016328925266861916,
0.06596571952104568,
0.047013361006975174,
-0.07864978164434433,
-0.21639759838581085,
0.007063151802867651,
0.0700518786907196,
-0.0427006334066391,
-0.20340843498706818,
-0.11198334395885468,
0.03682626411318779,
-0.05475519597530365,
-0.07167350500822067,
0.09025808423757553,
0.090310238301754,
0.055267900228500366,
-0.054465629160404205,
-0.10541201382875443,
-0.05904494971036911,
0.1702825278043747,
-0.1475851684808731,
-0.07703559845685959
] |
null | null | transformers |
# dropoff-utcustom-train-SF-RGBD-b0_7
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2075
- Mean Iou: 0.6372
- Mean Accuracy: 0.6861
- Overall Accuracy: 0.9647
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3822
- Accuracy Undropoff: 0.9900
- Iou Unlabeled: nan
- Iou Dropoff: 0.3104
- Iou Undropoff: 0.9641
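
For reference, a minimal inference sketch is shown below. It assumes this repository ships the standard SegFormer image-processor config alongside the weights (if not, the `nvidia/mit-b0` processor can be used instead); the input file name is a placeholder.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo = "sam1120/dropoff-utcustom-train-SF-RGBD-b0_7"
processor = SegformerImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.open("example_frame.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits   # shape (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)[0]        # per-pixel class ids at reduced resolution
```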
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a short learning-rate schedule sketch follows the list):
- learning_rate: 8e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
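
To make the schedule concrete: the results table below shows 240 optimizer steps over the 120 epochs, so the 0.05 warmup ratio corresponds to 12 warmup steps. The stand-alone sketch below uses stand-in parameters for illustration only.

```python
import torch
from transformers import get_linear_schedule_with_warmup

total_steps = 240                               # 120 epochs x 2 optimizer steps/epoch (from the table below)
warmup_steps = int(0.05 * total_steps)          # lr_scheduler_warmup_ratio 0.05 -> 12 steps

params = [torch.nn.Parameter(torch.zeros(1))]   # stand-in parameters, for illustration only
optimizer = torch.optim.AdamW(params, lr=8e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)

lrs = []
for _ in range(total_steps):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
# lrs ramps linearly from 0 to 8e-5 over the first 12 steps, then decays linearly back to 0.
```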
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.9508 | 5.0 | 10 | 1.0263 | 0.3104 | 0.5474 | 0.8717 | nan | 0.1937 | 0.9011 | 0.0 | 0.0605 | 0.8706 |
| 0.7814 | 10.0 | 20 | 0.7568 | 0.4971 | 0.5339 | 0.9361 | nan | 0.0952 | 0.9726 | nan | 0.0584 | 0.9359 |
| 0.642 | 15.0 | 30 | 0.5907 | 0.5134 | 0.5443 | 0.9494 | nan | 0.1026 | 0.9861 | nan | 0.0777 | 0.9492 |
| 0.5118 | 20.0 | 40 | 0.4804 | 0.3658 | 0.5923 | 0.9513 | nan | 0.2006 | 0.9839 | 0.0 | 0.1464 | 0.9509 |
| 0.4581 | 25.0 | 50 | 0.4405 | 0.3715 | 0.5915 | 0.9569 | nan | 0.1930 | 0.9900 | 0.0 | 0.1578 | 0.9565 |
| 0.4213 | 30.0 | 60 | 0.4146 | 0.3828 | 0.6136 | 0.9580 | nan | 0.2379 | 0.9892 | 0.0 | 0.1910 | 0.9575 |
| 0.3571 | 35.0 | 70 | 0.3750 | 0.3846 | 0.6180 | 0.9578 | nan | 0.2474 | 0.9887 | 0.0 | 0.1963 | 0.9574 |
| 0.3205 | 40.0 | 80 | 0.3478 | 0.5777 | 0.6202 | 0.9576 | nan | 0.2522 | 0.9882 | nan | 0.1982 | 0.9571 |
| 0.3114 | 45.0 | 90 | 0.3461 | 0.3895 | 0.6423 | 0.9541 | nan | 0.3022 | 0.9824 | 0.0 | 0.2150 | 0.9535 |
| 0.2747 | 50.0 | 100 | 0.3253 | 0.5875 | 0.6357 | 0.9575 | nan | 0.2847 | 0.9867 | nan | 0.2180 | 0.9570 |
| 0.2593 | 55.0 | 110 | 0.3083 | 0.5967 | 0.6599 | 0.9552 | nan | 0.3377 | 0.9820 | nan | 0.2387 | 0.9546 |
| 0.2293 | 60.0 | 120 | 0.2762 | 0.5966 | 0.6389 | 0.9606 | nan | 0.2880 | 0.9898 | nan | 0.2331 | 0.9601 |
| 0.2306 | 65.0 | 130 | 0.2655 | 0.6016 | 0.6587 | 0.9577 | nan | 0.3326 | 0.9848 | nan | 0.2462 | 0.9571 |
| 0.2118 | 70.0 | 140 | 0.2446 | 0.6039 | 0.6509 | 0.9605 | nan | 0.3133 | 0.9886 | nan | 0.2479 | 0.9600 |
| 0.2038 | 75.0 | 150 | 0.2395 | 0.6164 | 0.6708 | 0.9607 | nan | 0.3547 | 0.9870 | nan | 0.2727 | 0.9601 |
| 0.1895 | 80.0 | 160 | 0.2196 | 0.6254 | 0.6721 | 0.9636 | nan | 0.3542 | 0.9900 | nan | 0.2878 | 0.9630 |
| 0.1681 | 85.0 | 170 | 0.2176 | 0.6302 | 0.6829 | 0.9630 | nan | 0.3773 | 0.9884 | nan | 0.2979 | 0.9624 |
| 0.1612 | 90.0 | 180 | 0.2175 | 0.6334 | 0.6870 | 0.9633 | nan | 0.3857 | 0.9884 | nan | 0.3042 | 0.9627 |
| 0.1545 | 95.0 | 190 | 0.2140 | 0.6337 | 0.6816 | 0.9644 | nan | 0.3732 | 0.9900 | nan | 0.3035 | 0.9638 |
| 0.1551 | 100.0 | 200 | 0.2134 | 0.6357 | 0.6891 | 0.9637 | nan | 0.3896 | 0.9886 | nan | 0.3083 | 0.9631 |
| 0.1508 | 105.0 | 210 | 0.2090 | 0.6359 | 0.6865 | 0.9642 | nan | 0.3837 | 0.9894 | nan | 0.3083 | 0.9636 |
| 0.1536 | 110.0 | 220 | 0.2057 | 0.6346 | 0.6801 | 0.9650 | nan | 0.3694 | 0.9908 | nan | 0.3048 | 0.9644 |
| 0.1392 | 115.0 | 230 | 0.2083 | 0.6387 | 0.6890 | 0.9646 | nan | 0.3883 | 0.9896 | nan | 0.3133 | 0.9640 |
| 0.1446 | 120.0 | 240 | 0.2075 | 0.6372 | 0.6861 | 0.9647 | nan | 0.3822 | 0.9900 | nan | 0.3104 | 0.9641 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b0_7", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b0_7 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:53:02+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b0\_7
====================================
This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2075
* Mean Iou: 0.6372
* Mean Accuracy: 0.6861
* Overall Accuracy: 0.9647
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3822
* Accuracy Undropoff: 0.9900
* Iou Unlabeled: nan
* Iou Dropoff: 0.3104
* Iou Undropoff: 0.9641
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 8e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 8e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10688198357820511,
0.03653906285762787,
-0.0019876942969858646,
0.11651095002889633,
0.17239417135715485,
0.028636841103434563,
0.11559607833623886,
0.115211620926857,
-0.10894643515348434,
0.03108523227274418,
0.10397994518280029,
0.14233733713626862,
0.016241926699876785,
0.09403114020824432,
-0.019902048632502556,
-0.30677106976509094,
-0.025592254474759102,
0.032230716198682785,
-0.08399885147809982,
0.12485236674547195,
0.06511472910642624,
-0.16295339167118073,
0.09193342924118042,
-0.0034483156632632017,
-0.22024710476398468,
0.015738651156425476,
-0.0050399876199662685,
-0.03107568807899952,
0.16017043590545654,
0.023939212784171104,
0.11626701056957245,
0.009261652827262878,
0.11397325247526169,
-0.20249953866004944,
0.017709825187921524,
0.05624145641922951,
-0.004485060926526785,
0.06759331375360489,
0.06611405313014984,
0.0021820294205099344,
0.15244555473327637,
-0.10555457323789597,
0.06754391640424728,
0.002190707251429558,
-0.14472626149654388,
-0.21205687522888184,
-0.07400350272655487,
0.02171212062239647,
0.07814771682024002,
0.09599527716636658,
-0.005232831463217735,
0.11305727064609528,
-0.09116499871015549,
0.11351492255926132,
0.26952770352363586,
-0.24262161552906036,
-0.08584070950746536,
0.03813813999295235,
0.002210320672020316,
0.06740715354681015,
-0.1323588639497757,
0.008276996202766895,
0.032952845096588135,
0.04700843617320061,
0.11712255328893661,
-0.03307752683758736,
-0.10037209838628769,
0.026670752093195915,
-0.13861772418022156,
-0.03364162519574165,
0.054207801818847656,
0.0531088188290596,
-0.020304441452026367,
-0.032298553735017776,
-0.06815612316131592,
-0.18276458978652954,
-0.0663030743598938,
0.012981424108147621,
0.06636647880077362,
-0.059749674052000046,
-0.11355402320623398,
-0.015510715544223785,
-0.11017600446939468,
-0.08508671820163727,
-0.049485944211483,
0.1274702250957489,
0.03429650142788887,
0.019191378727555275,
-0.03484642133116722,
0.12668026983737946,
-0.02773442491889,
-0.1398511528968811,
0.017464827746152878,
0.03013869747519493,
-0.04249919578433037,
-0.03179987892508507,
-0.04946302995085716,
-0.06498194485902786,
-0.012907957658171654,
0.1086874008178711,
-0.06036284565925598,
0.06764314323663712,
0.03612072393298149,
0.051229529082775116,
-0.1141384169459343,
0.1905900537967682,
-0.06724415719509125,
-0.008354321122169495,
-0.03717796131968498,
0.05823054164648056,
0.003781732404604554,
-0.022669661790132523,
-0.10538602620363235,
0.004491383675485849,
0.07015682011842728,
-0.008948001079261303,
-0.08734467625617981,
0.06968463212251663,
-0.03960869833827019,
-0.011912056244909763,
0.0007122182287275791,
-0.07580918073654175,
0.045782823115587234,
-0.0006817637477070093,
-0.08350982517004013,
-0.029157595708966255,
0.05168670415878296,
0.01529520470649004,
0.013430440798401833,
0.1659192144870758,
-0.08620765060186386,
0.06348631531000137,
-0.1130705252289772,
-0.10012050718069077,
0.0006956008146516979,
-0.0870111882686615,
0.03623730316758156,
-0.07822422683238983,
-0.1491853892803192,
-0.009440447203814983,
0.07177142798900604,
-0.03986455500125885,
0.003189526265487075,
-0.05262012407183647,
-0.09011242538690567,
0.002615709789097309,
-0.008758100681006908,
0.16250267624855042,
-0.06530513614416122,
0.12276913225650787,
0.037628691643476486,
0.07225261628627777,
-0.06524758040904999,
0.03972683846950531,
-0.08491454273462296,
0.019620342180132866,
-0.22060072422027588,
0.04275904595851898,
-0.05027075484395027,
0.06802836060523987,
-0.060005415230989456,
-0.12214721739292145,
0.008285808376967907,
0.0027801895048469305,
0.09192229062318802,
0.10585800558328629,
-0.22399337589740753,
-0.07587088644504547,
0.14894963800907135,
-0.07307996600866318,
-0.09940942376852036,
0.11386564373970032,
-0.06362242996692657,
0.011474110186100006,
0.06129897013306618,
0.1983506828546524,
0.05372036620974541,
-0.13744762539863586,
0.022915653884410858,
-0.01577732339501381,
0.04954635724425316,
-0.028177542611956596,
0.05081452429294586,
0.021979697048664093,
0.0882234126329422,
0.019194990396499634,
-0.06558067351579666,
0.06817932426929474,
-0.12435249984264374,
-0.09696943312883377,
-0.025917358696460724,
-0.08651211857795715,
0.04172594100236893,
0.09031621366739273,
0.06123506277799606,
-0.10463245958089828,
-0.07846542447805405,
0.09165441244840622,
0.07608219236135483,
-0.06909333169460297,
0.039707034826278687,
-0.06548962742090225,
0.043910443782806396,
-0.016903666779398918,
-0.0369473472237587,
-0.17527778446674347,
-0.024900227785110474,
-0.02225329913198948,
0.033952128142118454,
0.028930863365530968,
0.02223893254995346,
0.09135854244232178,
0.0889480784535408,
-0.07131578773260117,
-0.024371914565563202,
-0.06492765247821808,
0.002373134484514594,
-0.12223345041275024,
-0.2279203236103058,
-0.04362659528851509,
-0.0076344250701367855,
0.08791347593069077,
-0.2132841795682907,
0.02430352009832859,
0.023219384253025055,
0.08830014616250992,
0.02559080347418785,
-0.03137360140681267,
-0.05285428464412689,
0.07710621505975723,
-0.010925707407295704,
-0.06567353010177612,
0.0703139677643776,
-0.005393281113356352,
-0.06790804117918015,
-0.055831070989370346,
-0.11282509565353394,
0.16406425833702087,
0.13333728909492493,
-0.14799368381500244,
-0.09199893474578857,
-0.012893875129520893,
-0.0640660896897316,
-0.03322582319378853,
-0.04311143606901169,
0.03914760425686836,
0.17985592782497406,
0.00011809038551291451,
0.13337408006191254,
-0.06071978807449341,
-0.03409799933433533,
0.029087916016578674,
-0.02720559388399124,
0.027698170393705368,
0.12904509902000427,
0.12621191143989563,
-0.0604666993021965,
0.12384507805109024,
0.12309633195400238,
-0.08085411041975021,
0.1486964076757431,
-0.03339635208249092,
-0.08035334199666977,
-0.01726088486611843,
-0.01432306133210659,
-0.00769517756998539,
0.17651164531707764,
-0.15099366009235382,
-0.01734057255089283,
-0.004149631131440401,
0.013642407022416592,
0.015466049313545227,
-0.250916063785553,
-0.05613907799124718,
0.0397733710706234,
-0.04448983818292618,
-0.008989253081381321,
-0.024172311648726463,
-0.004494678229093552,
0.10440003871917725,
-0.00785493478178978,
-0.07497568428516388,
0.0010939809726551175,
-0.007655345369130373,
-0.049310795962810516,
0.2070010006427765,
-0.058081500232219696,
-0.12020906805992126,
-0.09026594460010529,
-0.07742790877819061,
-0.035915497690439224,
0.003041956340894103,
0.0591307170689106,
-0.10838299244642258,
-0.018633564934134483,
-0.059220150113105774,
0.01842261105775833,
0.007692036684602499,
0.03613244742155075,
-0.001162659260444343,
-0.008312937803566456,
0.05648522824048996,
-0.09692550450563431,
-0.00989696104079485,
-0.06552489846944809,
-0.05262990668416023,
0.054758310317993164,
0.059347208589315414,
0.14897564053535461,
0.13530437648296356,
-0.0255984328687191,
0.019709277898073196,
-0.0325639508664608,
0.2586749494075775,
-0.09583822637796402,
-0.027375178411602974,
0.11892783641815186,
-0.012905166484415531,
0.05651063844561577,
0.10659842193126678,
0.08268172293901443,
-0.10930521041154861,
-0.0017602815059944987,
0.06383565068244934,
-0.05218963697552681,
-0.15548615157604218,
-0.01540535781532526,
-0.05826069042086601,
-0.030147451907396317,
0.07710491120815277,
0.027445418760180473,
-0.002628583926707506,
0.0557408444583416,
0.048225708305835724,
0.042823899537324905,
-0.024356098845601082,
0.05005079507827759,
0.08792537450790405,
0.03239109367132187,
0.10924874246120453,
-0.044474754482507706,
-0.06617734581232071,
0.03190938010811806,
0.0030062051955610514,
0.24450409412384033,
-0.01610560715198517,
0.09798083454370499,
0.07375004142522812,
0.16196469962596893,
-0.011960864067077637,
0.047864604741334915,
-0.01632845215499401,
-0.06792652606964111,
-0.01894477754831314,
-0.04422096535563469,
-0.017372533679008484,
0.010427874512970448,
-0.05290587246417999,
0.0391668975353241,
-0.12654295563697815,
0.010300243273377419,
0.06784266978502274,
0.2500917315483093,
0.02809820882976055,
-0.31819701194763184,
-0.06554398685693741,
-0.006162228994071484,
-0.011025694198906422,
-0.009670835919678211,
0.0067055318504571915,
0.15301978588104248,
-0.08134353905916214,
0.05559884384274483,
-0.08560620993375778,
0.0861295536160469,
-0.03746950626373291,
0.05120972916483879,
0.07820367813110352,
0.07370933890342712,
-0.004319734871387482,
0.055880967527627945,
-0.2865883708000183,
0.3017168641090393,
0.001620374503545463,
0.08435773849487305,
-0.064304418861866,
-0.03195526823401451,
0.03303099051117897,
0.08266603201627731,
0.08662337809801102,
-0.015407705679535866,
-0.022004852071404457,
-0.21451444923877716,
-0.021488137543201447,
0.031063731759786606,
0.1287105530500412,
-0.017173022031784058,
0.1035948246717453,
-0.009757723659276962,
-0.005306027829647064,
0.0742693617939949,
-0.0015392001951113343,
-0.03339507430791855,
-0.09077829122543335,
-0.025894852355122566,
-0.024427296593785286,
-0.04903256520628929,
-0.058868542313575745,
-0.1064874455332756,
-0.1150006651878357,
0.11179830878973007,
0.017861904576420784,
-0.014447547495365143,
-0.12083854526281357,
0.09815336763858795,
0.07833965867757797,
-0.0757899358868599,
0.04160116985440254,
0.031815797090530396,
0.05802605301141739,
0.03208249807357788,
-0.05753142386674881,
0.1181529089808464,
-0.06018245965242386,
-0.15957868099212646,
-0.05636918917298317,
0.09072721004486084,
0.051702916622161865,
0.05709807947278023,
-0.024706101045012474,
0.01568504609167576,
-0.017565278336405754,
-0.09192129224538803,
0.05393679440021515,
-0.04255485162138939,
0.06287126243114471,
0.009454045444726944,
-0.019709523767232895,
0.05197237432003021,
-0.05623637139797211,
-0.012761510908603668,
0.14680416882038116,
0.28500640392303467,
-0.08923042565584183,
0.013350757770240307,
0.016548991203308105,
-0.06610107421875,
-0.19023257493972778,
0.07929395884275436,
0.058105308562517166,
0.00008998906560009345,
0.08741090446710587,
-0.16739872097969055,
0.09827789664268494,
0.10361597687005997,
0.00025991597794927657,
0.11594543606042862,
-0.36797022819519043,
-0.1282859444618225,
0.0808480754494667,
0.19107834994792938,
0.07701307535171509,
-0.15563741326332092,
0.0008620609296485782,
-0.0019507973920553923,
-0.1499459445476532,
0.09106019884347916,
-0.0757884681224823,
0.13535983860492706,
-0.019908027723431587,
0.08612962067127228,
0.016527820378541946,
-0.061601947993040085,
0.12268770486116409,
-0.0035482824314385653,
0.14045614004135132,
-0.06954234838485718,
-0.039871007204055786,
0.05258661136031151,
-0.03769446536898613,
-0.01323764305561781,
-0.047285471111536026,
0.027024615556001663,
-0.06233619153499603,
-0.012260675430297852,
-0.10454952716827393,
0.012756403535604477,
-0.03878261148929596,
-0.06711819767951965,
-0.04569168761372566,
0.04335353523492813,
0.045359447598457336,
-0.0032095080241560936,
0.1532445102930069,
-0.010446382686495781,
0.11403460800647736,
0.04868902266025543,
0.05887012928724289,
-0.06381969153881073,
-0.10749541223049164,
-0.01864701882004738,
0.009322031401097775,
0.04835577681660652,
-0.13471733033657074,
0.015616605058312416,
0.1533302664756775,
0.05029422044754028,
0.12227745354175568,
0.08609794825315475,
-0.031964290887117386,
0.032287679612636566,
0.06911054253578186,
-0.15714982151985168,
-0.11284763365983963,
0.0014580510323867202,
-0.06605178862810135,
-0.0725119411945343,
0.052269332110881805,
0.07732946425676346,
-0.0758899673819542,
0.012607252225279808,
-0.006741808261722326,
0.006234102416783571,
-0.0673806369304657,
0.20487940311431885,
0.05585683509707451,
0.04153952747583389,
-0.10360956937074661,
0.07323930412530899,
0.018787220120429993,
-0.08744876831769943,
-0.0016568107530474663,
0.09135906398296356,
-0.06917626410722733,
-0.025194313377141953,
0.08136427402496338,
0.19110935926437378,
-0.07766006141901016,
-0.023394934833049774,
-0.15034817159175873,
-0.1057891845703125,
0.06982511281967163,
0.18450860679149628,
0.10062385350465775,
-0.0069414786994457245,
-0.05230945348739624,
0.04705843701958656,
-0.11749081313610077,
0.07734023034572601,
0.02442292682826519,
0.08144150674343109,
-0.14956334233283997,
0.1826903074979782,
0.011796488426625729,
0.05500001460313797,
-0.02602897584438324,
0.03271137550473213,
-0.11898889392614365,
0.040465790778398514,
-0.11510283499956131,
-0.03613242134451866,
-0.01559432502835989,
0.004944286309182644,
-0.013556567020714283,
-0.062393318861722946,
-0.06288383156061172,
0.005439265631139278,
-0.12745097279548645,
-0.02235981449484825,
0.04565766826272011,
0.022781969979405403,
-0.12614010274410248,
-0.039114393293857574,
0.02763887494802475,
-0.06340490281581879,
0.05586554482579231,
0.035160649567842484,
0.01393399853259325,
0.0654783844947815,
-0.17323091626167297,
-0.022153202444314957,
0.06986036151647568,
-0.0071016703732311726,
0.0629255622625351,
-0.03506511449813843,
-0.025827208533883095,
-0.029711080715060234,
0.08710791170597076,
0.012160781770944595,
0.06174147501587868,
-0.13714313507080078,
0.005756697151809931,
-0.032425958663225174,
-0.09245076030492783,
-0.05904403701424599,
0.05291257053613663,
0.06170393154025078,
0.03640706092119217,
0.162594735622406,
-0.08286266773939133,
0.04484540969133377,
-0.21880501508712769,
-0.016264908015727997,
0.0020525846630334854,
-0.10712820291519165,
-0.08189409226179123,
-0.07252685725688934,
0.08262094110250473,
-0.07527562975883484,
0.11093328148126602,
0.03733457252383232,
0.06539850682020187,
0.03136250376701355,
-0.03325005993247032,
-0.004015207756310701,
0.033998697996139526,
0.21095997095108032,
0.01131194457411766,
-0.032635658979415894,
0.08908738940954208,
0.07929940521717072,
0.1002504751086235,
0.13668541610240936,
0.22738473117351532,
0.15567807853221893,
-0.02615639567375183,
0.08923304080963135,
0.05156822130084038,
-0.06485996395349503,
-0.17366081476211548,
0.03594633936882019,
-0.051819268614053726,
0.09878953546285629,
-0.0613417811691761,
0.20290838181972504,
0.08652909845113754,
-0.18303073942661285,
0.06638329476118088,
-0.045369233936071396,
-0.10119159519672394,
-0.08036218583583832,
-0.0369911789894104,
-0.06983669102191925,
-0.14850695431232452,
0.025630207732319832,
-0.10259095579385757,
0.04344462975859642,
0.15139463543891907,
0.009968074038624763,
-0.013301460072398186,
0.21381555497646332,
0.03368299826979637,
0.03610896319150925,
0.05807976797223091,
0.01468526478856802,
-0.02893110364675522,
-0.09268814325332642,
-0.06092822179198265,
0.0181112140417099,
-0.02936984784901142,
0.01819498836994171,
-0.06922300904989243,
-0.07682016491889954,
0.026964617893099785,
0.005330764688551426,
-0.09382299333810806,
0.023652121424674988,
0.02023838460445404,
0.09042894095182419,
0.026142094284296036,
0.006157137453556061,
0.016918236389756203,
-0.028539298102259636,
0.24675273895263672,
-0.09260448068380356,
-0.08138827979564667,
-0.08192320168018341,
0.2162843942642212,
0.031685929745435715,
0.0008708940003998578,
0.008506551384925842,
-0.08156098425388336,
0.009330871514976025,
0.229658842086792,
0.17099733650684357,
-0.13237234950065613,
-0.010739377699792385,
0.00029733075643889606,
0.0015809581382200122,
-0.02964968793094158,
0.11865117400884628,
0.12137974798679352,
0.05262230709195137,
-0.1141558513045311,
-0.05330583080649376,
-0.05290493369102478,
-0.018460705876350403,
-0.026645606383681297,
0.049578577280044556,
0.06712731719017029,
0.02246268279850483,
-0.06960747390985489,
0.07578609138727188,
-0.05839916691184044,
-0.14314568042755127,
0.10566197335720062,
-0.2278074026107788,
-0.1565326750278473,
-0.006992189213633537,
0.12235560268163681,
0.0043228790163993835,
0.059727102518081665,
-0.041665781289339066,
0.0011999376583844423,
0.050969064235687256,
-0.00496900686994195,
-0.07886825501918793,
-0.10354293882846832,
0.08373842388391495,
-0.11604058742523193,
0.2180321216583252,
-0.05904214084148407,
0.03368491679430008,
0.11293542385101318,
0.06384728103876114,
-0.05000444874167442,
0.05594494938850403,
0.04137132316827774,
-0.12399120628833771,
-0.00446027796715498,
0.12379956990480423,
-0.03845841437578201,
0.05380887910723686,
0.032907139509916306,
-0.13257251679897308,
0.03222835436463356,
-0.05664975568652153,
-0.0409175381064415,
-0.02695395052433014,
-0.05119253322482109,
-0.0637136772274971,
0.11526946723461151,
0.2085876315832138,
-0.008235901594161987,
0.02401254139840603,
-0.0869651511311531,
0.016591254621744156,
0.06546622514724731,
0.04880765452980995,
-0.07835188508033752,
-0.21556445956230164,
0.007447326090186834,
0.07053332030773163,
-0.041941456496715546,
-0.2045643925666809,
-0.11180566996335983,
0.03667713701725006,
-0.0545315258204937,
-0.07178668677806854,
0.09040222316980362,
0.09079393744468689,
0.056087128818035126,
-0.0550009123980999,
-0.10460945963859558,
-0.058491162955760956,
0.17066523432731628,
-0.1471264660358429,
-0.07748393714427948
] |
null | null | null |
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gymnasium as gym  # the course notebook may use `import gym` instead, depending on its version

# `load_from_hub` is assumed to be the helper defined in the accompanying course notebook
# (it downloads the pickle from the Hub and unpickles it); it is not a library import.
model = load_from_hub(repo_id="Ostfriese/q-taxi", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
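
A short, hedged usage continuation of the snippet above: the pickled dict produced by this card format typically stores the Q-table under a `"qtable"` key (an assumption here, as is the Gymnasium 5-tuple step API), so a greedy evaluation episode might look like this.

```python
import numpy as np

state, info = env.reset()
done, total_reward = False, 0
while not done:
    action = int(np.argmax(model["qtable"][state]))   # greedy action from the learned Q-table
    state, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    done = terminated or truncated
print("episode return:", total_reward)
```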
| {"tags": ["Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-taxi", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Taxi-v3", "type": "Taxi-v3"}, "metrics": [{"type": "mean_reward", "value": "7.56 +/- 2.71", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | Ostfriese/q-taxi | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | 2024-02-12T12:54:02+00:00 | [] | [] | TAGS
#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
|
# Q-Learning Agent playing Taxi-v3
This is a trained model of a Q-Learning agent playing Taxi-v3.
## Usage
| [
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
"TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n",
"# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
32,
33
] | [
"passage: TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage"
] | [
0.048862796276807785,
-0.16549694538116455,
-0.005485367961227894,
0.02960980497300625,
0.1345081776380539,
-0.01784728653728962,
0.11895976960659027,
0.07759871333837509,
-0.07461097836494446,
-0.055395450443029404,
0.1418241262435913,
0.09088201075792313,
0.055222880095243454,
0.05699880048632622,
0.09511256217956543,
-0.27440664172172546,
0.048217080533504486,
-0.02918700873851776,
0.05621987581253052,
0.11878681182861328,
0.0670095682144165,
-0.040441032499074936,
0.061956584453582764,
0.11818158626556396,
-0.1018151044845581,
-0.007344264071434736,
0.035402704030275345,
-0.09440053254365921,
0.17413531243801117,
0.07204403728246689,
0.12337774783372879,
0.05132639780640602,
0.179361954331398,
-0.12762396037578583,
0.024310702458024025,
-0.0010275895474478602,
-0.10138072073459625,
-0.03909514099359512,
-0.012415820732712746,
-0.08349097520112991,
0.03230205550789833,
0.23522862792015076,
0.07199250161647797,
0.06632792949676514,
-0.17707863450050354,
-0.06584878265857697,
-0.04375573247671127,
0.069611094892025,
0.14951466023921967,
0.03758616745471954,
-0.033800311386585236,
0.1684885323047638,
-0.2564343810081482,
0.05066783353686333,
0.037275806069374084,
-0.42313119769096375,
0.017119819298386574,
0.1507398933172226,
0.15090937912464142,
0.06909667700529099,
-0.10573802888393402,
0.013512322679162025,
0.051325585693120956,
-0.0005318621988408267,
0.024325110018253326,
0.006554204970598221,
0.15601307153701782,
0.08537693321704865,
-0.1487821787595749,
-0.058576688170433044,
0.17441977560520172,
-0.03788546845316887,
-0.02613203600049019,
-0.039745692163705826,
0.0067160045728087425,
-0.06427708268165588,
-0.004067842848598957,
-0.1777995079755783,
0.00734262028709054,
0.06666424125432968,
-0.014348524622619152,
0.014901017770171165,
-0.035522811114788055,
-0.0966939702630043,
-0.023098144680261612,
-0.08592145889997482,
0.01677769608795643,
-0.006319406442344189,
-0.10187895596027374,
0.05002119392156601,
-0.061138734221458435,
0.0014382408699020743,
-0.05123179033398628,
-0.15047866106033325,
-0.049055423587560654,
-0.03481535613536835,
0.1474713832139969,
-0.0044205985032022,
-0.01873963139951229,
-0.03164304047822952,
0.15474793314933777,
0.049551334232091904,
-0.05370146036148071,
0.05625450983643532,
0.07605006545782089,
0.23867930471897125,
0.10401605814695358,
0.10196955502033234,
-0.06798075139522552,
0.10180158913135529,
-0.12330973148345947,
-0.08915644884109497,
-0.17508824169635773,
0.11820860952138901,
0.00015364694991149008,
0.1317785084247589,
-0.12023144960403442,
0.07898581773042679,
-0.067511186003685,
0.013453764840960503,
0.01636839471757412,
0.0820009782910347,
-0.012399360537528992,
0.10676060616970062,
-0.005061192903667688,
-0.06941985338926315,
0.014177112840116024,
0.05935845896601677,
0.03754841163754463,
-0.038601722568273544,
-0.03192409873008728,
-0.05762290954589844,
-0.05065649375319481,
-0.10128600150346756,
-0.06447898596525192,
0.018573462963104248,
-0.007677143905311823,
-0.1833900660276413,
-0.06407523155212402,
0.00897200871258974,
0.015712225809693336,
-0.03988850116729736,
-0.05148044601082802,
-0.15265507996082306,
-0.042461175471544266,
-0.015450406819581985,
-0.03500641882419586,
-0.06214277446269989,
-0.0383245050907135,
0.046435944736003876,
-0.07560601085424423,
0.013364278711378574,
0.023342855274677277,
0.05405820533633232,
-0.025881100445985794,
0.06068144738674164,
-0.08357544988393784,
0.09493788331747055,
-0.1540430635213852,
-0.03271956741809845,
-0.025445878505706787,
-0.041183918714523315,
0.1752462536096573,
0.06099751964211464,
-0.015994304791092873,
0.15260063111782074,
-0.17141541838645935,
-0.058121129870414734,
0.15596486628055573,
0.008629098534584045,
-0.09967197477817535,
-0.003560945624485612,
-0.09397093951702118,
0.1428760588169098,
0.08571921288967133,
0.2478504776954651,
0.12005335837602615,
-0.22748184204101562,
0.055358242243528366,
0.12515293061733246,
-0.14365963637828827,
0.10365243256092072,
0.07344598323106766,
0.005470725707709789,
-0.18886831402778625,
-0.06843198090791702,
-0.06121627986431122,
0.1053021252155304,
-0.08522345870733261,
-0.0776243582367897,
0.09323626756668091,
-0.05086790770292282,
0.24641476571559906,
-0.028281206265091896,
0.06174173951148987,
-0.026681531220674515,
-0.1389324963092804,
-0.01723906397819519,
0.060955192893743515,
0.05258452147245407,
-0.024835573509335518,
-0.25895482301712036,
0.13646544516086578,
0.048650871962308884,
0.025074828416109085,
0.004106190986931324,
-0.05691491439938545,
0.016934165731072426,
0.1511998474597931,
0.020012924447655678,
0.13717477023601532,
0.027723990380764008,
0.0706823319196701,
-0.006239562761038542,
-0.10560829937458038,
-0.04169593006372452,
0.061916545033454895,
-0.08518962562084198,
-0.06641357392072678,
0.011197872459888458,
-0.06935211271047592,
-0.11783787608146667,
-0.12166737765073776,
-0.026334572583436966,
-0.02980303019285202,
-0.07444227486848831,
0.02368103712797165,
0.06536602973937988,
-0.06702698022127151,
-0.0023908785078674555,
0.007125476840883493,
-0.011537045240402222,
0.16434046626091003,
0.011393417604267597,
-0.007796820718795061,
0.1328643560409546,
-0.11533161997795105,
0.12461213022470474,
0.049438029527664185,
-0.024806302040815353,
-0.04662557691335678,
0.0014137453399598598,
-0.057529181241989136,
0.029044216498732567,
-0.04390640929341316,
0.02774495631456375,
0.20111067593097687,
0.02772962674498558,
0.11389166116714478,
-0.0656520202755928,
0.04385066404938698,
-0.007961965166032314,
-0.009693224914371967,
0.018563594669103622,
0.07608018070459366,
0.07813210040330887,
-0.1324140727519989,
0.02262016013264656,
0.22455167770385742,
0.1385764330625534,
0.18313980102539062,
-0.010877152904868126,
0.06325667351484299,
-0.04875868931412697,
0.027505528181791306,
0.024100203067064285,
0.10314226150512695,
-0.10732068121433258,
-0.0322517491877079,
-0.025407759472727776,
0.023599207401275635,
-0.08197105675935745,
-0.1055799350142479,
-0.090115025639534,
0.01222382951527834,
-0.03125503659248352,
-0.15570329129695892,
0.13300658762454987,
-0.10451057553291321,
0.01802753657102585,
0.04692702740430832,
-0.22163605690002441,
0.11530312895774841,
0.014291439205408096,
-0.10303618758916855,
0.11281087249517441,
-0.12051989883184433,
-0.08699832111597061,
-0.05777236074209213,
-0.18658851087093353,
0.05280197039246559,
0.04673841595649719,
0.05166793242096901,
-0.18521739542484283,
0.024835903197526932,
0.05545609071850777,
0.13426995277404785,
-0.09743253141641617,
-0.07142634689807892,
-0.15038461983203888,
0.016068490222096443,
-0.033661190420389175,
-0.16029728949069977,
-0.005609163548797369,
-0.032781440764665604,
-0.18849676847457886,
-0.04539939761161804,
-0.15086813271045685,
-0.034627582877874374,
0.20464378595352173,
0.026907702907919884,
0.09480511397123337,
-0.07926445454359055,
0.3802889585494995,
-0.042039383202791214,
-0.06146497279405594,
-0.01321389526128769,
-0.07072482258081436,
0.02512686513364315,
0.13271741569042206,
0.0036099457647651434,
-0.017886579036712646,
-0.0037857077550143003,
0.0024592927657067776,
-0.06234965845942497,
-0.13400450348854065,
0.0028710351325571537,
0.03905198723077774,
0.1874423623085022,
0.004639793653041124,
0.06659388542175293,
0.03133883699774742,
0.057546284049749374,
0.07748064398765564,
0.030926106497645378,
0.0011591583024710417,
-0.01591806672513485,
0.06604493409395218,
-0.11684755235910416,
0.042466625571250916,
-0.030429253354668617,
-0.10143838077783585,
-0.013183288276195526,
0.07950251549482346,
0.12755028903484344,
0.17849206924438477,
-0.04790908098220825,
0.17489230632781982,
0.13580141961574554,
0.16576050221920013,
0.049315933138132095,
-0.020801831036806107,
-0.08773037046194077,
-0.06118565797805786,
0.004774159751832485,
-0.031952597200870514,
0.04869702458381653,
0.3231290578842163,
0.037619613111019135,
-0.09036035090684891,
0.11149907857179642,
0.009480619803071022,
0.05359881371259689,
0.022797370329499245,
-0.11162138730287552,
0.11170321702957153,
0.07968773692846298,
-0.06341761350631714,
-0.07602835446596146,
0.16758501529693604,
-0.1109386757016182,
-0.26646625995635986,
-0.11410990357398987,
-0.012305386364459991,
0.07903840392827988,
0.005651174578815699,
0.05498376116156578,
-0.11829282343387604,
-0.16034497320652008,
-0.034191906452178955,
0.1335442066192627,
-0.3077351450920105,
0.2065143585205078,
-0.0198091771453619,
0.06707923114299774,
-0.039657969027757645,
-0.07026876509189606,
0.09694647043943405,
0.13174086809158325,
0.29124146699905396,
0.01396956667304039,
0.04841272905468941,
-0.15176129341125488,
-0.0976925864815712,
0.0018439020495861769,
0.015482662245631218,
-0.02563396655023098,
0.028520405292510986,
-0.0540912002325058,
0.008404579944908619,
-0.018086453899741173,
0.2102297693490982,
-0.11316607892513275,
0.004344627261161804,
-0.06968966871500015,
-0.11707738786935806,
0.19409789144992828,
-0.07178345322608948,
-0.04543264955282211,
-0.14959357678890228,
-0.15512511134147644,
-0.004174166824668646,
-0.02413962036371231,
-0.019664527848362923,
-0.17603960633277893,
-0.18804074823856354,
-0.05204557999968529,
-0.005645004566758871,
-0.003464865731075406,
0.05867868289351463,
-0.07517234236001968,
-0.04805335775017738,
0.1009904220700264,
-0.07743175327777863,
-0.056063808500766754,
-0.1103200614452362,
0.1391381323337555,
0.06248528137803078,
0.16743235290050507,
0.05907081440091133,
0.0006117874872870743,
0.11471151560544968,
-0.02913086675107479,
0.11103474348783493,
-0.11291708797216415,
-0.17145049571990967,
-0.08334989100694656,
-0.018775060772895813,
0.09519003331661224,
-0.04789286106824875,
0.0028788831550627947,
0.2550160884857178,
0.14880181849002838,
-0.0897710770368576,
0.27680760622024536,
0.04414956644177437,
-0.09375058114528656,
-0.18432219326496124,
-0.15961645543575287,
0.03759992495179176,
0.060025621205568314,
0.13095876574516296,
-0.057205069810152054,
-0.08483537286520004,
-0.08492398262023926,
-0.07478608191013336,
-0.13140805065631866,
-0.24232175946235657,
-0.030598774552345276,
0.22874866425991058,
0.08656918257474899,
0.08219650387763977,
-0.012482990510761738,
-0.01186054851859808,
0.00526038184762001,
0.02680150233209133,
0.12018456310033798,
-0.13341329991817474,
0.11107480525970459,
0.022198403254151344,
0.044267985969781876,
0.009712530300021172,
0.07929777354001999,
0.03375575691461563,
-0.003218587953597307,
-0.0006439819699153304,
-0.0988350659608841,
-0.2596651017665863,
0.0816885456442833,
-0.01623627357184887,
-0.09960969537496567,
0.014988959766924381,
0.02061903104186058,
-0.2089255303144455,
0.011128270998597145,
-0.019883770495653152,
-0.03150356933474541,
-0.06483490765094757,
-0.10664787143468857,
-0.056551624089479446,
0.04928823933005333,
0.10853826254606247,
0.011660109274089336,
0.05354316532611847,
-0.0404130220413208,
0.07917837053537369,
0.0826287642121315,
0.15132710337638855,
0.06795957684516907,
-0.190711110830307,
-0.10953907668590546,
-0.0414445661008358,
0.12121522426605225,
-0.12505418062210083,
0.036917757242918015,
0.053161121904850006,
-0.016534561291337013,
0.14621229469776154,
0.1070784479379654,
-0.07452095299959183,
0.11915595084428787,
0.08904775977134705,
-0.04094788804650307,
-0.23367151618003845,
-0.07120766490697861,
0.11133213341236115,
0.07195597887039185,
-0.03961895406246185,
0.018120890483260155,
-0.04960581287741661,
-0.013980977237224579,
0.048759616911411285,
-0.0538676381111145,
-0.07230538129806519,
0.004421027842909098,
0.1247575581073761,
0.1029362753033638,
-0.04655474051833153,
0.01296416949480772,
0.037371400743722916,
0.003788623260334134,
0.04730486497282982,
0.0407949760556221,
-0.08269952982664108,
-0.04124005511403084,
0.02782733179628849,
0.37552911043167114,
-0.010165480896830559,
-0.020456433296203613,
0.018555615097284317,
-0.19949445128440857,
0.09135842323303223,
0.13205479085445404,
0.04697350412607193,
0.004247748292982578,
-0.08139242231845856,
0.026877427473664284,
-0.010625290684401989,
0.09936143457889557,
-0.07806670665740967,
-0.05493134260177612,
-0.21631066501140594,
-0.025010565295815468,
0.017490221187472343,
0.24077683687210083,
-0.08458559215068817,
-0.12801732122898102,
-0.20628872513771057,
0.13128381967544556,
-0.11333390325307846,
-0.03695881739258766,
-0.024473199620842934,
0.03926658630371094,
-0.01989821158349514,
0.06291737407445908,
-0.0710630789399147,
0.006373001262545586,
-0.11024709790945053,
0.055267609655857086,
0.04204455390572548,
0.1229788213968277,
0.014207782223820686,
0.02016810141503811,
0.05822525918483734,
-0.01837925612926483,
0.07173580676317215,
-0.06203491613268852,
-0.04550490900874138,
0.14224006235599518,
-0.020255116745829582,
-0.04152837023139,
-0.0483345128595829,
-0.036874305456876755,
0.11981741338968277,
-0.05059147998690605,
-0.007141099311411381,
-0.054929375648498535,
-0.06906463205814362,
0.03462086617946625,
-0.009175732731819153,
-0.008798843249678612,
0.06801853328943253,
0.04024988040328026,
-0.026994358748197556,
0.005263668950647116,
0.03447828069329262,
-0.10330043733119965,
-0.04955084249377251,
0.16955432295799255,
-0.0749620869755745,
0.10274054110050201,
-0.031069839373230934,
0.018015999346971512,
0.005847334861755371,
-0.022399673238396645,
-0.015360680408775806,
-0.1457086056470871,
-0.06137600541114807,
-0.09489979594945908,
0.11565322428941727,
0.08146517723798752,
0.03358805552124977,
0.04274565726518631,
0.019532648846507072,
-0.04414922371506691,
-0.038583990186452866,
0.12961317598819733,
0.08133101463317871,
0.012996876612305641,
0.01137041300535202,
0.01941833831369877,
-0.020302120596170425,
0.0028480992186814547,
-0.01250747125595808,
-0.07239153981208801,
-0.05874783173203468,
0.09400010108947754,
0.1600283533334732,
-0.06127211079001427,
-0.13325586915016174,
-0.020593497902154922,
0.04988488554954529,
0.0014717020094394684,
-0.08777432143688202,
0.04833676666021347,
0.15805292129516602,
-0.05623878911137581,
0.03216489031910896,
-0.09984751045703888,
-0.07263360917568207,
-0.16060975193977356,
-0.10029061883687973,
-0.06092562898993492,
-0.28350353240966797,
0.09752398729324341,
0.006392303854227066,
-0.014731393195688725,
0.059529416263103485,
0.051305368542671204,
-0.052508849650621414,
0.07068239152431488,
-0.18146829307079315,
-0.007054794579744339,
0.03497592359781265,
-0.13212306797504425,
0.02475893869996071,
-0.2378365397453308,
0.10198072344064713,
-0.04623803123831749,
-0.1519704908132553,
-0.04004510119557381,
0.0641569048166275,
-0.09540136158466339,
-0.01822364516556263,
-0.0475153923034668,
-0.01922670193016529,
0.01624443754553795,
-0.009348669089376926,
-0.031147832050919533,
0.13716529309749603,
0.02827494591474533,
-0.03268734738230705,
0.005254602525383234,
0.0223685409873724,
0.03955082967877388,
-0.0969657450914383,
-0.05986930429935455,
0.08311155438423157,
-0.031056145206093788,
0.14728976786136627,
0.000341245875461027,
0.04181376099586487,
-0.06758682429790497,
0.2593761384487152,
0.2023983597755432,
-0.12479214370250702,
0.008118697442114353,
-0.021801479160785675,
0.012670028023421764,
-0.041751839220523834,
0.13110700249671936,
0.013386172242462635,
0.12186761200428009,
-0.17513342201709747,
-0.01036517322063446,
-0.0818324014544487,
-0.04501292482018471,
0.06702108681201935,
0.14714950323104858,
0.15742522478103638,
0.03436789661645889,
-0.07328428328037262,
0.06722653657197952,
-0.30119743943214417,
0.20540550351142883,
-0.1346001923084259,
-0.01498429011553526,
-0.040251150727272034,
-0.058389630168676376,
0.061147745698690414,
0.11309876292943954,
0.10832664370536804,
-0.021150551736354828,
-0.0905047357082367,
-0.04486766457557678,
-0.039378076791763306,
-0.13019338250160217,
-0.02718670479953289,
0.1654091775417328,
0.06799814850091934,
0.31520840525627136,
-0.017577875405550003,
0.07702425122261047,
0.034410297870635986,
0.06451138854026794,
0.004519328009337187,
0.09537279605865479,
0.07960964739322662,
-0.06345855444669724,
-0.07373003661632538,
-0.001637450186535716,
0.05033271387219429,
0.14567798376083374,
-0.03826142102479935,
-0.18691548705101013,
0.15858715772628784,
0.07192251086235046,
-0.13762691617012024,
-0.05777517706155777,
0.08409425616264343,
-0.0739973932504654,
0.0550808347761631,
0.08115427941083908,
0.015876613557338715,
-0.017793258652091026,
-0.004664506763219833,
0.06074233725667,
0.024694660678505898,
-0.02343848906457424,
0.003570882137864828,
-0.08337053656578064,
-0.04151543974876404,
0.07267895340919495,
-0.0844460055232048,
-0.20546193420886993,
-0.0957019031047821,
-0.07551700621843338,
0.030557552352547646,
-0.0649830624461174,
0.12575586140155792,
0.1717868149280548,
0.0593598335981369,
-0.03307248651981354,
-0.10721943527460098,
-0.035562749952077866,
0.07602505385875702,
-0.044773899018764496,
-0.09409699589014053
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank16
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8231
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
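The values above map directly onto a Transformers `TrainingArguments` object. A minimal sketch for reference (the `output_dir` name is only an assumption; the Adam betas/epsilon and the linear scheduler match the library defaults and are restated here for clarity):
```python
from transformers import TrainingArguments

# Restates the hyperparameters listed above; output_dir is an assumed name.
training_args = TrainingArguments(
    output_dir="deberta-v3-base-rank16",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```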
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.9103 | 1.0 | 179 | 8.8481 |
| 7.5502 | 2.0 | 358 | 5.4331 |
| 5.5677 | 3.0 | 537 | 4.8231 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank16", "results": []}]} | null | alitolga/deberta-v3-base-rank16 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T12:55:32+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank16
======================
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 4.8231
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | diffusers | ### my-rose-prp Dreambooth model trained by Priyanka004 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 962822106015
Sample pictures of this concept:

| {"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]} | text-to-image | Priyanka004/my-rose-prp | [
"diffusers",
"safetensors",
"NxtWave-GenAI-Webinar",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-12T12:56:24+00:00 | [] | [] | TAGS
#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| ### my-rose-prp Dreambooth model trained by Priyanka004 following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 962822106015
Sample pictures of this concept:
!0
| [
"### my-rose-prp Dreambooth model trained by Priyanka004 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 962822106015\n\nSample pictures of this concept:\n\n !0"
] | [
"TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"### my-rose-prp Dreambooth model trained by Priyanka004 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 962822106015\n\nSample pictures of this concept:\n\n !0"
] | [
73,
56
] | [
"passage: TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### my-rose-prp Dreambooth model trained by Priyanka004 following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: 962822106015\n\nSample pictures of this concept:\n\n !0"
] | [
-0.12228117138147354,
0.0922800824046135,
-0.0016109429998323321,
0.027936914935708046,
0.0684376135468483,
-0.03996383771300316,
0.1737826019525528,
-0.007999158464372158,
-0.005456462036818266,
0.04478279501199722,
0.1402447372674942,
0.07712545990943909,
0.016207892447710037,
0.17284344136714935,
-0.031703606247901917,
-0.13572783768177032,
0.04507136344909668,
0.08726625144481659,
0.015179364942014217,
0.07519181072711945,
0.08625978976488113,
-0.07827138900756836,
0.13974815607070923,
-0.025513438507914543,
-0.1508832722902298,
-0.01395483035594225,
-0.024287570267915726,
-0.04899701848626137,
0.07402662187814713,
0.04376189783215523,
0.06607003509998322,
0.10692556947469711,
0.03255767002701759,
-0.06014866754412651,
0.045953597873449326,
0.01109304465353489,
-0.046138886362314224,
0.05520512908697128,
0.02368391677737236,
0.0764121413230896,
0.1411103755235672,
0.0674251914024353,
-0.06204948574304581,
0.03844066709280014,
-0.08687539398670197,
-0.0006548706442117691,
0.023133033886551857,
0.14734701812267303,
0.1217406839132309,
0.09452833235263824,
0.008138573728501797,
0.11258519440889359,
0.06831899285316467,
0.10783810168504715,
0.11523646861314774,
-0.2832981050014496,
-0.10704311728477478,
0.17359356582164764,
0.0970410481095314,
0.040338534861803055,
-0.034277819097042084,
0.10162638872861862,
0.09567193686962128,
-0.010358973406255245,
0.03456578776240349,
-0.06483816355466843,
0.033833522349596024,
-0.08232805877923965,
-0.10403165966272354,
0.03018631599843502,
0.24336013197898865,
0.06209897622466087,
-0.024359147995710373,
-0.06116510555148125,
-0.10420890152454376,
-0.002922909567132592,
-0.04272576421499252,
-0.04860834404826164,
-0.06028991937637329,
0.026208844035863876,
-0.002885108580812812,
-0.05754014477133751,
-0.13304373621940613,
-0.053809843957424164,
-0.0031440446618944407,
0.12720072269439697,
0.005984982941299677,
0.06743663549423218,
-0.11926694959402084,
0.07383675128221512,
0.044000059366226196,
-0.11813493818044662,
0.01926233060657978,
-0.12377459555864334,
0.058890338987112045,
0.054326124489307404,
0.038291994482278824,
-0.07381480932235718,
0.05109306797385216,
0.007408895064145327,
0.008104355074465275,
-0.0123215913772583,
0.06919228285551071,
0.0840439647436142,
0.024619126692414284,
-0.07539722323417664,
-0.09629538655281067,
-0.07831717282533646,
0.012798832729458809,
-0.052661120891571045,
0.03503413498401642,
-0.028733238577842712,
-0.07680024951696396,
0.011168109253048897,
-0.06000922620296478,
0.03900284692645073,
0.01113472692668438,
0.04961491376161575,
-0.023627398535609245,
-0.04160508140921593,
0.2183012068271637,
0.053305041044950485,
-0.024740178138017654,
-0.04478305205702782,
0.015368682332336903,
0.06554563343524933,
0.06672332435846329,
-0.0167439803481102,
-0.021914390847086906,
-0.0017239702865481377,
-0.10164255648851395,
-0.04258255288004875,
-0.037702057510614395,
-0.011569986119866371,
0.013296965509653091,
-0.17540740966796875,
0.05263727530837059,
-0.18932609260082245,
-0.08519649505615234,
0.04988468810915947,
0.07284459471702576,
-0.03721005469560623,
-0.05782517418265343,
-0.033068228513002396,
-0.08359822630882263,
0.0002382809907430783,
-0.029682904481887817,
-0.02771083265542984,
-0.007891912944614887,
0.027747511863708496,
0.017889926210045815,
0.08386670798063278,
-0.20872385799884796,
-0.0000459565817436669,
-0.05282704532146454,
0.03359875828027725,
0.000369888060959056,
-0.020078497007489204,
-0.035880010575056076,
0.06733929365873337,
-0.02605586126446724,
-0.034953273832798004,
0.002750405576080084,
0.013086342252790928,
0.03446412459015846,
0.14425532519817352,
-0.08968811482191086,
0.010851923376321793,
0.15109898149967194,
-0.14874888956546783,
-0.19543278217315674,
0.07135780155658722,
0.04899449273943901,
0.08712498843669891,
0.04432891309261322,
0.09257686883211136,
0.06679224967956543,
-0.20273667573928833,
-0.025732791051268578,
0.014255285263061523,
-0.12937778234481812,
-0.2127988636493683,
-0.012886355631053448,
0.1467551440000534,
-0.07406410574913025,
0.018872052431106567,
-0.05301756039261818,
0.09361429512500763,
-0.09189784526824951,
-0.03678001090884209,
-0.04259318858385086,
-0.11765288561582565,
-0.04964281991124153,
0.016584472730755806,
0.022462595254182816,
-0.012487547472119331,
0.008428582921624184,
-0.10546960681676865,
0.037628304213285446,
-0.038519397377967834,
-0.013286382891237736,
-0.1338939070701599,
0.060500748455524445,
-0.0729559063911438,
0.0011856412747874856,
-0.029924161732196808,
-0.04882495477795601,
0.040938690304756165,
0.10233637690544128,
-0.02338225767016411,
0.14876630902290344,
0.06436092406511307,
0.0674581453204155,
-0.039356380701065063,
-0.07257719337940216,
0.08604034036397934,
0.018742335960268974,
-0.034314438700675964,
-0.15718044340610504,
0.05750052258372307,
-0.052634693682193756,
-0.06716471165418625,
-0.15767426788806915,
0.015700776129961014,
-0.002640298567712307,
0.1342693567276001,
0.049001242965459824,
-0.016796404495835304,
0.01986689679324627,
0.0133046954870224,
-0.05433836206793785,
0.007980777882039547,
0.07609548419713974,
0.023022210225462914,
-0.06327491998672485,
0.16409040987491608,
-0.112090103328228,
0.1752546727657318,
0.07683799415826797,
-0.07828789949417114,
-0.02162509225308895,
0.036731112748384476,
-0.07633884996175766,
-0.0010836926521733403,
0.017080575227737427,
0.0316767655313015,
-0.02371222898364067,
-0.04280245676636696,
0.10573452711105347,
-0.060934878885746,
-0.0019091704161837697,
0.08197563886642456,
-0.028625959530472755,
-0.04456326737999916,
0.09779389947652817,
0.07091173529624939,
-0.15011487901210785,
0.10919909924268723,
0.10397782176733017,
0.02526584081351757,
0.16080977022647858,
0.02145152911543846,
-0.0034535096492618322,
-0.06726561486721039,
0.07144371420145035,
0.010158699937164783,
0.23664385080337524,
-0.138417586684227,
0.023465238511562347,
0.01762186735868454,
-0.007711659651249647,
0.04339436814188957,
-0.07155801355838776,
-0.0773826390504837,
-0.018412204459309578,
-0.019391082227230072,
0.1212603747844696,
0.1060437262058258,
-0.13686567544937134,
0.08495908975601196,
-0.0826825499534607,
-0.09955857694149017,
0.034743230789899826,
-0.0008029124583117664,
-0.04037928581237793,
0.0766187533736229,
-0.039383310824632645,
-0.21738652884960175,
-0.12804459035396576,
-0.059567175805568695,
-0.052414290606975555,
-0.009953981265425682,
0.05362124741077423,
-0.03831266984343529,
-0.043185312300920486,
-0.08521207422018051,
-0.053733475506305695,
-0.04153679311275482,
0.016602924093604088,
0.041834332048892975,
0.028415044769644737,
-0.005569680593907833,
-0.054727014154195786,
0.023624466732144356,
-0.03761140629649162,
0.04078488424420357,
0.11509467661380768,
0.01629207469522953,
0.15636278688907623,
0.0857914537191391,
0.01532027032226324,
-0.013785208575427532,
0.02337701991200447,
0.22853857278823853,
-0.04555604234337807,
0.133409783244133,
0.18196681141853333,
0.05211963504552841,
0.0481005422770977,
0.1697607934474945,
0.03210997208952904,
-0.09033503383398056,
0.05427087843418121,
-0.06755650043487549,
-0.09434079378843307,
-0.1359962522983551,
-0.09406103938817978,
-0.05198023468255997,
0.13000060617923737,
-0.017074935138225555,
0.07161863148212433,
0.08501604944467545,
0.17000597715377808,
0.010587428696453571,
-0.012193786911666393,
-0.05627080798149109,
0.08705714344978333,
0.02345208078622818,
-0.043659087270498276,
0.04200233146548271,
-0.08765722066164017,
-0.05027260258793831,
0.09249050170183182,
-0.0007183354464359581,
0.12601891160011292,
0.029832417145371437,
0.01771499775350094,
0.09947788715362549,
0.16507743299007416,
0.13844692707061768,
0.10646598786115646,
-0.01954461634159088,
-0.06584855169057846,
-0.00915727112442255,
-0.06561286747455597,
0.07056154310703278,
0.05357314646244049,
-0.08942075818777084,
-0.05262741819024086,
0.05572754517197609,
0.05628778040409088,
-0.04261443763971329,
0.10928034782409668,
0.136261984705925,
-0.24565304815769196,
-0.005615840200334787,
0.005060131661593914,
0.05611736327409744,
-0.10251300036907196,
0.027272962033748627,
0.2595132887363434,
-0.006634474266320467,
0.05796762555837631,
-0.04293809458613396,
0.0813818946480751,
0.03022596798837185,
-0.005484783556312323,
-0.02724517695605755,
0.007217434234917164,
0.008715501986443996,
0.037139277905225754,
-0.1868356466293335,
0.19074830412864685,
-0.020336657762527466,
0.07537040114402771,
-0.00026847299886867404,
-0.05099516361951828,
-0.06792482733726501,
0.1665591299533844,
0.1853933185338974,
0.027945274487137794,
-0.01318118441849947,
-0.03457719087600708,
-0.11631736159324646,
0.04279443621635437,
0.058805201202631,
0.0467706173658371,
0.008620674721896648,
0.07363550364971161,
-0.03261810541152954,
-0.005290761590003967,
0.011596502736210823,
-0.23206207156181335,
-0.09412544965744019,
0.006055460311472416,
0.23384885489940643,
0.08718650043010712,
-0.026234138756990433,
0.026168731972575188,
-0.02167651243507862,
0.09939555823802948,
-0.16586819291114807,
-0.07527966797351837,
-0.07391319423913956,
-0.11007361859083176,
0.006008056458085775,
-0.04961295798420906,
0.02063286118209362,
-0.07375472784042358,
0.044411979615688324,
-0.029574012383818626,
-0.11241266131401062,
0.03317409008741379,
-0.15557889640331268,
-0.10274524241685867,
-0.09299639612436295,
0.046036574989557266,
0.05875813588500023,
-0.044157251715660095,
0.013260324485599995,
-0.055706024169921875,
-0.04999478533864021,
-0.11200230568647385,
0.03264354169368744,
0.059804219752550125,
-0.04735644906759262,
-0.06813506036996841,
-0.07359915226697922,
-0.024981539696455002,
-0.01043290738016367,
-0.06969194114208221,
0.05421007424592972,
0.24293744564056396,
-0.05961619317531586,
0.048200152814388275,
0.24124673008918762,
-0.045768558979034424,
-0.2420061081647873,
-0.1451549082994461,
-0.06946738064289093,
-0.026078736409544945,
-0.018387246876955032,
-0.11930374056100845,
0.1401301473379135,
0.007842647843062878,
-0.04671064391732216,
0.16003641486167908,
-0.24590837955474854,
-0.053832411766052246,
0.027897909283638,
0.14434607326984406,
0.30493295192718506,
-0.1322004199028015,
-0.03169286996126175,
-0.026665424928069115,
-0.14821766316890717,
0.1852724403142929,
-0.0006825161981396377,
0.059854160994291306,
-0.0736003890633583,
-0.007532117888331413,
-0.03380459547042847,
-0.04288988187909126,
0.07864999771118164,
-0.059922169893980026,
0.06826851516962051,
-0.0807000994682312,
0.08104415237903595,
0.1539168804883957,
-0.011892728507518768,
0.05775763466954231,
-0.09076359122991562,
0.05675102770328522,
-0.0785774514079094,
-0.020983101800084114,
-0.048850011080503464,
0.022226938977837563,
-0.0537874773144722,
-0.11179487407207489,
-0.05029766261577606,
0.011500676162540913,
-0.02196059376001358,
0.03425123542547226,
-0.013262336142361164,
0.0026994773652404547,
-0.0057783047668635845,
0.16176113486289978,
0.060076918452978134,
-0.08619851619005203,
-0.0003597265749704093,
-0.07281695306301117,
-0.054832544177770615,
0.12217709422111511,
-0.03303418681025505,
-0.039519499987363815,
0.10815545916557312,
-0.024712979793548584,
0.034062180668115616,
0.029920384287834167,
-0.03391003608703613,
0.04587092250585556,
0.11647241562604904,
-0.18356943130493164,
-0.17311324179172516,
-0.03515099361538887,
0.17197926342487335,
0.08231529593467712,
0.14098256826400757,
0.13521982729434967,
-0.10961488634347916,
0.03745998069643974,
-0.059392355382442474,
0.00901765562593937,
-0.03673064708709717,
0.036399997770786285,
0.007724581751972437,
0.0423094779253006,
-0.05557200685143471,
0.039739299565553665,
-0.03593205660581589,
-0.10187239944934845,
-0.027285700663924217,
0.04581436887383461,
-0.12164057046175003,
-0.09542503952980042,
0.02807346172630787,
0.12127772718667984,
-0.1349448710680008,
-0.096823550760746,
-0.02348034456372261,
-0.08868377655744553,
0.0464915856719017,
0.08285897225141525,
0.005681286565959454,
0.022468240931630135,
0.046070683747529984,
-0.016534531489014626,
-0.07008600234985352,
0.023861980065703392,
-0.0058566248044371605,
0.09970586746931076,
-0.24416162073612213,
-0.10759032517671585,
-0.006650220137089491,
0.046654414385557175,
-0.08625170588493347,
-0.006588581018149853,
-0.09256471693515778,
0.019917819648981094,
-0.03859264403581619,
0.04998733848333359,
-0.11174512654542923,
-0.06653449684381485,
-0.04188406839966774,
-0.01968485303223133,
-0.061171673238277435,
0.027726804837584496,
-0.019117305055260658,
0.0598924346268177,
0.037497151643037796,
-0.005855764728039503,
-0.04212654381990433,
-0.003882036078721285,
-0.01298617199063301,
-0.04604185000061989,
0.09194720536470413,
-0.03924256190657616,
-0.11250746250152588,
-0.03489355370402336,
-0.2129674255847931,
0.034864019602537155,
0.08089432865381241,
0.014428121969103813,
0.013143236748874187,
0.09256120026111603,
-0.0048539359122514725,
0.04568987712264061,
0.041502855718135834,
-0.024070516228675842,
0.030466420575976372,
-0.09406455606222153,
-0.006250554230064154,
-0.03154120594263077,
-0.01437333133071661,
-0.07412537932395935,
-0.02626132220029831,
0.07795652747154236,
0.040230732411146164,
0.13031627237796783,
-0.09138566255569458,
0.03194605931639671,
-0.04013020172715187,
0.04126081243157387,
0.10275278240442276,
-0.06089731678366661,
0.033390309661626816,
-0.042401280254125595,
-0.01395183801651001,
0.002681571524590254,
0.10455423593521118,
-0.06056913360953331,
-0.2528519034385681,
-0.008818486705422401,
-0.1521756947040558,
-0.06773695349693298,
-0.010748994536697865,
0.26587796211242676,
0.026120806112885475,
0.0004785059136338532,
-0.12262386083602905,
0.0625251904129982,
0.05629582330584526,
0.12759605050086975,
0.009935613721609116,
0.10644655674695969,
-0.005054286681115627,
0.08665446937084198,
0.028257107362151146,
-0.00037590067950077355,
-0.051675911992788315,
-0.009543636813759804,
-0.16911830008029938,
0.1247914507985115,
-0.027850288897752762,
0.06766233593225479,
0.13961170613765717,
0.012519849464297295,
-0.030396590009331703,
0.057462528347969055,
-0.04320257529616356,
-0.05139964446425438,
-0.20872774720191956,
-0.06607763469219208,
-0.13241124153137207,
0.0227535180747509,
-0.05949188396334648,
-0.03930259123444557,
-0.03280400112271309,
0.06375259160995483,
-0.08165311813354492,
0.07925871014595032,
0.08745131641626358,
0.009416592307388783,
0.09234175086021423,
0.01315437350422144,
-0.03342242166399956,
0.04657487943768501,
0.042375437915325165,
-0.022216368466615677,
0.005289389751851559,
-0.019380245357751846,
0.07885387539863586,
-0.049007538706064224,
0.06925986707210541,
0.054433900862932205,
-0.06054844707250595,
-0.03573547303676605,
-0.002082508523017168,
0.010237067006528378,
0.11398833245038986,
0.006794605869799852,
-0.007616558577865362,
0.01266671996563673,
0.11994814872741699,
0.0017543466528877616,
-0.033868782222270966,
-0.06262896955013275,
0.06365196406841278,
-0.1070241630077362,
0.07366114109754562,
-0.04035435616970062,
0.0007286263280548155,
-0.057632267475128174,
0.2824850380420685,
0.1499680131673813,
-0.06412573903799057,
0.010266291908919811,
-0.05235733836889267,
0.0163442250341177,
-0.06932087242603302,
0.10378265380859375,
0.03987923637032509,
0.2581128478050232,
-0.052315957844257355,
-0.02143535390496254,
-0.1147557869553566,
-0.028753649443387985,
-0.0725182294845581,
-0.10504129528999329,
0.013036221265792847,
-0.051870521157979965,
-0.09778575599193573,
0.08253772556781769,
-0.22863660752773285,
0.0058942013420164585,
0.1207369789481163,
0.0005368473357520998,
0.005102359689772129,
-0.03692008927464485,
0.14052487909793854,
0.023857085034251213,
0.04052290692925453,
-0.09525412321090698,
0.03393125906586647,
0.0744316503405571,
-0.0041800192557275295,
-0.07888906449079514,
0.05892308056354523,
0.006199117749929428,
-0.1569206267595291,
0.15503498911857605,
-0.008687681518495083,
0.03507806360721588,
0.07656798511743546,
-0.06313502788543701,
-0.14737942814826965,
0.10971690714359283,
-0.029995864257216454,
-0.08293823897838593,
-0.023617202416062355,
0.09435401856899261,
0.01113817933946848,
0.01437664870172739,
-0.0167853981256485,
-0.08437088876962662,
-0.046375125646591187,
0.0946219190955162,
0.04915909841656685,
-0.09129412472248077,
0.0849570780992508,
-0.026369374245405197,
0.10117235034704208,
-0.01636245846748352,
-0.05128026008605957,
-0.04626542329788208,
-0.03599807620048523,
0.05957769975066185,
-0.005752177909016609,
-0.05403793603181839,
0.06929927319288254,
-0.1482475847005844,
-0.03377096354961395,
0.05478638783097267,
0.06625432521104813,
-0.1551842987537384,
0.022491469979286194,
-0.16003556549549103,
0.0006616272148676217,
-0.057777803391218185,
0.004752124659717083,
0.21311724185943604,
0.017050419002771378,
-0.0001298617135034874,
-0.03579853102564812,
-0.015484350733458996,
0.035088490694761276,
-0.015008823946118355,
-0.1373949497938156
] |
null | null | stable-baselines3 |
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal sketch of downloading the trained policy from the Hub and loading it (the checkpoint filename is an assumption, since the exact name is not documented in this card):
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# The filename is an assumption about how the checkpoint zip was uploaded.
checkpoint = load_from_hub(repo_id="QMMMS/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
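With the policy loaded as above, it can be evaluated locally. A short sketch, assuming `gymnasium[box2d]` is installed; for comparison, this card's metadata reports a mean reward of 267.49 +/- 21.18:
```python
import gymnasium as gym
from stable_baselines3.common.evaluation import evaluate_policy

# `model` is the PPO policy loaded in the snippet above.
env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```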
| {"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "267.49 +/- 21.18", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | QMMMS/ppo-LunarLander-v2 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-12T12:57:58+00:00 | [] | [] | TAGS
#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# PPO Agent playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2
using the stable-baselines3 library.
## Usage (with Stable-baselines3)
TODO: Add your code
| [
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
"TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
39,
41,
17
] | [
"passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
0.03942384943366051,
0.04900386184453964,
-0.005304091144353151,
0.026427261531352997,
0.107408307492733,
-0.026511888951063156,
0.11188238859176636,
0.0814051404595375,
0.10722193866968155,
0.04762078449130058,
0.08338645845651627,
0.06030960753560066,
0.05080918222665787,
0.2571701407432556,
0.04754156619310379,
-0.22987541556358337,
0.036159250885248184,
-0.04869936779141426,
0.12395193427801132,
0.07178173214197159,
-0.0038484656251966953,
-0.06485428661108017,
0.020415637642145157,
-0.013290755450725555,
0.05367108806967735,
0.04282612353563309,
-0.01716216839849949,
-0.08207534998655319,
0.07169748842716217,
-0.06345846503973007,
0.06986866891384125,
0.07677983492612839,
0.13218913972377777,
-0.17832116782665253,
0.029566360637545586,
0.02571309357881546,
-0.07189024239778519,
0.01342033501714468,
0.008019951172173023,
0.05120139941573143,
0.17303818464279175,
0.019879888743162155,
0.07844575494527817,
-0.0025605305563658476,
-0.15412317216396332,
-0.018950799480080605,
0.0436202734708786,
0.12546207010746002,
0.08808347582817078,
0.04605821147561073,
0.01970590092241764,
0.17503218352794647,
-0.054352790117263794,
-0.028833400458097458,
0.21759237349033356,
-0.2881564497947693,
-0.031460098922252655,
0.321048766374588,
0.06997483223676682,
0.09725230932235718,
-0.07540661096572876,
-0.03619609400629997,
0.007783263456076384,
-0.013137873262166977,
-0.028666524216532707,
-0.07447073608636856,
0.17313385009765625,
0.05152064561843872,
-0.05057951435446739,
-0.09541505575180054,
0.16948209702968597,
0.006921638268977404,
0.0018855923553928733,
-0.019282981753349304,
0.009060598909854889,
0.07402525842189789,
-0.016097044572234154,
-0.07255112379789352,
0.057438433170318604,
0.05330665782094002,
0.019649166613817215,
-0.1435653269290924,
-0.10762494057416916,
-0.022740179672837257,
-0.008012006990611553,
0.17786912620067596,
-0.009255532175302505,
0.042902372777462006,
0.003065188182517886,
0.10384012013673782,
-0.12480384111404419,
-0.03354184702038765,
-0.0454259067773819,
-0.07565800100564957,
-0.0223417766392231,
-0.02058211714029312,
-0.03580251708626747,
0.07184842973947525,
0.11971849203109741,
0.027368178591132164,
0.09350208193063736,
0.047715865075588226,
-0.03206788748502731,
0.06343851238489151,
0.05555703118443489,
0.14222665131092072,
0.05807621404528618,
0.012854371219873428,
0.13179877400398254,
0.055213116109371185,
0.033023182302713394,
-0.0613492950797081,
-0.18252409994602203,
0.07489913702011108,
-0.07031869143247604,
0.007941240444779396,
0.12051256000995636,
-0.04480670019984245,
-0.1183447614312172,
-0.037500523030757904,
-0.017392054200172424,
-0.06224250793457031,
-0.025395862758159637,
0.0547584593296051,
-0.02883218228816986,
-0.03973718360066414,
0.0011496668448671699,
0.09384800493717194,
0.00953749567270279,
-0.1752052903175354,
0.03303423151373863,
-0.025042934343218803,
-0.10782608389854431,
0.009975161403417587,
0.0022444494534283876,
0.03394931182265282,
0.04408763721585274,
-0.11822668462991714,
-0.30899152159690857,
-0.07652641832828522,
0.05490870401263237,
-0.06516939401626587,
-0.18425025045871735,
-0.13193942606449127,
0.02454492449760437,
-0.09037084132432938,
-0.044885024428367615,
-0.12759265303611755,
-0.028549788519740105,
0.01743689924478531,
0.011519349180161953,
0.10758619755506516,
-0.0106219332665205,
-0.012188062071800232,
-0.1571401208639145,
0.008273907005786896,
-0.20951123535633087,
0.0890483483672142,
-0.019150104373693466,
0.037884220480918884,
-0.032381169497966766,
-0.07404014468193054,
0.030707746744155884,
0.052499737590551376,
-0.01474119070917368,
0.13510210812091827,
-0.15592676401138306,
-0.03691192343831062,
-0.007996266707777977,
-0.13611900806427002,
-0.04786273464560509,
-0.10358831286430359,
-0.04357128217816353,
0.13354332745075226,
0.018664736300706863,
0.15356586873531342,
-0.08709818124771118,
-0.0722038671374321,
0.20489206910133362,
-0.010411538183689117,
-0.12820468842983246,
-0.076752208173275,
0.10165707021951675,
0.021510310471057892,
-0.056606587022542953,
-0.02523270808160305,
-0.1839766949415207,
-0.0152357779443264,
-0.04550420492887497,
-0.047039128839969635,
0.01796751655638218,
-0.010888241231441498,
0.13837894797325134,
0.08494598418474197,
0.05018039792776108,
-0.06086122244596481,
-0.006730288732796907,
0.10779471695423126,
0.08823856711387634,
0.008680110797286034,
0.023406028747558594,
-0.05774238705635071,
0.09552932530641556,
-0.04003755748271942,
-0.0142367510125041,
-0.08283266425132751,
-0.036246106028556824,
-0.026256313547492027,
0.17507147789001465,
0.09440762549638748,
0.2257927656173706,
0.09567736834287643,
0.039160262793302536,
0.031270865350961685,
-0.13181598484516144,
-0.1425403207540512,
-0.0017254541162401438,
0.09020978957414627,
-0.14270411431789398,
-0.04119925573468208,
-0.08974775671958923,
-0.17768175899982452,
-0.12202505767345428,
0.0006432619411498308,
-0.17960017919540405,
0.06390921026468277,
0.05408334732055664,
-0.035177867859601974,
0.03272094577550888,
0.13032332062721252,
-0.011533179320394993,
-0.03967514634132385,
0.0831870287656784,
0.0379033200442791,
-0.041234664618968964,
-0.021742934361100197,
0.11885567009449005,
0.15673065185546875,
0.13124459981918335,
-0.03511447086930275,
0.004914294462651014,
0.07076404243707657,
-0.02309088408946991,
0.06539414077997208,
0.0558244064450264,
0.20973342657089233,
0.188301220536232,
0.038996949791908264,
0.008822928182780743,
-0.07048165798187256,
0.0855446457862854,
-0.0742373839020729,
-0.14302679896354675,
-0.05579735338687897,
0.08729292452335358,
0.016605578362941742,
0.023469142615795135,
0.08711627870798111,
0.024545932188630104,
0.09132762253284454,
0.15968108177185059,
0.01990218088030815,
-0.09659269452095032,
-0.050218869000673294,
0.01175848301500082,
0.027713103219866753,
0.04794301092624664,
-0.04514073207974434,
-0.00937939714640379,
0.017020760104060173,
-0.10303554683923721,
0.031789086759090424,
-0.1413339376449585,
-0.1358717679977417,
0.044326696544885635,
0.003906996920704842,
0.010907664895057678,
0.02786896750330925,
-0.0038291432429105043,
0.019039705395698547,
0.04351753741502762,
-0.06975466758012772,
0.047416772693395615,
-0.024745507165789604,
-0.020031947642564774,
0.03340689837932587,
-0.057257164269685745,
-0.205775648355484,
-0.17696654796600342,
0.00013708483311347663,
-0.09910997003316879,
0.10194740444421768,
0.018308809027075768,
-0.12373185902833939,
0.047737859189510345,
-0.05822649225592613,
0.027574289590120316,
-0.01875593699514866,
-0.049130141735076904,
0.10507171601057053,
0.1525275856256485,
-0.016146350651979446,
0.018018173053860664,
-0.04865182936191559,
-0.10157987475395203,
-0.19632206857204437,
0.0691583976149559,
0.04680244252085686,
0.014610917307436466,
0.10669491440057755,
0.018072687089443207,
0.02367905154824257,
-0.007674071006476879,
-0.016521066427230835,
-0.011659215204417706,
-0.08781040459871292,
0.31909599900245667,
0.04510033503174782,
-0.025173069909214973,
0.02041010931134224,
-0.0043001663871109486,
-0.028083480894565582,
0.03263787180185318,
-0.0985708013176918,
-0.07548979669809341,
-0.08774089068174362,
-0.04367410019040108,
-0.09784720093011856,
0.053299110382795334,
0.05916472524404526,
0.003188040340319276,
-0.07727594673633575,
0.04221395403146744,
0.11369874328374863,
-0.0923808291554451,
-0.07137343287467957,
0.07477962225675583,
0.0972946360707283,
-0.07331304252147675,
0.00012658814375754446,
0.00874367356300354,
0.023951783776283264,
0.037102166563272476,
0.06778035312891006,
-0.03966575115919113,
0.08589404821395874,
-0.19917890429496765,
0.0372927263379097,
0.106058269739151,
0.023754918947815895,
0.0638108178973198,
0.07643651217222214,
-0.1058402881026268,
-0.008500572293996811,
-0.032518330961465836,
-0.21341575682163239,
0.1668180525302887,
0.1355515867471695,
0.06788124144077301,
-0.025637222453951836,
-0.00461410591378808,
-0.0649740919470787,
0.05773647129535675,
0.02723747305572033,
-0.14758841693401337,
0.004883295856416225,
0.06064270809292793,
0.026899009943008423,
0.01614922471344471,
0.07971042394638062,
0.014697225764393806,
-0.1801026314496994,
-0.014406266622245312,
0.10730406641960144,
0.002390873385593295,
0.0053148469887673855,
-0.03175045922398567,
-0.1755964607000351,
0.0751047357916832,
0.004285442177206278,
0.07233936339616776,
-0.1676585078239441,
0.14297930896282196,
-0.10089799761772156,
0.07726949453353882,
-0.004285062663257122,
-0.021311495453119278,
0.02507244050502777,
-0.0541163794696331,
0.15163759887218475,
0.01058570109307766,
-0.021810131147503853,
-0.1200498715043068,
-0.1717042326927185,
-0.019227758049964905,
-0.11788936704397202,
-0.11679866164922714,
0.050424277782440186,
0.062185097485780716,
0.04923136904835701,
-0.061147067695856094,
0.1518532931804657,
-0.047422297298908234,
0.060713399201631546,
-0.06893875449895859,
-0.06755045056343079,
0.03764858841896057,
-0.12588608264923096,
-0.08176055550575256,
0.05573027580976486,
0.19166934490203857,
0.15833087265491486,
-0.02816431224346161,
-0.03472423925995827,
-0.047419581562280655,
-0.006212298292666674,
-0.007802055217325687,
0.0275666993111372,
0.023223137483000755,
0.07315318286418915,
-0.07681374251842499,
-0.11649256944656372,
0.033787861466407776,
-0.06713802367448807,
-0.055589709430933,
-0.015439179725944996,
0.1513158082962036,
0.04671623185276985,
0.07720734924077988,
-0.018946662545204163,
0.03887668624520302,
-0.001724981120787561,
-0.056474871933460236,
0.16197094321250916,
0.03885216265916824,
-0.05193585529923439,
0.06837689876556396,
0.053174007683992386,
0.043745119124650955,
0.03011113777756691,
-0.026783017441630363,
0.206032395362854,
0.1980147808790207,
0.014206883497536182,
0.2175983190536499,
0.03177616000175476,
-0.03772832080721855,
-0.1300560086965561,
-0.065880686044693,
-0.006372632458806038,
0.03559038043022156,
0.08070417493581772,
-0.18207235634326935,
-0.015011128038167953,
-0.05689644813537598,
-0.034518610686063766,
-0.15059494972229004,
-0.28553900122642517,
-0.05957856774330139,
0.20075850188732147,
0.14706264436244965,
0.27519428730010986,
-0.10432573407888412,
0.035197313874959946,
0.02663275972008705,
-0.04912831634283066,
-0.006501141935586929,
0.00018665487004909664,
0.10268618166446686,
-0.15421873331069946,
0.1176437959074974,
0.08486983180046082,
-0.019002694636583328,
0.01058861706405878,
-0.1619086116552353,
0.00936629343777895,
-0.12191236019134521,
0.05354422330856323,
0.1400289237499237,
-0.048128653317689896,
-0.054873593151569366,
0.14033560454845428,
-0.024562934413552284,
-0.22685599327087402,
-0.04648222774267197,
-0.043600670993328094,
-0.010640020482242107,
0.026607351377606392,
-0.1013401448726654,
0.04101909324526787,
0.1330099105834961,
0.009380043484270573,
0.1147187277674675,
0.11749245226383209,
-0.052566803991794586,
0.10792597383260727,
0.2257719188928604,
-0.018785694614052773,
0.04689010605216026,
-0.12743118405342102,
-0.0012336712097749114,
-0.028270328417420387,
0.013657891191542149,
-0.09504974633455276,
-0.09938385337591171,
0.02366873063147068,
0.02872389927506447,
0.009118586778640747,
0.0921793207526207,
-0.029922157526016235,
0.0759170651435852,
0.06817561388015747,
-0.13014446198940277,
-0.16288450360298157,
0.015828335657715797,
-0.007344507612287998,
0.08354310691356659,
0.00027861111448146403,
0.08878035843372345,
-0.11932205408811569,
-0.018093237653374672,
-0.03153328225016594,
-0.03319635987281799,
-0.130486860871315,
-0.07138993591070175,
0.06156524643301964,
0.028095467016100883,
-0.06602972000837326,
0.1398407518863678,
0.026440169662237167,
0.15942534804344177,
0.049197953194379807,
0.012499804608523846,
0.07227300107479095,
-0.05345509201288223,
0.1283530443906784,
0.13818155229091644,
-0.00868943240493536,
-0.05460423603653908,
-0.1013643890619278,
-0.10236792266368866,
0.08925779908895493,
-0.05773641914129257,
0.07476430386304855,
-0.14885357022285461,
-0.06675903499126434,
0.015772046521306038,
0.016141414642333984,
-0.09562095999717712,
0.02571965754032135,
-0.01625603251159191,
-0.18119946122169495,
0.056570518761873245,
-0.048285093158483505,
0.0440407395362854,
-0.06347788125276566,
-0.1110161691904068,
-0.17226378619670868,
0.06091433763504028,
0.08593481779098511,
-0.053876690566539764,
-0.12229149043560028,
0.011023230850696564,
-0.00012518465518951416,
-0.06341652572154999,
-0.05023367330431938,
0.09722746908664703,
-0.11020902544260025,
0.031452205032110214,
-0.012567701749503613,
0.08853451162576675,
-0.03510405123233795,
-0.011538895778357983,
0.044220831245183945,
-0.08039166033267975,
-0.009481523185968399,
0.03534642979502678,
-0.026372017338871956,
-0.04127239063382149,
-0.2689029574394226,
0.0036654395516961813,
0.0341104120016098,
0.02497158572077751,
0.07856601476669312,
0.011906822212040424,
0.021174922585487366,
0.03993808850646019,
-0.15396519005298615,
-0.013395369984209538,
0.14574195444583893,
-0.07689505815505981,
-0.022186370566487312,
0.05703273415565491,
-0.09054436534643173,
0.013882770203053951,
-0.030287226662039757,
0.1345842480659485,
0.023923413828015327,
0.06404478847980499,
-0.0851147472858429,
0.10106813907623291,
-0.1451139897108078,
-0.04998219385743141,
-0.01244612317532301,
0.09761348366737366,
0.07019034773111343,
-0.10272270441055298,
0.014697125181555748,
0.04210108891129494,
0.19416837394237518,
0.016384804621338844,
-0.0356343574821949,
-0.03396720811724663,
0.004015897400677204,
0.22076453268527985,
0.03044266067445278,
0.10457023978233337,
0.07281364500522614,
-0.026583973318338394,
0.12624378502368927,
0.09929762035608292,
0.11280370503664017,
-0.055645186454057693,
0.13904185593128204,
0.04667386785149574,
0.038641396909952164,
0.0614289753139019,
0.06836545467376709,
0.09098632633686066,
-0.0008288522367365658,
0.1138714924454689,
0.013811973854899406,
-0.02422109805047512,
-0.021335409954190254,
0.17759373784065247,
0.10501719266176224,
-0.14769648015499115,
0.029047364369034767,
-0.01258957851678133,
0.039933037012815475,
-0.014194529503583908,
-0.15634691715240479,
-0.07240267097949982,
-0.3315149247646332,
0.1226184144616127,
-0.07119352370500565,
0.019930170848965645,
0.007913772016763687,
-0.037425633519887924,
-0.03296699747443199,
-0.04477746784687042,
0.13151589035987854,
-0.013641550205647945,
-0.006079165264964104,
-0.04815853759646416,
-0.015360191464424133,
-0.11607866734266281,
-0.11200575530529022,
-0.013207737356424332,
-0.13671602308750153,
-0.010119039565324783,
0.05595948174595833,
0.003977729007601738,
0.01821410097181797,
-0.03142618387937546,
0.0024383175186812878,
0.06541839241981506,
-0.05751744285225868,
0.056182678788900375,
0.12097269296646118,
0.08766137808561325,
-0.1058853268623352,
0.031048951670527458,
0.2011747509241104,
0.04359564557671547,
-0.12483977526426315,
0.01449228823184967,
0.1819491684436798,
0.004885740112513304,
0.017068125307559967,
-0.006097703706473112,
-0.0540788508951664,
-0.07554277032613754,
0.1251034289598465,
0.08296554535627365,
-0.09985227137804031,
0.015833314508199692,
-0.0726347416639328,
-0.01594804972410202,
-0.06374675035476685,
0.10130585730075836,
0.09538925439119339,
0.04440245032310486,
-0.10621760785579681,
-0.08487539738416672,
-0.10891728103160858,
0.040588874369859695,
-0.08629853278398514,
-0.07311757653951645,
0.09629398584365845,
-0.07057105004787445,
-0.07029950618743896,
0.025521177798509598,
-0.17978744208812714,
-0.009467960335314274,
0.1711762249469757,
-0.24654000997543335,
-0.0916430801153183,
-0.10857923328876495,
0.14477859437465668,
0.016497576609253883,
0.1013975441455841,
-0.006207061931490898,
-0.007889035157859325,
-0.20577777922153473,
0.024890204891562462,
-0.05293011665344238,
-0.02073732763528824,
0.07814782857894897,
-0.09476397186517715,
0.22629831731319427,
-0.08276885002851486,
0.020940175279974937,
0.012659613974392414,
0.0870661810040474,
-0.030675338581204414,
0.09283176809549332,
-0.03660329803824425,
-0.12576518952846527,
-0.03620953485369682,
0.03001813031733036,
0.013904244638979435,
0.10071761906147003,
0.09772487729787827,
-0.03414725139737129,
0.03389119729399681,
0.09747414290904999,
0.04172342270612717,
-0.023843804374337196,
0.0360250361263752,
-0.17077107727527618,
0.02182629331946373,
-0.018498148769140244,
-0.06935930997133255,
0.03687669709324837,
-0.06603235751390457,
0.1639697551727295,
0.04022442549467087,
0.0670473501086235,
-0.036152735352516174,
0.0073931049555540085,
-0.014454689808189869,
-0.013775371946394444,
-0.026180334389209747,
-0.17259705066680908,
-0.10422050207853317,
-0.1347656100988388,
-0.012701659463346004,
-0.034971047192811966,
0.04591470584273338,
0.023234914988279343,
-0.0003200018545612693,
-0.014577031135559082,
-0.12090865522623062,
0.04360328987240791,
0.11146783083677292,
-0.04631396010518074,
-0.026193076744675636
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
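The card leaves this section unfilled. Purely as an illustration, below is a minimal sketch that assumes the checkpoint `cxbn12/dummy-model` (the repository id from this card's metadata) works with the standard 🤗 Transformers fill-mask `pipeline`; it is not usage code provided by the model author.

```python
# Illustrative only: the card does not specify official usage code.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="cxbn12/dummy-model")

# CamemBERT-style tokenizers use "<mask>"; read it from the tokenizer to be safe.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"Le camembert est {mask} !"):
    print(prediction["token_str"], round(prediction["score"], 3))
```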
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | fill-mask | cxbn12/dummy-model | [
"transformers",
"safetensors",
"camembert",
"fill-mask",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:58:20+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #camembert #fill-mask #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #camembert #fill-mask #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
48,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #camembert #fill-mask #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.07371913641691208,
0.15016672015190125,
-0.0038328575901687145,
0.021959224715828896,
0.11421514302492142,
0.01104127150028944,
0.07501126080751419,
0.10840724408626556,
-0.01738830842077732,
0.12608130276203156,
0.04254257678985596,
0.09817446768283844,
0.1138492301106453,
0.199096217751503,
0.0008185468032024801,
-0.20414641499519348,
0.06494008004665375,
-0.11679922789335251,
0.013512792997062206,
0.12357870489358902,
0.14277078211307526,
-0.10818105190992355,
0.06827287375926971,
-0.03530823811888695,
-0.023129651322960854,
-0.03467816859483719,
-0.06037485599517822,
-0.057233963161706924,
0.0652119442820549,
0.05637726932764053,
0.07027599215507507,
0.021783530712127686,
0.07911419868469238,
-0.2862502336502075,
0.020211070775985718,
0.07808363437652588,
0.003033190034329891,
0.06061071529984474,
0.07327340543270111,
-0.07740600407123566,
0.10405948758125305,
-0.058714453130960464,
0.15687522292137146,
0.07615838944911957,
-0.09984909743070602,
-0.18920426070690155,
-0.08388586342334747,
0.0944560244679451,
0.16961683332920074,
0.05770792067050934,
-0.037779927253723145,
0.14441505074501038,
-0.07788044959306717,
0.01635688915848732,
0.06620251387357712,
-0.07188857346773148,
-0.05398255214095116,
0.05330328270792961,
0.07410529255867004,
0.08480940014123917,
-0.13030220568180084,
-0.006425938103348017,
0.041556499898433685,
0.018644291907548904,
0.11218167841434479,
0.024553505703806877,
0.1331346035003662,
0.027172230184078217,
-0.1446155458688736,
-0.06490302830934525,
0.11010438948869705,
0.03676534444093704,
-0.060171596705913544,
-0.24540463089942932,
-0.006397032644599676,
-0.034024205058813095,
-0.02931126020848751,
-0.04043516144156456,
0.038742128759622574,
-0.02976202964782715,
0.08796697109937668,
0.004780875518918037,
-0.0694337710738182,
-0.0539354644715786,
0.08643266558647156,
0.06526253372430801,
0.02730497345328331,
-0.023766931146383286,
0.019028067588806152,
0.11778896301984787,
0.09524030983448029,
-0.11508234590291977,
-0.06472436338663101,
-0.06429606676101685,
-0.09421957284212112,
-0.046126365661621094,
0.03410613536834717,
0.06407878547906876,
0.04772079735994339,
0.20187009871006012,
0.010245309211313725,
0.04911359027028084,
0.03179587423801422,
0.018154004588723183,
0.06545849144458771,
0.06809286773204803,
-0.05246766656637192,
-0.12502087652683258,
-0.03880379721522331,
0.11529652029275894,
0.005272938869893551,
-0.034458499401807785,
-0.03499239310622215,
0.06193375214934349,
0.04468022659420967,
0.12100028246641159,
0.07017656415700912,
0.018671272322535515,
-0.07616458088159561,
-0.0456383191049099,
0.17992514371871948,
-0.15802793204784393,
0.021564429625868797,
0.015028289519250393,
-0.05151410773396492,
-0.035399794578552246,
0.018893880769610405,
0.008805080316960812,
-0.027893956750631332,
0.09908682852983475,
-0.06648943573236465,
-0.042881403118371964,
-0.10964479297399521,
-0.05655944347381592,
0.03227860853075981,
-0.025404054671525955,
-0.030388256534934044,
-0.04134857654571533,
-0.12968648970127106,
-0.07233735918998718,
0.07253419607877731,
-0.06468013674020767,
-0.06364650279283524,
-0.034243132919073105,
-0.06116553023457527,
0.015976591035723686,
0.001876375055871904,
0.13074848055839539,
-0.03097095526754856,
0.04830831661820412,
-0.051102057099342346,
0.07315338402986526,
0.13389816880226135,
0.03265800699591637,
-0.06522523611783981,
0.0667540580034256,
-0.2157580554485321,
0.10513640940189362,
-0.09177391976118088,
0.025229651480913162,
-0.1617206633090973,
-0.023555533960461617,
0.02337227389216423,
0.03702010586857796,
-0.014575046487152576,
0.14183253049850464,
-0.17782239615917206,
-0.037419240921735764,
0.19155152142047882,
-0.1289752572774887,
-0.09149885177612305,
0.06570210307836533,
-0.06176960468292236,
0.13090457022190094,
0.05110245943069458,
-0.024868300184607506,
0.05143950879573822,
-0.14239759743213654,
-0.020357230678200722,
-0.06019468232989311,
-0.014655063860118389,
0.1511567384004593,
0.06672269850969315,
-0.05394526571035385,
0.026362063363194466,
0.018959322944283485,
-0.022142108529806137,
-0.04549313336610794,
-0.035079773515462875,
-0.09853461384773254,
0.0056659625843167305,
-0.07887473702430725,
0.027319423854351044,
-0.02569001540541649,
-0.09047041833400726,
-0.04260211065411568,
-0.16159464418888092,
-0.0030133621767163277,
0.09794466942548752,
0.004464977886527777,
-0.029699385166168213,
-0.10171101987361908,
0.006227563600987196,
0.012603274546563625,
-0.009534215554594994,
-0.15087886154651642,
-0.055134519934654236,
0.023140477016568184,
-0.1731116622686386,
0.027628501877188683,
-0.04883555322885513,
0.036076854914426804,
0.04321683943271637,
-0.0464976541697979,
-0.02788419835269451,
0.013179686851799488,
0.018177764490246773,
-0.020932741463184357,
-0.25023549795150757,
-0.016578499227762222,
-0.050916917622089386,
0.18402138352394104,
-0.2457282841205597,
0.04974381625652313,
0.06207958236336708,
0.11928238719701767,
0.005071321502327919,
-0.04598625376820564,
0.038152433931827545,
-0.05267506465315819,
-0.038376711308956146,
-0.06653018295764923,
-0.003498279955238104,
-0.03353290632367134,
-0.049216046929359436,
0.04259004816412926,
-0.18427212536334991,
-0.028931014239788055,
0.11642012745141983,
0.07213902473449707,
-0.17119362950325012,
-0.0672503337264061,
-0.03523210808634758,
-0.05936194211244583,
-0.08785484731197357,
-0.055018261075019836,
0.09137790650129318,
0.04488954693078995,
0.05281013995409012,
-0.06955815106630325,
-0.05582066997885704,
0.018636789172887802,
-0.011962179094552994,
-0.032943833619356155,
0.08403272181749344,
0.0782623440027237,
-0.1201024055480957,
0.10603377223014832,
0.07190712541341782,
0.0666121393442154,
0.10566576570272446,
0.00849581602960825,
-0.09741519391536713,
-0.015489505603909492,
0.027061212807893753,
0.015399227850139141,
0.15160807967185974,
-0.07470542937517166,
0.03403806313872337,
0.04539733752608299,
-0.02878260798752308,
0.010284570045769215,
-0.10222785919904709,
0.018191754817962646,
0.03279995173215866,
-0.010358961299061775,
0.011486727744340897,
-0.04990274831652641,
0.01569819450378418,
0.10489126294851303,
0.035020604729652405,
0.0300652626901865,
0.018987147137522697,
-0.041449807584285736,
-0.12725664675235748,
0.177490234375,
-0.09366269409656525,
-0.25720953941345215,
-0.13012859225273132,
-0.007910270243883133,
0.044674649834632874,
-0.012968363240361214,
0.01963118650019169,
-0.056077007204294205,
-0.10966496169567108,
-0.10300976783037186,
0.027240756899118423,
0.05546927452087402,
-0.08336570858955383,
-0.06409040093421936,
0.04906666651368141,
0.04101676493883133,
-0.1223200336098671,
0.018897203728556633,
0.044678498059511185,
-0.06908224523067474,
0.01094250287860632,
0.05612223967909813,
0.08538828790187836,
0.18244652450084686,
0.009074121713638306,
-0.015549948439002037,
0.009165075607597828,
0.21726678311824799,
-0.15085577964782715,
0.09314005821943283,
0.1427789032459259,
-0.06266073137521744,
0.08362581580877304,
0.2021656185388565,
0.029309332370758057,
-0.09724124521017075,
0.038436971604824066,
0.03606608882546425,
-0.03997630253434181,
-0.24201864004135132,
-0.07739612460136414,
-0.0008780949865467846,
-0.06965447962284088,
0.10162385553121567,
0.08712173253297806,
0.11680830270051956,
0.05148936063051224,
-0.11143417656421661,
-0.06938411295413971,
0.0482625775039196,
0.12080063670873642,
-0.031788170337677,
0.0013731889193877578,
0.09863253682851791,
-0.02819245494902134,
0.021711504086852074,
0.09146450459957123,
0.01600269228219986,
0.18734489381313324,
0.04614405706524849,
0.13374663889408112,
0.09305742383003235,
0.06553691625595093,
0.019125180318951607,
0.020628999918699265,
0.023993849754333496,
0.0272738765925169,
-0.02170303277671337,
-0.08313114196062088,
-0.007017331663519144,
0.14035393297672272,
0.035235244780778885,
0.037257660180330276,
0.0019501916831359267,
-0.04574238508939743,
0.07132025063037872,
0.17276623845100403,
0.017445886507630348,
-0.23019763827323914,
-0.06521078944206238,
0.07371465116739273,
-0.06897614896297455,
-0.1169905811548233,
-0.0173257514834404,
0.02386533096432686,
-0.1834314912557602,
0.045889999717473984,
-0.02516929619014263,
0.10181453824043274,
-0.10305027663707733,
-0.02251409739255905,
0.03795233741402626,
0.06367214769124985,
-0.034207336604595184,
0.07622484862804413,
-0.20384810864925385,
0.14992335438728333,
0.007868208922445774,
0.0655534490942955,
-0.10755813866853714,
0.08234389871358871,
0.02186938375234604,
-0.000078731776739005,
0.16976791620254517,
-0.005332923959940672,
-0.07166474312543869,
-0.08968684077262878,
-0.08007729798555374,
-0.015301639214158058,
0.09766006469726562,
-0.11606097221374512,
0.09088002890348434,
-0.005532135721296072,
-0.033772390335798264,
-0.001003169920295477,
-0.11506054550409317,
-0.13568595051765442,
-0.1810564249753952,
0.050793085247278214,
-0.12042605131864548,
0.03483053296804428,
-0.110326386988163,
-0.06079995632171631,
-0.039059121161699295,
0.19374844431877136,
-0.19769349694252014,
-0.08100385963916779,
-0.15129372477531433,
-0.06937769800424576,
0.11475351452827454,
-0.04169437661767006,
0.08333124965429306,
0.00575080793350935,
0.20940551161766052,
-0.005428771022707224,
-0.00006152192509034649,
0.09395886957645416,
-0.09701906889677048,
-0.20549637079238892,
-0.09645431488752365,
0.1337248831987381,
0.12921380996704102,
0.045738961547613144,
-0.0006359491380862892,
0.025451062247157097,
-0.004552708938717842,
-0.1098034530878067,
0.04068325087428093,
0.14948917925357819,
0.10009516030550003,
0.04517345502972603,
-0.022168826311826706,
-0.14335748553276062,
-0.10383975505828857,
-0.053878508508205414,
0.012351157143712044,
0.1937102973461151,
-0.07130398601293564,
0.16393013298511505,
0.15254592895507812,
-0.06195027753710747,
-0.21360230445861816,
0.03530298173427582,
0.030577631667256355,
-0.0027425598818808794,
0.04211503639817238,
-0.20226545631885529,
0.07177475094795227,
0.012299909256398678,
-0.06052505224943161,
0.1329660564661026,
-0.17330452799797058,
-0.14791011810302734,
0.09466386586427689,
0.07588644325733185,
-0.20206265151500702,
-0.12915512919425964,
-0.09465188533067703,
-0.05156787857413292,
-0.10244981199502945,
0.08578440546989441,
-0.006577404215931892,
0.00796047504991293,
0.03550057113170624,
0.020307740196585655,
0.014843028970062733,
-0.053856946527957916,
0.19742366671562195,
-0.0028309037443250418,
0.04755605757236481,
-0.07560843974351883,
-0.07401026040315628,
0.03885151445865631,
-0.06642770022153854,
0.08509338647127151,
-0.019818376749753952,
0.0031493608839809895,
-0.11036774516105652,
-0.066634401679039,
-0.04840898886322975,
0.03775059059262276,
-0.08615048974752426,
-0.09698852151632309,
-0.052785180509090424,
0.10407061874866486,
0.09429827332496643,
-0.03676796704530716,
-0.07167164236307144,
-0.0930488258600235,
0.061755917966365814,
0.2197171002626419,
0.17922423779964447,
0.07432297617197037,
-0.08127956092357635,
-0.007698057219386101,
-0.023898649960756302,
0.056424181908369064,
-0.20845407247543335,
0.04458294063806534,
0.03555328771471977,
0.03221617266535759,
0.13381335139274597,
-0.020805353298783302,
-0.16324804723262787,
-0.04733991622924805,
0.05880686268210411,
-0.0678478479385376,
-0.16000410914421082,
0.0050316303968429565,
0.08159641921520233,
-0.1564016193151474,
-0.05528028681874275,
0.028295164927840233,
-0.03214212507009506,
-0.02573547326028347,
0.0017541897250339389,
0.08101537823677063,
0.02034606598317623,
0.10651972889900208,
0.06467299908399582,
0.11348457634449005,
-0.10312975943088531,
0.0721626952290535,
0.08422582596540451,
-0.11087015271186829,
0.03811759874224663,
0.05570476874709129,
-0.06352023035287857,
-0.03376225009560585,
0.02857513539493084,
0.08655036240816116,
0.034245528280735016,
-0.07327460497617722,
0.0009771488839760423,
-0.11352569609880447,
0.06755116581916809,
0.1397746354341507,
0.037518635392189026,
0.006101919338107109,
0.0450783297419548,
0.03180363029241562,
-0.09886960685253143,
0.11541297286748886,
0.04517350345849991,
0.034903425723314285,
-0.05006372928619385,
-0.0023413829039782286,
0.04492645338177681,
-0.012664028443396091,
-0.018137352541089058,
-0.03934599831700325,
-0.06449457257986069,
-0.007642639800906181,
-0.15736740827560425,
0.025448406115174294,
-0.06760244071483612,
0.00670814560726285,
0.014806132763624191,
-0.031345803290605545,
0.004022547043859959,
0.011439152993261814,
-0.07757596671581268,
-0.04447099193930626,
-0.002302665961906314,
0.10618019104003906,
-0.16193822026252747,
0.005553076509386301,
0.08726800233125687,
-0.12766145169734955,
0.07833597809076309,
0.0009211061405949295,
-0.008060677908360958,
0.019680539146065712,
-0.13721711933612823,
0.060838859528303146,
-0.00897155050188303,
0.007872733287513256,
0.026538081467151642,
-0.21100404858589172,
0.002521090442314744,
-0.049982182681560516,
-0.06133849546313286,
-0.0025593596510589123,
-0.038511838763952255,
-0.11365798115730286,
0.10289128124713898,
0.019270801916718483,
-0.08019789308309555,
-0.017085609957575798,
0.04939700663089752,
0.10854220390319824,
-0.051504261791706085,
0.14170297980308533,
-0.019941674545407295,
0.06102161481976509,
-0.18276308476924896,
-0.016857357695698738,
-0.019154565408825874,
0.018956458196043968,
-0.030912168323993683,
-0.00755320256575942,
0.05403618514537811,
-0.02111445739865303,
0.22888365387916565,
-0.022312728688120842,
0.021317902952432632,
0.06538864225149155,
0.001540902303531766,
-0.011879486963152885,
0.0934894010424614,
0.04819492623209953,
0.01572871394455433,
0.019354134798049927,
0.016223670914769173,
-0.044685494154691696,
-0.009909945540130138,
-0.12684708833694458,
0.08657418191432953,
0.1663091480731964,
0.09711839258670807,
-0.0032635980751365423,
0.04937102645635605,
-0.11183884739875793,
-0.0907219871878624,
0.09691082686185837,
-0.03293531388044357,
-0.00868645403534174,
-0.04863553121685982,
0.13773348927497864,
0.15863844752311707,
-0.18532606959342957,
0.07007761299610138,
-0.06718835979700089,
-0.056737493723630905,
-0.1084170863032341,
-0.17889203131198883,
-0.0613018274307251,
-0.03356925770640373,
-0.007798245642334223,
-0.055792298167943954,
0.0640881136059761,
0.11015819013118744,
0.01509533915668726,
0.006389363668859005,
0.0909981057047844,
-0.03807319328188896,
0.008552610874176025,
0.043729886412620544,
0.05398762971162796,
0.014643821865320206,
-0.06274322420358658,
0.006796086672693491,
0.005115681793540716,
0.038099709898233414,
0.05580848827958107,
0.030108658596873283,
-0.015502022579312325,
0.012847079895436764,
-0.019972529262304306,
-0.10291805118322372,
0.03934168070554733,
-0.027247389778494835,
-0.04759282246232033,
0.14979983866214752,
0.021485881879925728,
-0.001124731614254415,
-0.023320944979786873,
0.22557686269283295,
-0.06558782607316971,
-0.07872041314840317,
-0.14238019287586212,
0.13879473507404327,
-0.04238482937216759,
0.05087779834866524,
0.04888312891125679,
-0.10371565818786621,
0.034762755036354065,
0.14829161763191223,
0.14918026328086853,
-0.030515480786561966,
0.011137944646179676,
0.01326063647866249,
0.0031382672023028135,
-0.02608977071940899,
0.0531524196267128,
0.04641692712903023,
0.12000146508216858,
-0.06667191535234451,
0.09562870860099792,
-0.008612480014562607,
-0.09277759492397308,
-0.022383572533726692,
0.13433672487735748,
0.0041870372369885445,
0.02557336911559105,
-0.08038719743490219,
0.12375041842460632,
-0.061774857342243195,
-0.25456687808036804,
0.0664278045296669,
-0.06466920673847198,
-0.15146252512931824,
-0.01940576545894146,
0.01946347765624523,
0.00014033516345079988,
0.026333073154091835,
0.059298913925886154,
-0.061054691672325134,
0.15236622095108032,
0.03699715808033943,
-0.0695895180106163,
-0.07827477157115936,
0.07940807193517685,
-0.07792537659406662,
0.3044506311416626,
0.0072919647209346294,
0.055240001529455185,
0.09688185155391693,
-0.03667605668306351,
-0.13230539858341217,
0.03721073642373085,
0.08846048265695572,
-0.04729968681931496,
0.06561979651451111,
0.20971114933490753,
-0.011837940663099289,
0.10654570162296295,
0.07382412999868393,
-0.08525184541940689,
0.05251358076930046,
-0.09512167423963547,
-0.09520350396633148,
-0.08843419700860977,
0.09512295573949814,
-0.06087517365813255,
0.14968523383140564,
0.13144882023334503,
-0.04547085613012314,
0.004147613886743784,
-0.023350544273853302,
0.053780317306518555,
-0.0006241541123017669,
0.11119713634252548,
0.024692893028259277,
-0.19301405549049377,
0.03286419063806534,
-0.005306687671691179,
0.0987875759601593,
-0.2564460039138794,
-0.08652844280004501,
0.039186857640743256,
-0.010480973869562149,
-0.053097501397132874,
0.12077769637107849,
0.055750805884599686,
0.049317218363285065,
-0.05613971874117851,
-0.05211269110441208,
-0.005156795959919691,
0.16217851638793945,
-0.10507626086473465,
-0.002282210160046816
] |
null | null | transformers | Model description:
Model: bert-base-multilingual-cased
Dataset: TASTEset
Unshuffled ratio: ['1']
Shuffled ratio: ['0']
Best exact match epoch: 10
Best exact match: 97.79
Best epoch: 10
Drop duplicates: ['1']
Max epochs = 10
Optimizer lr = 3e-05
Optimizer eps = 1e-08
Batch size = 32
Dataset path = pgajo/EW-TT-PE_U1_S0_DROP1_mbert
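The card does not include the training script itself; the sketch below is a hypothetical reconstruction of how the hyperparameters listed above (lr 3e-05, eps 1e-08, batch size 32, max 10 epochs) could map onto a standard Hugging Face `Trainer` setup for extractive question answering. The output directory is an assumption, and the TASTEset preprocessing that would produce `train_dataset`/`eval_dataset` is not reproduced here.

```python
# Hypothetical sketch implied by the hyperparameters above; not the author's script.
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

args = TrainingArguments(
    output_dir="mbert-tasteset-qa",   # assumed output path
    learning_rate=3e-5,               # "Optimizer lr" above
    adam_epsilon=1e-8,                # "Optimizer eps" above
    per_device_train_batch_size=32,   # "Batch size" above
    num_train_epochs=10,              # "Max epochs" above
)

# train_dataset / eval_dataset: tokenized TASTEset splits with start/end
# position labels (preprocessing not shown).
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```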
Results
| epoch | train_loss | train_f1 | train_exact | dev_loss | dev_f1 | dev_exact | test_loss | test_f1 | test_exact |
|--------:|-------------:|-----------:|--------------:|-----------:|---------:|------------:|------------:|----------:|-------------:|
| 1 | 2.94 | 17.7 | 10.02 | 0.76 | 69.4 | 58.56 | 0 | 0 | 0 |
| 2 | 0.38 | 86.34 | 80.72 | 0.15 | 95.66 | 91.99 | 0 | 0 | 0 |
| 3 | 0.08 | 97.33 | 95.44 | 0.08 | 98.38 | 95.86 | 0 | 0 | 0 |
| 4 | 0.04 | 98.98 | 98.27 | 0.09 | 98.09 | 96.41 | 0 | 0 | 0 |
| 5 | 0.03 | 98.94 | 98.41 | 0.08 | 98.44 | 96.41 | 0 | 0 | 0 |
| 6 | 0.02 | 99.32 | 98.76 | 0.08 | 98.57 | 97.24 | 0 | 0 | 0 |
| 7 | 0.02 | 99.44 | 99.24 | 0.05 | 98.44 | 97.51 | 0 | 0 | 0 |
| 8 | 0.01 | 99.82 | 99.59 | 0.07 | 98.47 | 97.24 | 0 | 0 | 0 |
| 9 | 0.01 | 99.8 | 99.65 | 0.07 | 98.66 | 97.24 | 0 | 0 | 0 |
| 10 | 0.01 | 99.82 | 99.65 | 0.06 | 98.59 | 97.79 | 0 | 0 | 0 | | {} | question-answering | pgajo/mbert_EW-TT-PE_U1_S0_DROP1_mbert_E10_DEV98.0 | [
"transformers",
"safetensors",
"bert",
"question-answering",
"endpoints_compatible",
"region:us"
] | 2024-02-12T12:59:25+00:00 | [] | [] | TAGS
#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us
| Model description:
```
Model: bert-base-multilingual-cased
Dataset: TASTEset
Unshuffled ratio: ['1']
Shuffled ratio: ['0']
Best exact match epoch: 10
Best exact match: 97.79
Best epoch: 10
Drop duplicates: ['1']
Max epochs = 10
Optimizer lr = 3e-05
Optimizer eps = 1e-08
Batch size = 32
Dataset path = pgajo/EW-TT-PE_U1_S0_DROP1_mbert
```
Results
| [] | [
"TAGS\n#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us \n"
] | [
30
] | [
"passage: TAGS\n#transformers #safetensors #bert #question-answering #endpoints_compatible #region-us \n"
] | [
-0.03100396879017353,
0.011429967358708382,
-0.009655450470745564,
-0.0477571114897728,
0.071015864610672,
0.001686002011410892,
0.08008057624101639,
0.05985769256949425,
0.11401950567960739,
0.02590048313140869,
0.1903941035270691,
0.16566626727581024,
-0.07932274788618088,
0.015106523409485817,
-0.13172350823879242,
-0.13182127475738525,
0.11529869586229324,
0.03778080269694328,
-0.03543904423713684,
0.10329030454158783,
0.05029234290122986,
-0.12624382972717285,
0.04368755966424942,
-0.06763096153736115,
-0.062081653624773026,
0.06668882071971893,
0.04820772260427475,
-0.08198674768209457,
0.13085918128490448,
0.03362511843442917,
0.2047542929649353,
0.04677434265613556,
-0.1182841956615448,
-0.21163156628608704,
0.03874710574746132,
-0.011287915520370007,
-0.05873045325279236,
0.019588099792599678,
0.032477255910634995,
-0.07909006625413895,
-0.11140874028205872,
0.027899496257305145,
0.014707351103425026,
0.08549544960260391,
-0.18314984440803528,
-0.16563549637794495,
-0.06621148437261581,
-0.053103990852832794,
0.12317322194576263,
0.08563494682312012,
-0.020668305456638336,
0.1935536116361618,
-0.15425218641757965,
0.0928223505616188,
0.1380285918712616,
-0.32555314898490906,
-0.0027393975760787725,
0.093502476811409,
0.11618221551179886,
0.05096927657723427,
-0.02073126845061779,
0.09022705256938934,
0.07546665519475937,
-0.00581451877951622,
-0.06733445823192596,
-0.0957256555557251,
-0.012503020465373993,
0.09702391922473907,
-0.07598375529050827,
-0.052956461906433105,
0.2470276802778244,
0.031026924028992653,
0.013565225526690483,
-0.008941343985497952,
-0.10310965776443481,
0.030862320214509964,
0.02648748643696308,
-0.06024225428700447,
-0.02690120041370392,
0.06734149158000946,
-0.0001909599086502567,
0.005896252579987049,
-0.1221570298075676,
-0.006722765974700451,
-0.22672583162784576,
0.2768072187900543,
-0.0018987046787515283,
0.08534801006317139,
-0.2428436279296875,
0.015660421922802925,
-0.06141046807169914,
-0.0824490636587143,
-0.013059272430837154,
-0.09494815766811371,
-0.009192516095936298,
-0.02866560034453869,
-0.04682322219014168,
0.015530125238001347,
0.12870869040489197,
0.20563961565494537,
-0.017999636009335518,
0.04083723947405815,
-0.061628565192222595,
0.0725679025053978,
0.03914913535118103,
0.09992070496082306,
0.010195896960794926,
-0.020322704687714577,
-0.016003627330064774,
-0.13105420768260956,
-0.008767413906753063,
-0.03738516569137573,
-0.05202561616897583,
-0.022937579080462456,
0.01343182846903801,
0.16656653583049774,
0.057803552597761154,
0.021070659160614014,
-0.08621648699045181,
0.05785249546170235,
0.022443469613790512,
-0.04320667311549187,
-0.017870478332042694,
0.00882878340780735,
0.06155950948596001,
0.0885266587138176,
-0.07562171667814255,
0.04524178430438042,
0.016779053956270218,
0.06491811573505402,
-0.07376032322645187,
-0.06024041771888733,
-0.019815200939774513,
-0.022853199392557144,
0.06425601989030838,
-0.06728833168745041,
0.08267539739608765,
-0.1562412828207016,
-0.08226612955331802,
0.011612122878432274,
0.02970954217016697,
0.007305266335606575,
0.06759197264909744,
-0.014567295089364052,
-0.039057523012161255,
-0.03480268642306328,
-0.07194317877292633,
-0.10265897214412689,
-0.07100482285022736,
0.06559862941503525,
0.037085019052028656,
0.029506711289286613,
-0.08701489865779877,
0.0126223498955369,
-0.10313430428504944,
0.0696413442492485,
-0.07926147431135178,
-0.03626604750752449,
-0.030684340745210648,
0.19216585159301758,
-0.03995077684521675,
-0.013410759158432484,
-0.11826255917549133,
0.05234655737876892,
-0.05254388228058815,
0.21867278218269348,
-0.03809955716133118,
-0.03585023805499077,
0.23391962051391602,
-0.09690817445516586,
-0.2571674883365631,
0.07713238894939423,
0.006013390142470598,
0.017324132844805717,
0.10797587037086487,
0.19150643050670624,
-0.016850516200065613,
-0.11185130476951599,
0.0474415123462677,
0.11249569058418274,
-0.15280477702617645,
-0.0624573640525341,
0.025971313938498497,
-0.0582793690264225,
-0.1464228332042694,
0.016458844766020775,
0.051048628985881805,
0.04815160855650902,
-0.08806464076042175,
-0.03191754221916199,
-0.02947526052594185,
-0.018536636605858803,
0.061611421406269073,
0.04005695879459381,
0.026151038706302643,
-0.12002047151327133,
0.017315825447440147,
-0.051940858364105225,
-0.04731830582022667,
0.03846436366438866,
0.007411974482238293,
-0.12714537978172302,
0.07094167917966843,
-0.131436288356781,
0.020615974441170692,
-0.16280385851860046,
-0.19247999787330627,
-0.013410934247076511,
0.10532321780920029,
-0.05276893824338913,
0.20171119272708893,
0.11623696237802505,
-0.10492526739835739,
-0.01685560680925846,
-0.07052898406982422,
0.1616603285074234,
0.05628864839673042,
-0.02636071853339672,
-0.04867614805698395,
0.07146526873111725,
-0.10356242209672928,
-0.10846276581287384,
-0.05549529939889908,
-0.01631050743162632,
0.13880129158496857,
0.10532583296298981,
0.04163223132491112,
0.06328489631414413,
-0.012810224667191505,
0.017701199278235435,
-0.008262974210083485,
0.018305214121937752,
0.07581605017185211,
-0.03447617590427399,
-0.11924053728580475,
0.11601310968399048,
-0.1444002240896225,
0.3729725480079651,
0.16846853494644165,
-0.23041868209838867,
0.01894976757466793,
-0.026126159355044365,
-0.030978791415691376,
0.034767232835292816,
0.05344981700181961,
-0.017914773896336555,
0.01958848536014557,
0.031971078366041183,
0.07821214944124222,
-0.03785416856408119,
-0.05193689465522766,
-0.015433255583047867,
-0.07395049929618835,
-0.06607450544834137,
0.07275120168924332,
-0.03483232855796814,
-0.21013760566711426,
0.1599646657705307,
0.31365448236465454,
0.09703507274389267,
0.08886944502592087,
-0.0816551148891449,
-0.028012678027153015,
-0.0039048483595252037,
0.07745775580406189,
-0.022175131365656853,
0.0646965503692627,
-0.19559495151042938,
0.002697455231100321,
0.0718853622674942,
0.040101438760757446,
0.051995899528265,
-0.1255539059638977,
-0.08741874992847443,
0.02883525937795639,
0.010361172258853912,
-0.0510454997420311,
0.08942679315805435,
0.01958455704152584,
0.10355164110660553,
0.03094480000436306,
-0.025720693171024323,
0.12157201766967773,
-0.0424032099545002,
-0.08322477340698242,
0.16933336853981018,
-0.11445565521717072,
-0.22569596767425537,
-0.07213949412107468,
-0.10141351073980331,
0.023521440103650093,
0.043139949440956116,
0.07353874295949936,
-0.13277705013751984,
-0.06267919391393661,
0.050284892320632935,
0.04398718848824501,
-0.11532527953386307,
0.034965697675943375,
0.011176006868481636,
0.0742565244436264,
-0.047823816537857056,
-0.06598490476608276,
-0.06332776695489883,
-0.03295988216996193,
-0.06356722116470337,
0.1191829964518547,
-0.10939455777406693,
0.1207437515258789,
0.09475167840719223,
0.04165811091661453,
0.036363665014505386,
-0.027820978313684464,
0.21290433406829834,
-0.11579988896846771,
-0.03179406374692917,
0.15926754474639893,
-0.07346773147583008,
0.07930222153663635,
0.20331227779388428,
0.017215436324477196,
-0.1255631297826767,
0.04482865333557129,
-0.03777764365077019,
-0.08158078044652939,
-0.24055063724517822,
-0.04635780677199364,
-0.08391188085079193,
0.07882910221815109,
-0.018682004883885384,
0.04367469623684883,
0.10718972235918045,
0.09847458451986313,
0.02698599174618721,
-0.15794047713279724,
0.009259669110178947,
0.060280539095401764,
0.19491833448410034,
-0.0554194450378418,
0.09747976064682007,
-0.07872258871793747,
-0.14044831693172455,
0.058162905275821686,
0.07227057963609695,
0.11210840195417404,
0.18135450780391693,
0.0031284119468182325,
0.07501647621393204,
0.11561381816864014,
0.14170172810554504,
0.14721226692199707,
0.028168288990855217,
-0.09393750876188278,
-0.012610750272870064,
0.000841298489831388,
-0.071214459836483,
0.04935174807906151,
0.06255429983139038,
-0.09986883401870728,
-0.016300853341817856,
-0.16199824213981628,
0.11020834743976593,
0.05675990507006645,
0.08375607430934906,
-0.13229906558990479,
0.008182737976312637,
0.12653344869613647,
-0.016539672389626503,
-0.04231732711195946,
0.12035517394542694,
0.07884106040000916,
-0.08249315619468689,
0.04244247451424599,
-0.04095182567834854,
0.11129532009363174,
0.07417996227741241,
0.09555985778570175,
-0.096460722386837,
-0.16630028188228607,
0.02183578908443451,
0.07979494333267212,
-0.27919045090675354,
0.28428587317466736,
0.032050203531980515,
-0.04338350147008896,
-0.06692010164260864,
-0.039031147956848145,
-0.04415836185216904,
0.1649855673313141,
0.21534205973148346,
-0.006029482930898666,
-0.12515726685523987,
-0.10306360572576523,
0.060360122472047806,
0.07373268157243729,
0.15369689464569092,
-0.022843722254037857,
0.01709183119237423,
-0.02581469528377056,
0.01907532475888729,
0.0005263579660095274,
0.027384355664253235,
-0.00807490199804306,
-0.10579172521829605,
-0.003417222760617733,
0.027430731803178787,
0.11391840875148773,
-0.05235821753740311,
0.053690437227487564,
-0.07520826160907745,
0.11101158708333969,
-0.08321993052959442,
-0.024513524025678635,
-0.10570400953292847,
-0.159481018781662,
0.09931088238954544,
-0.0652543157339096,
0.02730567753314972,
-0.06895346194505692,
-0.034800801426172256,
-0.06456287950277328,
-0.1387634426355362,
0.15311841666698456,
-0.12774962186813354,
-0.014343206770718098,
-0.05910857394337654,
0.1744864135980606,
-0.057705219835042953,
-0.014981103129684925,
0.022769484668970108,
0.058170903474092484,
-0.08365354686975479,
-0.09320548176765442,
0.012634269893169403,
-0.08999879658222198,
0.07918208837509155,
0.07504331320524216,
-0.010605372488498688,
0.011236832477152348,
0.017805295065045357,
0.011543014086782932,
0.1833728551864624,
0.2684391736984253,
-0.03611943498253822,
0.05449281632900238,
0.21387790143489838,
0.009187204763293266,
-0.3001823127269745,
-0.03780132532119751,
-0.20396788418293,
-0.06599479168653488,
0.0035966881550848484,
-0.01841581240296364,
0.15771964192390442,
0.038633719086647034,
-0.05389995872974396,
0.06213739886879921,
-0.16254091262817383,
-0.0409867987036705,
0.17554175853729248,
0.02816466987133026,
0.5083365440368652,
-0.16917727887630463,
-0.09572464227676392,
-0.01933435909450054,
-0.21105335652828217,
0.09465035051107407,
-0.0792510136961937,
0.00545540964230895,
0.027481064200401306,
0.0250190868973732,
0.03670221567153931,
-0.09177862852811813,
0.1804729551076889,
-0.0251461174339056,
0.07020123302936554,
-0.08957348763942719,
-0.09517528116703033,
0.0571230947971344,
-0.00989442877471447,
-0.004209878388792276,
0.0377814881503582,
0.043195612728595734,
-0.09419526904821396,
-0.02725309133529663,
-0.07557959109544754,
0.05808710306882858,
0.029764346778392792,
-0.06465182453393936,
-0.024149267002940178,
-0.034049443900585175,
0.0040148478001356125,
-0.006224581506103277,
0.3219931423664093,
-0.07817333191633224,
0.1998085230588913,
0.0308726467192173,
0.17342960834503174,
-0.20313303172588348,
0.014420399442315102,
0.002336042234674096,
-0.07989436388015747,
0.09632785618305206,
-0.054569393396377563,
0.0957014411687851,
0.14680208265781403,
-0.03774647042155266,
0.04170471802353859,
0.09971088171005249,
0.044757623225450516,
-0.023297281935811043,
0.12041250616312027,
-0.2069728821516037,
-0.19302959740161896,
0.006711400113999844,
0.002523706993088126,
0.0443287193775177,
0.1371040642261505,
0.08772092312574387,
0.10595496743917465,
0.007110828999429941,
-0.019849922508001328,
-0.013635226525366306,
-0.07197124511003494,
0.015518625266849995,
0.07721489667892456,
0.05103190615773201,
-0.0915357917547226,
0.07368962466716766,
-0.044682856649160385,
-0.2505898177623749,
-0.011277278885245323,
0.010972370393574238,
-0.1136656329035759,
-0.09253716468811035,
-0.0640796348452568,
0.11949943006038666,
-0.0853467583656311,
-0.07717446982860565,
-0.033551741391420364,
-0.13546887040138245,
0.036930788308382034,
0.2936263084411621,
0.08502552658319473,
0.10473651438951492,
0.05559305474162102,
-0.024962520226836205,
0.02628864347934723,
-0.022201525047421455,
-0.0632605329155922,
0.0033800466917455196,
-0.10716227442026138,
-0.10930395126342773,
-0.0539650060236454,
0.1258552223443985,
-0.10030562430620193,
-0.0463426411151886,
-0.20223698019981384,
0.07721703499555588,
-0.17302681505680084,
-0.07449597120285034,
-0.1311258226633072,
-0.05869106575846672,
0.011798324063420296,
-0.1269368678331375,
-0.043847475200891495,
-0.0405474416911602,
-0.11593431234359741,
0.0941464975476265,
0.06928019225597382,
0.006738580297678709,
-0.09351341426372528,
-0.052371736615896225,
0.14618384838104248,
-0.039895832538604736,
0.07875484228134155,
0.12324118614196777,
-0.11218003928661346,
0.09794780611991882,
-0.19827678799629211,
-0.10873684287071228,
0.09223955124616623,
-0.020392343401908875,
0.07176221162080765,
0.06298419088125229,
-0.0209525004029274,
0.09442277252674103,
0.03166748583316803,
0.07961104065179825,
-0.041231222450733185,
-0.09570163488388062,
0.02909303456544876,
0.012143692001700401,
-0.16935859620571136,
-0.031028112396597862,
-0.1383150815963745,
0.138075590133667,
-0.03250321373343468,
0.13132928311824799,
-0.0014017382636666298,
0.0942121222615242,
-0.0393197238445282,
0.0214883740991354,
0.022810328751802444,
-0.15824435651302338,
0.014284737408161163,
-0.04512546584010124,
0.00530107831582427,
-0.042201071977615356,
0.2832597494125366,
-0.13215987384319305,
0.07444287836551666,
0.07330053299665451,
-0.007652656175196171,
0.048707786947488785,
0.035340797156095505,
0.2554089426994324,
0.08575175702571869,
-0.05636623501777649,
-0.11349837481975555,
0.047768156975507736,
-0.03974492475390434,
-0.16682684421539307,
0.08966261893510818,
0.16476166248321533,
-0.021509341895580292,
0.09579425305128098,
-0.015587063506245613,
0.04206113517284393,
0.003570155706256628,
-0.20271413028240204,
-0.03418423607945442,
-0.028696484863758087,
0.0342242605984211,
0.06175161153078079,
0.19321276247501373,
-0.02510346844792366,
0.027360908687114716,
-0.06739696860313416,
-0.006428796332329512,
-0.16893014311790466,
-0.05832986161112785,
-0.09619798511266708,
-0.10513351857662201,
0.056126669049263,
-0.10675669461488724,
-0.02991390973329544,
0.11837480962276459,
0.07225114107131958,
-0.014147752895951271,
0.20032523572444916,
-0.0034852379467338324,
-0.01854041963815689,
0.010509109124541283,
0.005002413876354694,
0.06455502659082413,
0.07439646869897842,
-0.007380056194961071,
-0.10331036895513535,
-0.07467203587293625,
-0.07210230082273483,
0.04836762696504593,
-0.09930044412612915,
-0.01744663715362549,
-0.142163947224617,
-0.09089858829975128,
-0.06536278873682022,
0.1318330466747284,
-0.08915292471647263,
0.10780727118253708,
-0.019095079973340034,
0.01910819485783577,
0.05497001111507416,
0.22086337208747864,
-0.07868800312280655,
-0.07071682065725327,
-0.060905519872903824,
0.16298183798789978,
0.004298616200685501,
0.15630026161670685,
-0.03950318321585655,
-0.0016224056016653776,
-0.0332493931055069,
0.2914927303791046,
0.16758738458156586,
-0.04768482968211174,
0.05667643994092941,
0.013426431454718113,
0.043882496654987335,
0.059551939368247986,
0.034976501017808914,
0.07581301033496857,
0.25021910667419434,
-0.07689207047224045,
-0.01975826919078827,
0.022277116775512695,
-0.00035899964859709144,
-0.055962271988391876,
0.045156292617321014,
0.029317067936062813,
-0.019586384296417236,
-0.08728770166635513,
0.12731784582138062,
-0.10686571151018143,
0.08306804299354553,
0.05728748440742493,
-0.15720857679843903,
-0.014027200639247894,
-0.022743018344044685,
0.1905868649482727,
-0.06110110133886337,
0.11211711168289185,
-0.030706269666552544,
-0.13290581107139587,
-0.02404458075761795,
0.04101835936307907,
-0.1852385401725769,
-0.056675106287002563,
0.08444182574748993,
0.05783277377486229,
0.06356650590896606,
0.01799783855676651,
0.008918672800064087,
0.09269910305738449,
-0.0174893569201231,
-0.06227288395166397,
0.09672212600708008,
0.09302622079849243,
-0.11702378839254379,
-0.10226112604141235,
-0.03835497796535492,
0.03587648272514343,
-0.007181957364082336,
0.07796690613031387,
-0.23804201185703278,
0.04944111034274101,
0.012472385540604591,
-0.06038458272814751,
-0.06527353823184967,
0.0485636405646801,
-0.06548506766557693,
0.04292919486761093,
0.025255493819713593,
-0.00807290431112051,
0.015648027881979942,
-0.0017639343859627843,
0.056236833333969116,
0.04547872394323349,
-0.07353842258453369,
-0.10449795424938202,
-0.04468516260385513,
-0.040538545697927475,
0.15919344127178192,
-0.0320364348590374,
-0.12340949475765228,
-0.02860189974308014,
-0.014523285441100597,
0.07767149806022644,
-0.07934793829917908,
0.009319511242210865,
0.09768388420343399,
0.05723276734352112,
0.0005386354750953615,
-0.18609586358070374,
0.047480739653110504,
0.08650989830493927,
-0.0709119662642479,
-0.08683779090642929
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank8
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8366
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
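The hyperparameters above correspond roughly to the `TrainingArguments` sketched below. This is only an illustrative mapping: the card does not name the dataset, task head, or data collator, so just the argument object is shown and `output_dir` is a placeholder.
```python
# Hedged sketch: reconstructs the reported hyperparameters as Hugging Face
# TrainingArguments. The dataset and task head are not documented in this card,
# so no Trainer/model setup is shown; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-base-rank8",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch validation losses below
)
```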
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.9313 | 1.0 | 179 | 8.8102 |
| 7.5134 | 2.0 | 358 | 5.4105 |
| 5.566 | 3.0 | 537 | 4.8366 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank8", "results": []}]} | null | alitolga/deberta-v3-base-rank8 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T13:01:18+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank8
=====================
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 4.8366
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-glue-50K
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0798
- Em (exact-match) accuracy: 86.196
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
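These settings map onto `TrainingArguments` roughly as below; the sketch is illustrative only, since the card does not state which GLUE tasks or prompt format were used. The main point is that the reported total train batch size of 512 comes from the per-device batch of 128 multiplied by 4 gradient-accumulation steps, assuming a single device.
```python
# Hedged sketch of the TrainingArguments implied by the list above.
# Assumes a single device: 128 per-device batch * 4 accumulation steps = 512
# effective (total) train batch size, as reported in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="t5-glue-50K",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```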
### Training results
### Framework versions
- Transformers 4.35.1
- Pytorch 2.1.2+cu121
- Datasets 2.14.7
- Tokenizers 0.14.1
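A minimal inference sketch for the published checkpoint is shown below. The repository id is taken from this card's metadata, and the `"sst2 sentence:"` prefix is only an assumption modeled on the standard T5 GLUE conventions; the exact prompts used for fine-tuning are not documented here.
```python
# Minimal inference sketch. The task prefix is an assumption based on the
# original T5 GLUE format and may differ from the prompts used in training.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "macabdul9/t5-small-glue-50K"  # repo id from this card's metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("sst2 sentence: this movie was surprisingly good!", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```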
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "google-t5/t5-small", "model-index": [{"name": "t5-glue-50K", "results": []}]} | text2text-generation | macabdul9/t5-small-glue-50K | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-12T13:07:02+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# t5-glue-50K
This model is a fine-tuned version of google-t5/t5-small on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0798
- Em accuracy: 86.196
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.35.1
- Pytorch 2.1.2+cu121
- Datasets 2.14.7
- Tokenizers 0.14.1
| [
"# t5-glue-50K\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0798\n- Em accuracy: 86.196",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# t5-glue-50K\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0798\n- Em accuracy: 86.196",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
82,
59,
6,
12,
8,
3,
113,
4,
35
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-google-t5/t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# t5-glue-50K\n\nThis model is a fine-tuned version of google-t5/t5-small on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0798\n- Em accuracy: 86.196## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 128\n- eval_batch_size: 64\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 512\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.35.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.7\n- Tokenizers 0.14.1"
] | [
-0.10742668062448502,
0.16421237587928772,
-0.0030522106681019068,
0.08707010000944138,
0.1339249610900879,
0.016370883211493492,
0.09991015493869781,
0.13746045529842377,
-0.055538251996040344,
0.12484076619148254,
0.07783602923154831,
0.039237864315509796,
0.07382412999868393,
0.17399683594703674,
0.00981739442795515,
-0.24012359976768494,
-0.012043790891766548,
-0.03401672840118408,
-0.07524416595697403,
0.09884072095155716,
0.11094125360250473,
-0.07600103318691254,
0.06700175255537033,
0.0009729810408316553,
-0.12762567400932312,
0.01042719092220068,
-0.022945541888475418,
-0.05924633890390396,
0.07449619472026825,
0.03720849007368088,
0.021468527615070343,
0.025362566113471985,
0.09489700943231583,
-0.20137721300125122,
-0.006493523251265287,
0.07212720811367035,
0.03146498277783394,
0.09263629466295242,
0.07374531775712967,
0.007783356122672558,
0.026130670681595802,
-0.16382189095020294,
0.06926660984754562,
0.05614521726965904,
-0.06527040898799896,
-0.15721409022808075,
-0.07552844285964966,
0.10732664912939072,
0.1066460832953453,
0.08134039491415024,
-0.009179227985441685,
0.1316891759634018,
-0.02074790745973587,
0.06424913555383682,
0.1921045482158661,
-0.2838720977306366,
-0.06045927852392197,
0.0711517184972763,
0.04492757469415665,
0.058155350387096405,
-0.09917132556438446,
0.026393087580800056,
0.054340630769729614,
-0.0023354196455329657,
0.08490726351737976,
0.012770435772836208,
0.014646649360656738,
-0.011354484595358372,
-0.09904230386018753,
-0.04224936664104462,
0.22614754736423492,
0.10014965385198593,
-0.04597415030002594,
-0.1352270096540451,
-0.046288035809993744,
-0.10743241757154465,
-0.013360964134335518,
-0.04704706370830536,
0.0424199141561985,
-0.029150642454624176,
-0.06689278036355972,
-0.05898091197013855,
-0.06697843968868256,
-0.05304521322250366,
0.030700018629431725,
0.07309195399284363,
0.03158565238118172,
-0.009210806339979172,
0.0037900202441960573,
0.10112398117780685,
-0.019685685634613037,
-0.13628193736076355,
-0.04315832257270813,
0.004422484431415796,
-0.11570310592651367,
-0.04785430058836937,
-0.019243821501731873,
0.007372218184173107,
0.0274127759039402,
0.1661754697561264,
0.006684518419206142,
0.08945076912641525,
0.05079527199268341,
0.010500801727175713,
-0.004843489732593298,
0.1533401608467102,
-0.04918047785758972,
-0.08660334348678589,
0.016370011493563652,
0.07941237092018127,
0.020430568605661392,
-0.02303626760840416,
-0.07395848631858826,
-0.01752290315926075,
0.12321321666240692,
0.06122526153922081,
0.016024738550186157,
0.04600584879517555,
-0.05963073670864105,
-0.03141270577907562,
0.005477564409375191,
-0.13135020434856415,
0.03110412508249283,
-0.025184795260429382,
-0.06466145068407059,
-0.021024897694587708,
0.04059815779328346,
-0.010365800000727177,
-0.05190388485789299,
0.0029883289244025946,
-0.08498293906450272,
-0.04236297309398651,
-0.03926025331020355,
-0.015390713699162006,
0.010484836995601654,
-0.037763360887765884,
-0.008535442873835564,
-0.08514508605003357,
-0.196059450507164,
-0.06527348607778549,
0.0317155197262764,
-0.08256285637617111,
-0.10109575092792511,
-0.030837038531899452,
-0.04997576028108597,
0.028803681954741478,
-0.023636413738131523,
0.0790363997220993,
-0.01716293767094612,
0.05925297364592552,
0.025737181305885315,
0.03391372784972191,
0.09582727402448654,
0.029615040868520737,
-0.07714331150054932,
0.05701206251978874,
-0.14743539690971375,
0.1339201182126999,
-0.09651058912277222,
0.05215688422322273,
-0.1627848893404007,
-0.07127481698989868,
-0.02485600672662258,
-0.03416259214282036,
0.06989242136478424,
0.15589115023612976,
-0.14145486056804657,
-0.0347454771399498,
0.14432744681835175,
-0.0507938489317894,
-0.11218847334384918,
0.1026928648352623,
-0.0019229799509048462,
0.04117250442504883,
0.061199408024549484,
0.15262284874916077,
0.141377255320549,
-0.08355345577001572,
-0.030551455914974213,
0.025693882256746292,
0.07652778178453445,
0.04377281665802002,
0.06996868550777435,
-0.05760106444358826,
-0.0005820169462822378,
0.034317661076784134,
-0.07770371437072754,
0.0016931574791669846,
-0.0713280737400055,
-0.06881222128868103,
-0.06062733381986618,
-0.07416867464780807,
0.04098868370056152,
0.0188249871134758,
0.03723953664302826,
-0.05136862024664879,
-0.14633521437644958,
0.028120044618844986,
0.10683926194906235,
-0.06503498554229736,
0.0011525898007676005,
-0.06708128750324249,
0.08398903161287308,
-0.04223978519439697,
-0.0007415135623887181,
-0.1698833853006363,
-0.1419771909713745,
0.0667673870921135,
-0.10657443106174469,
0.03565678000450134,
-0.022414594888687134,
0.05209597200155258,
0.04599284008145332,
-0.04644951596856117,
-0.03787022456526756,
-0.05295135825872421,
-0.012177948839962482,
-0.07580310851335526,
-0.16246694326400757,
-0.04068512097001076,
-0.007597694173455238,
0.1757725179195404,
-0.22448863089084625,
0.018618863075971603,
-0.004140470176935196,
0.1443239450454712,
-0.00622898992151022,
-0.07882999628782272,
0.01765061356127262,
-0.028042206540703773,
-0.024799460545182228,
-0.1294141411781311,
0.022524600848555565,
0.01314880046993494,
-0.12008939683437347,
-0.0432661771774292,
-0.13165777921676636,
0.04515212029218674,
0.06275572627782822,
0.08382529020309448,
-0.10962986946105957,
-0.06113334745168686,
-0.05067369341850281,
-0.05002112314105034,
-0.05515548959374428,
-0.03870004415512085,
0.18939127027988434,
0.009305568411946297,
0.10699024051427841,
-0.07010859251022339,
-0.07867234945297241,
0.01767314039170742,
-0.0002180873998440802,
-0.030064282938838005,
0.07009178400039673,
0.026367753744125366,
-0.1266847401857376,
0.08020351827144623,
0.0814322829246521,
-0.004133009817451239,
0.12133903056383133,
-0.06551109999418259,
-0.07010391354560852,
-0.04687543585896492,
0.034800585359334946,
0.006446402985602617,
0.08873573690652847,
-0.10499577224254608,
0.004527813754975796,
0.03597415238618851,
0.008067256771028042,
0.024726131930947304,
-0.10378062725067139,
0.014418194070458412,
0.03350754827260971,
-0.04320254176855087,
0.029387937858700752,
-0.02257809229195118,
-0.018322398886084557,
0.07436085492372513,
0.03494678810238838,
0.027123358100652695,
0.028118183836340904,
-0.009451496414840221,
-0.1002277210354805,
0.18615439534187317,
-0.09681249409914017,
-0.1642020046710968,
-0.13377496600151062,
0.09149231761693954,
-0.06119179353117943,
-0.01972985453903675,
-0.0025640963576734066,
-0.07185059040784836,
-0.06078342720866203,
-0.08379215747117996,
0.016468316316604614,
-0.048639945685863495,
0.006712956354022026,
0.08089982718229294,
0.02198753133416176,
0.11126963049173355,
-0.10715918242931366,
0.015662619844079018,
0.02327880449593067,
-0.08424673974514008,
-0.020339518785476685,
0.011976849287748337,
0.09849737584590912,
0.11149889975786209,
0.0009709009900689125,
0.026166632771492004,
-0.03544168546795845,
0.20874769985675812,
-0.09056136757135391,
0.044290829449892044,
0.12322589755058289,
0.0476590134203434,
0.06016393378376961,
0.1048501506447792,
0.009912089444696903,
-0.08902043849229813,
0.04672990366816521,
0.043341249227523804,
-0.019001763314008713,
-0.24434524774551392,
-0.023216288536787033,
-0.029127473011612892,
-0.04585416242480278,
0.11517663300037384,
0.06588273495435715,
0.0359732024371624,
0.0762813612818718,
-0.029941363260149956,
0.06373012065887451,
0.01334778219461441,
0.09351897239685059,
0.10210179537534714,
0.045650772750377655,
0.09266111999750137,
-0.02318655326962471,
0.008147344924509525,
0.06467723846435547,
0.0163813978433609,
0.254746675491333,
-0.019064366817474365,
0.16517868638038635,
0.015685804188251495,
0.13006778061389923,
-0.025090180337429047,
0.03625565394759178,
0.0440339557826519,
0.014631777070462704,
0.0067065199837088585,
-0.06938343495130539,
-0.0313815213739872,
0.03580771014094353,
0.022545307874679565,
0.026836734265089035,
-0.07921483367681503,
0.06431616097688675,
0.025858426466584206,
0.23569929599761963,
0.060044221580028534,
-0.30760669708251953,
-0.0987476259469986,
0.027843816205859184,
-0.019612688571214676,
-0.08435887098312378,
0.013942824676632881,
0.08938339352607727,
-0.1390749216079712,
0.09580083191394806,
-0.048728346824645996,
0.08348017185926437,
-0.06596676260232925,
-0.011495924554765224,
0.05446866899728775,
0.10080230236053467,
0.005182352848351002,
0.10371477156877518,
-0.17757777869701385,
0.17681489884853363,
0.0061647649854421616,
0.0689236968755722,
-0.050219930708408356,
0.06601181626319885,
-0.0027113098185509443,
0.04630904272198677,
0.1490873545408249,
0.005000206176191568,
-0.05005178600549698,
-0.13519471883773804,
-0.12126103788614273,
0.023928368464112282,
0.11098263412714005,
-0.12135665863752365,
0.07361332327127457,
-0.05799451097846031,
-0.015021231025457382,
0.026835735887289047,
-0.06909641623497009,
-0.19237279891967773,
-0.15040317177772522,
0.03109966404736042,
-0.026793098077178,
0.015981703996658325,
-0.08477120846509933,
-0.09620940685272217,
-0.056152261793613434,
0.23109006881713867,
-0.05396087467670441,
-0.07314957678318024,
-0.16091206669807434,
0.10861870646476746,
0.13692690432071686,
-0.07621552795171738,
0.04211313650012016,
-0.0027312051970511675,
0.14934322237968445,
0.05636752396821976,
-0.07603993266820908,
0.07451692968606949,
-0.06766664236783981,
-0.18288061022758484,
-0.06017028167843819,
0.14369747042655945,
0.020957956090569496,
0.03303952515125275,
0.008792801760137081,
0.008658668957650661,
0.022717759013175964,
-0.10181639343500137,
0.026135938242077827,
0.07116243988275528,
0.044901035726070404,
0.04446887969970703,
-0.039582718163728714,
0.0233265683054924,
-0.047116246074438095,
-0.02476665936410427,
0.09458927810192108,
0.23158499598503113,
-0.09283509850502014,
0.04764702916145325,
0.04948478937149048,
-0.07759765535593033,
-0.1609218269586563,
0.034743063151836395,
0.12699025869369507,
0.025198504328727722,
0.04761412739753723,
-0.162607342004776,
0.10195820033550262,
0.08697705715894699,
-0.0315016433596611,
0.013966756872832775,
-0.2635170817375183,
-0.14773868024349213,
0.061998818069696426,
0.07031701505184174,
-0.0334380604326725,
-0.14576196670532227,
-0.0852266177535057,
-0.02843598462641239,
-0.069337859749794,
0.05513318255543709,
-0.014740423299372196,
0.08734381943941116,
0.0015008981572464108,
0.02909621223807335,
0.039919037371873856,
-0.03101562149822712,
0.15285558998584747,
0.04873694106936455,
0.04228983819484711,
-0.05450952798128128,
0.060790158808231354,
0.10580848902463913,
-0.08731888234615326,
0.07344172894954681,
-0.02976674772799015,
0.10770164430141449,
-0.12948653101921082,
-0.02608853578567505,
-0.046291399747133255,
0.08203785121440887,
-0.06480424851179123,
-0.04489613324403763,
-0.055266574025154114,
0.05402936041355133,
0.05460652709007263,
-0.02383204735815525,
0.05888306722044945,
0.026983482763171196,
0.06206028163433075,
0.11833634227514267,
0.0875660702586174,
0.04246607422828674,
-0.1600538194179535,
-0.009197971783578396,
-0.02209276705980301,
0.02671908214688301,
-0.1552698314189911,
0.02437390387058258,
0.10981026291847229,
0.04837746173143387,
0.10347684472799301,
0.025979753583669662,
-0.07358887791633606,
-0.005533472169190645,
0.04043544828891754,
-0.08427964895963669,
-0.20284172892570496,
-0.05373857915401459,
-0.05462908372282982,
-0.14870896935462952,
0.018931977450847626,
0.10372503846883774,
-0.048081401735544205,
-0.005246168002486229,
-0.014393717050552368,
0.021423345431685448,
0.010890334844589233,
0.17152215540409088,
0.03610524162650108,
0.0808241069316864,
-0.06482481211423874,
0.1394946426153183,
0.09678680449724197,
-0.05502532795071602,
0.05881548672914505,
0.03997239097952843,
-0.09417186677455902,
-0.015969883650541306,
0.053586188703775406,
0.08957237005233765,
-0.004747471772134304,
-0.028207387775182724,
-0.06050267443060875,
-0.07335817068815231,
0.03789392113685608,
0.01277666911482811,
0.027764014899730682,
-0.003764648223295808,
-0.004336201585829258,
0.006891762372106314,
-0.10619186609983444,
0.1070498451590538,
0.08109135180711746,
0.08372010290622711,
-0.17735081911087036,
0.043354351073503494,
0.012078877538442612,
0.046999599784612656,
-0.021879468113183975,
0.004025527276098728,
-0.0892702043056488,
-0.04653475806117058,
-0.055663224309682846,
0.006773099768906832,
-0.02909153513610363,
-0.0034318529069423676,
-0.01491644885390997,
-0.0505211278796196,
-0.032387591898441315,
0.05267339199781418,
-0.04401426389813423,
-0.0937218889594078,
0.018661951646208763,
0.08446469902992249,
-0.11151521652936935,
-0.00088317139307037,
0.04485108330845833,
-0.11694782972335815,
0.12553830444812775,
0.03162972256541252,
0.04277939349412918,
0.0036390419118106365,
-0.0754433199763298,
0.04057377204298973,
0.024014046415686607,
0.03225928172469139,
0.022298604249954224,
-0.11090192943811417,
0.007733557838946581,
-0.04330897331237793,
0.027265775948762894,
-0.002103877253830433,
0.013905146159231663,
-0.12631674110889435,
-0.05158951133489609,
-0.07690855115652084,
-0.01710335724055767,
-0.0641905888915062,
0.05322372913360596,
0.06527596712112427,
0.004224597010761499,
0.1239134669303894,
-0.07069795578718185,
0.025659624487161636,
-0.23807546496391296,
-0.021259866654872894,
0.004468049854040146,
-0.018034430220723152,
-0.052245792001485825,
-0.0325443372130394,
0.08396007865667343,
-0.04424820840358734,
0.09805852919816971,
-0.019961457699537277,
0.11111254245042801,
0.029062630608677864,
-0.011046864092350006,
0.028391791507601738,
0.018594665452837944,
0.15866681933403015,
0.06333889812231064,
0.004084850195795298,
0.08389648795127869,
-0.013030276633799076,
0.053680695593357086,
-0.030849842354655266,
0.09868073463439941,
0.10977354645729065,
-0.10455707460641861,
0.06417638063430786,
0.057507168501615524,
-0.12061132490634918,
-0.1721477508544922,
0.1433221697807312,
-0.05260837450623512,
0.1291036456823349,
-0.050015661865472794,
0.08276787400245667,
0.10851109027862549,
-0.16120128333568573,
0.036599017679691315,
-0.04863414913415909,
-0.11339342594146729,
-0.10325706750154495,
-0.13409127295017242,
-0.08628431707620621,
-0.1405348926782608,
0.026948675513267517,
-0.1069922223687172,
0.0020294119603931904,
0.06546266376972198,
0.004156600218266249,
-0.008660267107188702,
0.14380976557731628,
-0.015537061728537083,
-0.021487684920430183,
0.07840847223997116,
0.013925121165812016,
-0.017097104340791702,
-0.053377509117126465,
-0.05589595437049866,
0.03470032662153244,
0.04523349180817604,
0.07623914629220963,
-0.022089650854468346,
0.017089372500777245,
0.05452543869614601,
0.005724822171032429,
-0.08078263700008392,
0.0032678940333426,
0.0110108507797122,
-0.020981311798095703,
-0.003723004600033164,
0.03744376823306084,
-0.020914461463689804,
-0.0473925843834877,
0.26335522532463074,
-0.06536982208490372,
-0.026418671011924744,
-0.12387602031230927,
0.12739473581314087,
0.005143731366842985,
-0.004549904260784388,
0.058485958725214005,
-0.12734512984752655,
0.0015729169826954603,
0.15229946374893188,
0.10524461418390274,
-0.004038294777274132,
-0.028259214013814926,
0.003976642619818449,
-0.025879422202706337,
-0.04593423753976822,
0.11694486439228058,
0.08390562236309052,
0.007612553425133228,
-0.031002942472696304,
0.02591804787516594,
0.02795243076980114,
-0.04625049978494644,
-0.08681076020002365,
0.10879097878932953,
-0.006239255890250206,
0.03300140053033829,
-0.02318398654460907,
0.09530524164438248,
-0.015547750517725945,
-0.18169233202934265,
0.039219893515110016,
-0.13942091166973114,
-0.17921437323093414,
-0.03116633929312229,
0.04373005032539368,
-0.0189447533339262,
0.050507210195064545,
0.006689794827252626,
0.007358981296420097,
0.13887090981006622,
-0.011752977035939693,
-0.07804512232542038,
-0.05271473526954651,
0.042466551065444946,
-0.07349533587694168,
0.24402621388435364,
-0.0020957973320037127,
0.04325471445918083,
0.12456134706735611,
-0.004073563497513533,
-0.16860157251358032,
0.01196415163576603,
0.06942223757505417,
-0.012182588689029217,
0.08879595994949341,
0.15039943158626556,
-0.010870290920138359,
0.08855745941400528,
0.059990689158439636,
-0.07280626893043518,
-0.033349379897117615,
-0.03504509478807449,
0.03108918108046055,
-0.12030965089797974,
0.0246543250977993,
-0.07064661383628845,
0.16367699205875397,
0.14436525106430054,
-0.06745003163814545,
-0.019916221499443054,
-0.06835772097110748,
0.0392681248486042,
0.03324446454644203,
0.12641309201717377,
0.02026730589568615,
-0.18572776019573212,
0.019286448135972023,
-0.003237730823457241,
0.04541395604610443,
-0.22433629631996155,
-0.08903418481349945,
0.03316640108823776,
-0.04607969895005226,
-0.05257672443985939,
0.1433458775281906,
0.07256147265434265,
0.011958546936511993,
-0.03722769021987915,
-0.10464265197515488,
-0.049506448209285736,
0.1439552903175354,
-0.17140288650989532,
-0.048661969602108
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank32
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8605
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.9481 | 1.0 | 179 | 8.8949 |
| 7.5843 | 2.0 | 358 | 5.4569 |
| 5.6191 | 3.0 | 537 | 4.8605 |
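As a small sanity check on the table above: assuming a single device with no gradient accumulation, 179 optimizer steps per epoch at a batch size of 8 implies a training set of roughly 1,400 examples (the final batch of each epoch may be partial).
```python
# Back-of-the-envelope check on the step counts above.
# Assumption: single device, no gradient accumulation.
steps_per_epoch = 179
per_device_train_batch_size = 8
approx_train_examples = steps_per_epoch * per_device_train_batch_size
print(approx_train_examples)  # 1432 (an upper bound; the last batch may be smaller)
```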
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank32", "results": []}]} | null | alitolga/deberta-v3-base-rank32 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T13:07:12+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank32
======================
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 4.8605
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | null |
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
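Unit 4 of that course covers REINFORCE, the Monte Carlo policy-gradient algorithm this agent was trained with. The card does not document the exact network or hyperparameters behind this checkpoint, so the sketch below only illustrates the update rule, with a placeholder MLP policy and made-up sizes:

```python
# Minimal sketch of the REINFORCE (Monte Carlo policy-gradient) update.
# The architecture and hyperparameters of the actual checkpoint are not
# documented in this card; every value here is an illustrative placeholder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Policy(nn.Module):
    def __init__(self, obs_dim: int, n_actions: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # Probabilities of each discrete action given the observation.
        return F.softmax(self.net(obs), dim=-1)

def reinforce_loss(log_probs, rewards, gamma: float = 0.99):
    # Discounted returns G_t, computed backwards over one finished episode.
    returns, g = [], 0.0
    for r in reversed(rewards):
        g = r + gamma * g
        returns.insert(0, g)
    returns = torch.tensor(returns)
    # Standardizing returns is a common variance-reduction trick.
    returns = (returns - returns.mean()) / (returns.std() + 1e-8)
    # Maximize sum_t log pi(a_t|s_t) * G_t, i.e. minimize its negative.
    return -(torch.stack(log_probs) * returns).sum()
```

In a full training loop one would sample an episode with `torch.distributions.Categorical`, keep each action's `log_prob` together with the per-step reward, and backpropagate `reinforce_loss` once per episode.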
| {"tags": ["Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class"], "model-index": [{"name": "Reinforce-Pixelcopter-PLE-v0", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Pixelcopter-PLE-v0", "type": "Pixelcopter-PLE-v0"}, "metrics": [{"type": "mean_reward", "value": "32.30 +/- 17.93", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | arekpaterak/Reinforce-Pixelcopter-PLE-v0 | [
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | 2024-02-12T13:07:51+00:00 | [] | [] | TAGS
#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us
|
# Reinforce Agent playing Pixelcopter-PLE-v0
 This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0.
 To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: URL
| [
"# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
"TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n",
"# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
41,
58
] | [
"passage: TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL"
] | [
0.0073175891302526,
-0.2259262204170227,
-0.0017347558168694377,
0.05054566636681557,
0.0658537745475769,
-0.055378563702106476,
0.1412602812051773,
0.05916554853320122,
-0.04990595206618309,
0.059261854737997055,
0.14166708290576935,
0.03996060788631439,
0.022112762555480003,
0.1513713151216507,
0.09764605015516281,
-0.2469022423028946,
0.07438477873802185,
0.01641594059765339,
0.008152224123477936,
0.09583204984664917,
0.060265738517045975,
-0.1405058205127716,
0.037032704800367355,
-0.01332044042646885,
-0.13650871813297272,
0.0010478810872882605,
-0.021802188828587532,
-0.03625129908323288,
0.15681709349155426,
0.006844013463705778,
0.09602472931146622,
-0.001560068572871387,
0.06475798785686493,
-0.12438877671957016,
0.05466329678893089,
0.06455880403518677,
-0.06293967366218567,
0.058029334992170334,
-0.057374246418476105,
0.11959903687238693,
0.04641333222389221,
-0.01578129455447197,
0.054811324924230576,
0.010941818356513977,
-0.14131468534469604,
-0.006710252724587917,
0.007013716734945774,
0.15098218619823456,
0.1339312642812729,
0.01409265398979187,
-0.0014771400019526482,
0.1363491266965866,
-0.16774429380893707,
0.045684073120355606,
0.061802688986063004,
-0.2633039951324463,
-0.04168876260519028,
0.12259352207183838,
0.08951573073863983,
0.06848238408565521,
-0.060910262167453766,
0.07636868953704834,
0.049813780933618546,
0.013985024765133858,
0.023094501346349716,
-0.042509064078330994,
-0.040479615330696106,
0.02289252169430256,
-0.0921095609664917,
-0.05999262258410454,
0.11517233401536942,
-0.006806366611272097,
0.03735918551683426,
-0.12476086616516113,
-0.015330453403294086,
-0.07314357161521912,
-0.05917041376233101,
-0.082573801279068,
0.07563583552837372,
0.030191516503691673,
-0.048283837735652924,
-0.08895846456289291,
-0.056533291935920715,
-0.11489585787057877,
-0.023082571104168892,
-0.07226225733757019,
0.005096882116049528,
-0.03157244250178337,
-0.035645097494125366,
0.09446526318788528,
-0.0021088174544274807,
-0.015028090216219425,
-0.03452150896191597,
-0.05930153280496597,
-0.04213470220565796,
-0.02359505370259285,
-0.03510070592164993,
-0.059062156826257706,
0.054655663669109344,
0.0680202916264534,
0.04938843473792076,
0.09133565425872803,
-0.0467856265604496,
0.1667373925447464,
-0.03256719931960106,
0.08078566938638687,
-0.011897698976099491,
0.2012830525636673,
0.11370102316141129,
0.12129533290863037,
0.06716908514499664,
-0.05294690653681755,
-0.16726544499397278,
0.039163749665021896,
0.12641896307468414,
0.07664673775434494,
-0.032492902129888535,
0.018162984400987625,
-0.12440363317728043,
0.05439428985118866,
-0.14826108515262604,
-0.06745084375143051,
0.024251462891697884,
0.01822635903954506,
-0.060682263225317,
0.03656952083110809,
-0.0028792342636734247,
0.003339326474815607,
0.004654870834201574,
-0.16432709991931915,
-0.05568019300699234,
0.028964387252926826,
-0.15712425112724304,
-0.06656725704669952,
0.06277995556592941,
-0.10113482922315598,
-0.012132617644965649,
-0.16982388496398926,
-0.16305199265480042,
-0.03628521412611008,
0.017857929691672325,
-0.040613796561956406,
-0.056917786598205566,
-0.14010562002658844,
-0.019415250048041344,
-0.045320261269807816,
-0.004312154371291399,
0.044072363525629044,
0.0020940210670232773,
0.04635847359895706,
0.0066573889926075935,
0.09289347380399704,
0.010714372619986534,
-0.0014722738415002823,
-0.04595406726002693,
0.0909833237528801,
-0.30731555819511414,
0.07525643706321716,
-0.08645553886890411,
0.05539081245660782,
-0.057316381484270096,
-0.0926317572593689,
-0.007509906310588121,
0.06277763843536377,
0.060464419424533844,
0.20788121223449707,
-0.2800109386444092,
-0.07025618106126785,
0.13655538856983185,
-0.09533236175775528,
-0.13146020472049713,
0.0513952374458313,
-0.050213608890771866,
0.07593657076358795,
0.027370907366275787,
0.140700101852417,
-0.028026295825839043,
-0.15554022789001465,
0.06281048059463501,
0.04586128890514374,
-0.11356306821107864,
0.019295670092105865,
0.03597676753997803,
0.06723599135875702,
0.05744141340255737,
-0.036986757069826126,
-0.04105675220489502,
0.08096802979707718,
-0.07076814025640488,
-0.037564266473054886,
0.04588831216096878,
-0.0579565204679966,
0.1630958467721939,
0.033971156924963,
0.09856503456830978,
-0.04149768501520157,
-0.07435470074415207,
-0.005698562134057283,
0.038746561855077744,
-0.08962973952293396,
0.025353478267788887,
-0.18320298194885254,
0.2423991560935974,
-0.02621818706393242,
0.027546977624297142,
-0.16845986247062683,
-0.0588528998196125,
0.011087946593761444,
0.21568740904331207,
0.030399197712540627,
0.12989304959774017,
0.07485637813806534,
-0.01250512059777975,
0.014156299643218517,
-0.06183977797627449,
-0.1972363442182541,
-0.03247830644249916,
0.008314179256558418,
-0.058311350643634796,
-0.04934588819742203,
-0.0900716632604599,
0.10427892208099365,
-0.19334633648395538,
-0.005319371819496155,
0.08282599598169327,
0.023504555225372314,
0.03946567326784134,
0.0035407328978180885,
-0.03634254261851311,
0.055148303508758545,
0.02030518464744091,
-0.08980578929185867,
0.14668866991996765,
0.0035520538222044706,
-0.03514726087450981,
-0.03927676007151604,
-0.03267495706677437,
0.05703731253743172,
0.08045367896556854,
-0.18214593827724457,
-0.0733821839094162,
-0.0838410034775734,
-0.02458474040031433,
0.050523869693279266,
0.036679428070783615,
0.02738112211227417,
0.44813573360443115,
0.057562243193387985,
0.09003535658121109,
-0.08811535686254501,
0.039806611835956573,
0.012785476632416248,
-0.031281858682632446,
0.013625281862914562,
0.04725322127342224,
0.11279468983411789,
0.028284218162298203,
0.01669839769601822,
0.03680038824677467,
0.01938779093325138,
0.08824212104082108,
-0.10939645022153854,
-0.003965397831052542,
0.002614045049995184,
0.038018375635147095,
0.03672022372484207,
0.07190682739019394,
0.015936892479658127,
-0.09583546966314316,
-0.030848123133182526,
-0.11166880279779434,
0.015594755299389362,
-0.20979784429073334,
-0.025905707851052284,
-0.029619399458169937,
0.0003502996696624905,
0.09109684824943542,
0.04222718998789787,
-0.04444896802306175,
0.035467714071273804,
0.03947039321064949,
-0.0861397460103035,
0.0594942644238472,
-0.014317752793431282,
-0.07008631527423859,
0.13023322820663452,
-0.1002996563911438,
-0.3153233230113983,
-0.08797995746135712,
0.05698639526963234,
0.05295826122164726,
0.06816939264535904,
-0.05876303091645241,
-0.09240786731243134,
0.03294730558991432,
-0.06836386770009995,
-0.0017794050509110093,
0.0037346978206187487,
-0.051060982048511505,
0.07253886014223099,
0.08541567623615265,
-0.014505518600344658,
-0.08911184966564178,
-0.006620637606829405,
-0.041561197489500046,
-0.124965138733387,
0.044060997664928436,
-0.03760828450322151,
0.00007921225915197283,
0.18620672821998596,
0.03724536672234535,
0.06256633251905441,
-0.06291008740663528,
0.07596296072006226,
-0.09150096774101257,
0.0004740063741337508,
0.18428465723991394,
-0.015377625823020935,
-0.004100616089999676,
-0.03996327146887779,
-0.0259257685393095,
-0.10829219967126846,
0.053985193371772766,
-0.07330703735351562,
-0.07349077612161636,
-0.0023273853585124016,
-0.07770214974880219,
-0.0351552739739418,
0.0012160884216427803,
0.07817990332841873,
0.029699061065912247,
-0.09635239094495773,
0.04920589178800583,
0.1298678070306778,
0.0931883230805397,
0.03626195341348648,
0.023981640115380287,
0.13739009201526642,
-0.11230582743883133,
0.019063033163547516,
-0.05148853361606598,
-0.1041760966181755,
-0.042787205427885056,
-0.0714287981390953,
0.07368279993534088,
0.06034531816840172,
-0.09970010071992874,
0.05144011229276657,
0.041872985661029816,
0.0883496031165123,
0.1373600959777832,
-0.04213863983750343,
-0.11244629323482513,
-0.041393622756004333,
-0.022004956379532814,
-0.1777329444885254,
0.0341336652636528,
0.22155584394931793,
0.0073304991237819195,
-0.10497386753559113,
0.07876885682344437,
-0.005956185050308704,
0.11527370661497116,
0.031222699210047722,
-0.278682678937912,
0.016931315883994102,
0.00203216471709311,
0.042359162122011185,
-0.047676295042037964,
0.10937416553497314,
0.11747439950704575,
-0.14421136677265167,
-0.06650938838720322,
-0.03273930773139,
0.044137366116046906,
-0.15618287026882172,
0.036923591047525406,
-0.12602220475673676,
0.06240779533982277,
0.050940994173288345,
0.05090156942605972,
-0.2197665423154831,
0.06881614029407501,
-0.0274215005338192,
0.06763827055692673,
-0.062248338013887405,
-0.01823522336781025,
0.04473711550235748,
0.025079863145947456,
0.14955177903175354,
-0.014347962103784084,
0.14454017579555511,
-0.09031219780445099,
-0.11753576993942261,
0.0027052261866629124,
0.08532248437404633,
0.013173088431358337,
0.013580933213233948,
0.0026939227245748043,
0.041669201105833054,
-0.02811569906771183,
0.17063532769680023,
-0.08147624880075455,
-0.022407781332731247,
-0.06592555344104767,
-0.018158966675400734,
0.2039334923028946,
-0.12064731866121292,
-0.10121093690395355,
-0.11619500070810318,
0.08663272857666016,
-0.04296411573886871,
0.08175522089004517,
-0.020344657823443413,
0.049704354256391525,
-0.02509051002562046,
0.007178863976150751,
0.09594997018575668,
0.01950966566801071,
0.08983828872442245,
-0.09791163355112076,
-0.019585272297263145,
0.13838915526866913,
-0.037155888974666595,
-0.036971647292375565,
-0.019425252452492714,
0.11054370552301407,
-0.0358734093606472,
0.08033111691474915,
0.03929615020751953,
0.03664831817150116,
0.03428546339273453,
-0.039165496826171875,
0.10309428721666336,
0.10041618347167969,
-0.06291446089744568,
0.03864621743559837,
-0.07954532653093338,
0.26597461104393005,
0.040773067623376846,
0.07301845401525497,
0.28390514850616455,
0.19391325116157532,
-0.03036464750766754,
0.10683353990316391,
-0.017607249319553375,
-0.024403288960456848,
-0.2950931787490845,
0.0006976581644266844,
0.027765681967139244,
0.11812873929738998,
0.01744898222386837,
-0.20587195456027985,
-0.1211688369512558,
-0.03560304269194603,
-0.007791717536747456,
0.0310499370098114,
-0.2441052496433258,
-0.06442268192768097,
0.06107868626713753,
0.13779635727405548,
0.15878525376319885,
-0.05917542055249214,
-0.007856467738747597,
0.029358724132180214,
0.07593556493520737,
0.017292039468884468,
-0.11598441749811172,
0.11550791561603546,
0.025637371465563774,
-0.05708931386470795,
0.0267958827316761,
-0.044003549963235855,
0.04214555397629738,
-0.17736166715621948,
0.10933554917573929,
-0.05924695357680321,
-0.08421005308628082,
0.07140472531318665,
-0.02217724733054638,
-0.048552993685007095,
0.0789642184972763,
0.020652711391448975,
-0.13173207640647888,
0.038154006004333496,
0.005618774797767401,
0.04346654564142227,
-0.004941361024975777,
-0.019811764359474182,
-0.029163256287574768,
0.07706235349178314,
-0.03806605935096741,
0.09605937451124191,
0.19590972363948822,
-0.0573095865547657,
0.03974950686097145,
0.085201695561409,
0.09593135863542557,
-0.05523005872964859,
-0.0809539332985878,
-0.03812742978334427,
-0.005277194548398256,
0.0674438327550888,
-0.08598461747169495,
-0.019085103645920753,
0.07938229292631149,
0.015313901007175446,
0.14910826086997986,
0.14389736950397491,
-0.08835655450820923,
0.11321785300970078,
0.10694554448127747,
-0.11366690695285797,
-0.08583837002515793,
-0.02963297814130783,
0.0009990704711526632,
0.04910186678171158,
-0.048617590218782425,
0.05932905897498131,
-0.1035301461815834,
0.012819357216358185,
0.03532040864229202,
0.0038119733799248934,
-0.09975302964448929,
0.009764863178133965,
0.08645275235176086,
0.06119582802057266,
-0.0567571222782135,
0.09250631928443909,
-0.0019178141374140978,
-0.10868195444345474,
0.07241881638765335,
0.009918469935655594,
-0.021528873592615128,
-0.06352251768112183,
0.03211374953389168,
0.2370220273733139,
0.13945111632347107,
-0.04336636886000633,
-0.12396618723869324,
-0.15508891642093658,
0.037849195301532745,
0.024356422945857048,
0.051251959055662155,
0.0062240250408649445,
-0.06906022876501083,
0.01234503649175167,
-0.04392383247613907,
0.005266309250146151,
-0.05930564925074577,
-0.047703344374895096,
-0.12081446498632431,
0.1154373437166214,
0.053290288895368576,
0.11705748736858368,
-0.0842847004532814,
-0.07057584822177887,
-0.1921386867761612,
0.09190598875284195,
0.041707299649715424,
-0.05532265454530716,
0.06002674251794815,
-0.030134430155158043,
0.017344338819384575,
0.11256659775972366,
-0.051967836916446686,
0.008543911390006542,
-0.09269233793020248,
0.03236149623990059,
0.03133073076605797,
0.04903566092252731,
-0.004612727556377649,
-0.017903391271829605,
0.04399999976158142,
-0.05730267986655235,
0.07619527727365494,
-0.07757602632045746,
-0.033709146082401276,
0.0645759105682373,
-0.16051416099071503,
-0.054324716329574585,
0.08708633482456207,
0.013749903067946434,
0.02590017393231392,
-0.05825240537524223,
0.019142305478453636,
-0.05566488951444626,
-0.04483235627412796,
0.01169554702937603,
-0.05552767962217331,
-0.011517677456140518,
0.05293213203549385,
-0.05287189036607742,
-0.040493328124284744,
-0.06794002652168274,
0.061874233186244965,
-0.07247710227966309,
0.09816460311412811,
0.031187955290079117,
-0.10892423242330551,
0.07648903876543045,
-0.037552736699581146,
-0.0049397205002605915,
-0.009439278393983841,
0.039307788014411926,
0.15598824620246887,
-0.1606634259223938,
0.05345672369003296,
-0.0484454482793808,
0.13272921741008759,
0.046888746321201324,
-0.04458791762590408,
-0.020207170397043228,
0.02469455823302269,
-0.05549024045467377,
0.06932897865772247,
0.15877580642700195,
0.09880131483078003,
0.02571805939078331,
0.008134597912430763,
0.10187267512083054,
0.1060529574751854,
0.08136752992868423,
0.08394161611795425,
-0.03428563475608826,
-0.11287897825241089,
0.14338994026184082,
0.09748584777116776,
0.024613093584775925,
0.21077860891819,
0.17944025993347168,
0.03125298395752907,
0.03018142655491829,
-0.06512103229761124,
0.17325744032859802,
0.061261482536792755,
-0.08229418843984604,
0.014424329623579979,
0.03221147879958153,
-0.049809664487838745,
-0.047004032880067825,
-0.09757380187511444,
-0.029556652531027794,
-0.24085633456707,
0.10851483792066574,
-0.057250600308179855,
-0.09750643372535706,
0.022772664204239845,
0.02990041859447956,
-0.018839845433831215,
0.11280566453933716,
-0.07735858112573624,
0.012980576604604721,
0.18577688932418823,
-0.03825045004487038,
-0.022322099655866623,
-0.1633504331111908,
-0.11154003441333771,
-0.014046176336705685,
-0.11750495433807373,
0.025494296103715897,
0.06305963546037674,
0.01117965579032898,
0.04399528726935387,
0.028923438861966133,
-0.020834028720855713,
0.019218796864151955,
-0.05903913825750351,
-0.042673509567976,
-0.01891910657286644,
0.02202831581234932,
-0.09593231230974197,
-0.03627033904194832,
0.12151803076267242,
-0.03246605768799782,
-0.08207374066114426,
-0.006544890813529491,
0.07848484069108963,
-0.042620159685611725,
0.09450104832649231,
-0.07687012106180191,
-0.03479038178920746,
-0.06794454902410507,
0.268902063369751,
0.09388194978237152,
-0.20183001458644867,
0.03341769427061081,
-0.030470456928014755,
0.026735708117485046,
-0.09215684235095978,
0.16250114142894745,
0.0899243950843811,
0.049168527126312256,
-0.12686687707901,
-0.003401300171390176,
-0.09992645680904388,
-0.0028723697178065777,
-0.12552696466445923,
-0.14725084602832794,
0.12093491852283478,
-0.003848524997010827,
-0.06547791510820389,
0.02844911813735962,
-0.15909899771213531,
0.06585367769002914,
0.0978507474064827,
-0.1514272391796112,
-0.038227714598178864,
-0.06086801365017891,
0.06072385236620903,
0.026465637609362602,
0.13005392253398895,
-0.05080926790833473,
0.012067130766808987,
-0.0656723901629448,
-0.011309894733130932,
-0.0000654291216051206,
-0.017478201538324356,
0.001532604917883873,
-0.09828947484493256,
0.05038110539317131,
-0.0835796371102333,
0.12184429168701172,
0.05709611251950264,
0.005326167680323124,
0.008464806713163853,
0.0648408755660057,
-0.02414623089134693,
-0.10202058404684067,
-0.01877439208328724,
0.033475372940301895,
0.03998998552560806,
0.010373802855610847,
0.034506846219301224,
0.0006507808575406671,
0.07714920490980148,
-0.011413984932005405,
-0.027285432443022728,
-0.058209117501974106,
0.03936338797211647,
-0.10441672056913376,
0.10461361706256866,
0.0013552121818065643,
-0.02240127883851528,
-0.010913821868598461,
-0.05532446503639221,
0.045815300196409225,
0.04572062939405441,
0.029743505641818047,
-0.05261747166514397,
-0.09262793511152267,
-0.021781492978334427,
0.023900283500552177,
-0.11539579927921295,
-0.18497975170612335,
-0.0664035826921463,
-0.15038692951202393,
-0.01633414439857006,
-0.0620744526386261,
0.08902198076248169,
0.13558129966259003,
0.030392181128263474,
-0.04822919890284538,
-0.12171997129917145,
0.025026977062225342,
0.13544774055480957,
-0.03851630911231041,
-0.07532322406768799
] |
null | null | null |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-base-rank64
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8756
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
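As a rough guide, the values above map onto a standard Hugging Face `TrainingArguments` configuration. The sketch below is a hypothetical reconstruction: the output directory is a placeholder, and the dataset, task head, and collator used for this run are not documented in the card.

```python
# Hypothetical Trainer configuration matching the hyperparameters listed above.
# Only the argument values are taken from the card; everything else (output
# directory, dataset, model head) is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-v3-base-rank64",   # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    # Adam betas and epsilon below are the Trainer defaults, matching the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```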
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.9289 | 1.0 | 179 | 8.8618 |
| 7.5578 | 2.0 | 358 | 5.4690 |
| 5.614 | 3.0 | 537 | 4.8756 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-rank64", "results": []}]} | null | alitolga/deberta-v3-base-rank64 | [
"safetensors",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"license:mit",
"region:us"
] | 2024-02-12T13:13:11+00:00 | [] | [] | TAGS
#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
| deberta-v3-base-rank64
======================
This model is a fine-tuned version of microsoft/deberta-v3-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 4.8756
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.1+cu118
* Datasets 2.15.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
37,
98,
4,
33
] | [
"passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0"
] | [
-0.10977909713983536,
-0.02383926510810852,
-0.000272897508693859,
0.0962144061923027,
0.18739436566829681,
0.03188342973589897,
0.1521315574645996,
0.04621896147727966,
-0.11039812862873077,
0.026030896231532097,
0.10344589501619339,
0.12346210330724716,
-0.01214214600622654,
0.12621824443340302,
-0.05480639636516571,
-0.22153477370738983,
0.015339897945523262,
0.017923252657055855,
-0.055215757340192795,
0.11165095120668411,
0.10440938174724579,
-0.16147270798683167,
0.0682869404554367,
-0.013526142574846745,
-0.23760250210762024,
0.03791137784719467,
0.04424617439508438,
-0.06407684087753296,
0.14295685291290283,
-0.009633136913180351,
0.18386055529117584,
0.012823836877942085,
0.13880614936351776,
-0.16603654623031616,
0.01076443400233984,
0.06743723154067993,
0.0072526587173342705,
0.05671124532818794,
0.06414835900068283,
0.008718214929103851,
0.08650407195091248,
-0.10005319863557816,
0.0757589116692543,
0.031208310276269913,
-0.13932667672634125,
-0.27417436242103577,
-0.08789606392383575,
0.0025896879378706217,
0.07555236667394638,
0.08566202968358994,
-0.013975093141198158,
0.17519612610340118,
-0.10044020414352417,
0.08535193651914597,
0.25552237033843994,
-0.25281932950019836,
-0.07021211832761765,
0.071146659553051,
0.023459527641534805,
0.08894380927085876,
-0.10143274813890457,
-0.02900632470846176,
0.0903119370341301,
0.04442868381738663,
0.12188786268234253,
-0.012826957739889622,
-0.11815144866704941,
0.009122032672166824,
-0.14692682027816772,
-0.004426182713359594,
0.0912783071398735,
0.03500741720199585,
-0.054377976804971695,
-0.004236732143908739,
-0.07277829945087433,
-0.1371414214372635,
-0.051367513835430145,
-0.04145136475563049,
0.06730754673480988,
-0.06141535937786102,
-0.08764902502298355,
0.01499799732118845,
-0.11625413596630096,
-0.11828317493200302,
-0.024499638006091118,
0.24395588040351868,
0.052924994379282,
0.028124088421463966,
-0.04764493182301521,
0.1291472613811493,
-0.05694502219557762,
-0.1366644948720932,
0.024524841457605362,
0.019550397992134094,
0.024960318580269814,
-0.05345170944929123,
-0.07289297133684158,
-0.0356169231235981,
0.027074923738837242,
0.1499943733215332,
-0.1311299353837967,
0.0237225741147995,
0.04559052735567093,
0.028581978753209114,
-0.11121973395347595,
0.16195543110370636,
-0.035876959562301636,
0.010333948768675327,
0.028275663033127785,
0.07690852880477905,
0.03842984512448311,
0.0128098726272583,
-0.07406949251890182,
0.005984250921756029,
0.1138528510928154,
0.03108055889606476,
-0.10245327651500702,
0.03943431377410889,
-0.04984300583600998,
0.02609124220907688,
0.022958341985940933,
-0.09585303068161011,
0.02446972019970417,
0.01637401059269905,
-0.0685202106833458,
-0.0464635044336319,
0.017798306420445442,
0.028543973341584206,
0.041776999831199646,
0.1314091831445694,
-0.09103567898273468,
0.033552948385477066,
-0.1253967434167862,
-0.11012474447488785,
-0.012358481995761395,
-0.013520389795303345,
0.038584187626838684,
-0.11949725449085236,
-0.1693306863307953,
-0.004231118597090244,
0.03667663410305977,
-0.015293833799660206,
0.01881149411201477,
-0.03158961609005928,
-0.0996800884604454,
-0.012826534919440746,
-0.026565516367554665,
0.11635850369930267,
-0.0694487988948822,
0.11646972596645355,
0.09728671610355377,
0.05548146739602089,
-0.09908585995435715,
0.02373643033206463,
-0.10591994971036911,
0.0024640432093292475,
-0.2569774389266968,
-0.005124390125274658,
-0.06571763008832932,
0.06924369931221008,
-0.03966303914785385,
-0.1075524166226387,
0.022764498367905617,
0.005322130396962166,
0.10874362289905548,
0.1000693067908287,
-0.17926017940044403,
-0.05148404836654663,
0.12636883556842804,
-0.1271693855524063,
-0.1113729402422905,
0.08568505942821503,
-0.04757939279079437,
0.06583791971206665,
0.07703462988138199,
0.16356876492500305,
-0.00956286396831274,
-0.15184687077999115,
0.010717145167291164,
-0.0590687021613121,
0.04493590071797371,
-0.05465210974216461,
0.0488610714673996,
0.0008339445339515805,
-0.021729113534092903,
0.032641030848026276,
-0.06953171640634537,
0.04686718434095383,
-0.1324886828660965,
-0.07485809177160263,
-0.06488070636987686,
-0.1212497353553772,
0.041503455489873886,
0.06225567311048508,
0.06155192852020264,
-0.1368376612663269,
-0.061608269810676575,
0.1359492540359497,
0.0928584486246109,
-0.05217166990041733,
0.004119411576539278,
-0.05063227564096451,
0.05520080775022507,
-0.041433896869421005,
-0.04931100085377693,
-0.1645110845565796,
-0.08925328403711319,
0.015393294394016266,
-0.0047068120911717415,
0.00327594974078238,
-0.044142138212919235,
0.07455213367938995,
0.11798553168773651,
-0.07449058443307877,
-0.04214201495051384,
-0.10275856405496597,
0.019334375858306885,
-0.1075754463672638,
-0.19162553548812866,
-0.03287539631128311,
-0.00888550654053688,
0.07661817222833633,
-0.20160453021526337,
0.04192623123526573,
-0.029132604598999023,
0.09545008093118668,
0.02637353353202343,
-0.005475367419421673,
-0.06444989144802094,
0.09747123718261719,
0.0045448471792042255,
-0.0580558106303215,
0.02937583066523075,
-0.021635031327605247,
-0.04795526713132858,
-0.09300464391708374,
-0.09899953752756119,
0.19842319190502167,
0.14743494987487793,
-0.10093656182289124,
-0.09002619981765747,
0.0297528188675642,
-0.06982388347387314,
-0.016209444031119347,
-0.06776061654090881,
0.020620150491595268,
0.11483170837163925,
-0.02282099425792694,
0.11661163717508316,
-0.09248635172843933,
-0.022356461733579636,
0.0014529010513797402,
-0.041367556899785995,
0.057399142533540726,
0.06753043085336685,
0.11225567758083344,
-0.053833018988370895,
0.11423299461603165,
0.15965376794338226,
-0.11110177636146545,
0.10334773361682892,
-0.06334836781024933,
-0.07238177955150604,
-0.015347606502473354,
-0.01436277199536562,
-0.007964075542986393,
0.1776101291179657,
-0.04427338019013405,
0.03807692229747772,
-0.009708701632916927,
0.017880810424685478,
0.027409786358475685,
-0.2506949007511139,
-0.052766069769859314,
-0.006493645720183849,
-0.04180464893579483,
-0.046832308173179626,
-0.01916314661502838,
0.020225372165441513,
0.10081995278596878,
-0.04058639332652092,
-0.057044293731451035,
0.0223590936511755,
0.00557175325229764,
-0.07168775796890259,
0.22610197961330414,
-0.07881317287683487,
-0.04608485847711563,
-0.10444154590368271,
0.012919694185256958,
-0.04146885499358177,
0.009725523181259632,
0.0729912519454956,
-0.11279396712779999,
-0.044001005589962006,
-0.06022777780890465,
0.05084332823753357,
0.07876109331846237,
0.039546042680740356,
0.007717863656580448,
0.01782396249473095,
0.09497642517089844,
-0.14543552696704865,
0.0006467378116212785,
-0.07175392657518387,
-0.08312161266803741,
0.056910187005996704,
0.08347267657518387,
0.11983032524585724,
0.13179658353328705,
-0.040337517857551575,
-0.02079136297106743,
-0.02325625531375408,
0.27377429604530334,
-0.06932184845209122,
-0.062200676649808884,
0.14498460292816162,
-0.0005378815694712102,
0.03789127990603447,
0.11220243573188782,
0.08702115714550018,
-0.13622024655342102,
0.029426908120512962,
0.025967992842197418,
-0.021020567044615746,
-0.21206539869308472,
-0.028446493670344353,
-0.0029190380591899157,
-0.07482894510030746,
0.0390218086540699,
0.021479828283190727,
-0.0022142441011965275,
0.06011484935879707,
0.02166934125125408,
0.048596709966659546,
-0.0249907448887825,
0.06053663790225983,
0.0602521076798439,
0.046088092029094696,
0.10996872186660767,
-0.04331322759389877,
-0.06801527738571167,
0.02170231193304062,
-0.04719885438680649,
0.22817444801330566,
0.00001491811781306751,
0.041673850268125534,
0.07939895242452621,
0.17735464870929718,
-0.002462110947817564,
0.10430179536342621,
0.019770093262195587,
-0.06619296967983246,
0.011421018280088902,
-0.07086790353059769,
-0.003634663764387369,
0.02175724133849144,
-0.14258019626140594,
0.08539474755525589,
-0.12784256041049957,
-0.004376190714538097,
0.0801885724067688,
0.22792589664459229,
0.02038872055709362,
-0.33866608142852783,
-0.07218042016029358,
0.0015513667603954673,
-0.007956861518323421,
-0.023518022149801254,
0.011211591772735119,
0.1214594691991806,
-0.04184851422905922,
0.027020303532481194,
-0.03522983193397522,
0.06242155283689499,
0.031152624636888504,
0.04701671749353409,
0.05710339918732643,
0.13777777552604675,
-0.01353135984390974,
0.03180978074669838,
-0.2967328429222107,
0.2733897864818573,
0.019609946757555008,
0.13773353397846222,
-0.023160841315984726,
-0.02387249656021595,
0.020564470440149307,
0.06624981760978699,
0.01824108138680458,
-0.030782584100961685,
-0.05063082277774811,
-0.23543617129325867,
-0.039089519530534744,
0.06226648762822151,
0.12959490716457367,
0.01751827634871006,
0.09988957643508911,
-0.00317860534414649,
0.012835978530347347,
0.10463982820510864,
-0.06836505234241486,
-0.19078455865383148,
-0.010738518089056015,
-0.0613342821598053,
0.02804485149681568,
0.016211025416851044,
-0.11761446297168732,
-0.1024993360042572,
-0.0700412392616272,
0.06540956348180771,
0.0037123053334653378,
-0.03701884299516678,
-0.11041536182165146,
0.07908197492361069,
0.09566164761781693,
-0.048732247203588486,
0.07471208274364471,
0.041078776121139526,
0.056677158921957016,
0.024378404021263123,
-0.03807076811790466,
0.10815038532018661,
-0.06901469081640244,
-0.2009645253419876,
-0.05414474010467529,
0.08314872533082962,
0.05137045681476593,
0.03196307271718979,
-0.01318739727139473,
0.015162421390414238,
0.008538338355720043,
-0.09961853921413422,
0.0038757321890443563,
-0.02996899001300335,
0.05577516183257103,
0.028683772310614586,
-0.051311735063791275,
-0.042351674288511276,
-0.050810445100069046,
-0.0363025926053524,
0.10146798938512802,
0.32721322774887085,
-0.06988053023815155,
-0.04230755195021629,
0.08364763855934143,
-0.03463691845536232,
-0.17070871591567993,
0.09220478683710098,
0.06034506857395172,
-0.0038430248387157917,
0.07597258687019348,
-0.11784222722053528,
0.12991152703762054,
0.14157889783382416,
-0.028757434338331223,
0.1240784227848053,
-0.28924304246902466,
-0.1585167944431305,
0.11064843088388443,
0.21439293026924133,
0.1088043749332428,
-0.1557728499174118,
-0.01764458417892456,
-0.04504897817969322,
-0.13108333945274353,
0.09324704110622406,
-0.15120381116867065,
0.08701140433549881,
-0.01612214185297489,
0.06821642071008682,
-0.0008300155750475824,
-0.06112472340464592,
0.14311541616916656,
0.0007622124394401908,
0.1586318016052246,
-0.03935972973704338,
-0.012870344333350658,
0.09020872414112091,
-0.021951651200652122,
0.0271515604108572,
-0.01850840076804161,
0.04274798557162285,
0.009107017889618874,
-0.023169470950961113,
-0.08148299902677536,
0.049713414162397385,
-0.0533994697034359,
-0.08378700911998749,
-0.024846956133842468,
0.018639277666807175,
-0.024928512051701546,
-0.04630934074521065,
0.09266750514507294,
0.03939923644065857,
0.16225983202457428,
0.07548313587903976,
0.029977168887853622,
-0.0779544860124588,
-0.00511569669470191,
0.03007619082927704,
-0.0344298779964447,
0.07267012447118759,
-0.13134963810443878,
0.012572791427373886,
0.1029602587223053,
0.020421480759978294,
0.10135439038276672,
0.08415261656045914,
-0.04837226867675781,
0.028388533741235733,
0.09198541939258575,
-0.16636347770690918,
-0.0741669088602066,
0.008622251451015472,
-0.039275527000427246,
-0.07221722602844238,
0.10561412572860718,
0.0799483135342598,
-0.09373471885919571,
-0.007967788726091385,
-0.03593408688902855,
-0.012631812132894993,
-0.07303313910961151,
0.2227294147014618,
0.10064034163951874,
0.05390103533864021,
-0.09364035725593567,
0.09540357440710068,
0.037226930260658264,
-0.019714446738362312,
-0.02629864402115345,
0.050324659794569016,
-0.0708092525601387,
-0.0081774378195405,
0.11950493603944778,
0.19487260282039642,
-0.07515739649534225,
-0.06573905795812607,
-0.18015339970588684,
-0.11609358340501785,
0.016829947009682655,
0.19300605356693268,
0.10496100038290024,
0.005628860089927912,
0.018334483727812767,
0.04501958191394806,
-0.13519175350666046,
0.08371682465076447,
0.013451937586069107,
0.09526120126247406,
-0.15351462364196777,
0.1688528209924698,
0.02303115464746952,
-0.0004302372981328517,
-0.03028959222137928,
0.06828100979328156,
-0.1316944658756256,
0.024528106674551964,
-0.1442081779241562,
-0.04870409145951271,
0.009894388727843761,
-0.0011268117232248187,
0.009574041701853275,
-0.07749912887811661,
-0.07624939829111099,
0.047325003892183304,
-0.10701669752597809,
-0.00812611822038889,
0.05367180332541466,
0.03715867921710014,
-0.15122228860855103,
-0.03743722289800644,
0.019229386001825333,
-0.06613703072071075,
0.03823935240507126,
0.0487651601433754,
0.03849530592560768,
0.09630363434553146,
-0.20997434854507446,
-0.0029430179856717587,
0.08880948275327682,
-0.016367431730031967,
0.0696609690785408,
-0.05353253334760666,
-0.02602485939860344,
-0.007858239114284515,
0.0978042483329773,
0.008416155353188515,
0.0867205411195755,
-0.13997472822666168,
-0.006191175431013107,
-0.02968805655837059,
-0.06472725421190262,
-0.04450218006968498,
-0.0349254235625267,
0.09429186582565308,
-0.013142097741365433,
0.18351885676383972,
-0.09520009905099869,
0.011849514208734035,
-0.21236686408519745,
-0.014442931860685349,
-0.028647301718592644,
-0.09178724139928818,
-0.15416806936264038,
-0.026780616492033005,
0.06469432264566422,
-0.04646223038434982,
0.14698436856269836,
-0.0015701946103945374,
0.06641637533903122,
0.0374632254242897,
-0.03034866787493229,
0.026866896077990532,
0.0457080602645874,
0.24175049364566803,
0.031129976734519005,
-0.00613299710676074,
0.03124288097023964,
0.0734565481543541,
0.11419452726840973,
0.01158747635781765,
0.22155217826366425,
0.18127477169036865,
-0.07043489068746567,
0.1019318550825119,
0.0614655576646328,
-0.06218587979674339,
-0.11618132889270782,
0.0043325223959982395,
-0.024654695764183998,
0.025565601885318756,
-0.037003397941589355,
0.18632014095783234,
0.10361658036708832,
-0.16563965380191803,
0.013917487114667892,
-0.0577508844435215,
-0.07344912737607956,
-0.09634224325418472,
0.07478474825620651,
-0.08116906881332397,
-0.19115056097507477,
0.026669880375266075,
-0.11170000582933426,
-0.01101001352071762,
0.1307188719511032,
-0.015901604667305946,
-0.008357066661119461,
0.235883891582489,
0.07531672716140747,
0.062061015516519547,
0.01887267641723156,
0.002806885400786996,
-0.03576918691396713,
-0.07197372615337372,
-0.1012999638915062,
0.005275350995361805,
-0.0329875648021698,
0.015152910724282265,
-0.05596904084086418,
-0.12911638617515564,
0.0355282686650753,
0.015552476979792118,
-0.09444280713796616,
0.024008115753531456,
0.033259183168411255,
0.050202906131744385,
0.02025768719613552,
0.011496256105601788,
0.022110527381300926,
-0.0040362440049648285,
0.2072332501411438,
-0.057010989636182785,
-0.13155171275138855,
-0.06247086822986603,
0.26245516538619995,
0.04406530410051346,
0.021840333938598633,
0.02593136578798294,
-0.12180887162685394,
0.010624225251376629,
0.15366508066654205,
0.14873500168323517,
-0.08162666112184525,
-0.0008891946054063737,
-0.04128885641694069,
-0.02712344564497471,
-0.09549082070589066,
0.12738096714019775,
0.1272391825914383,
0.04202356934547424,
-0.09513653069734573,
-0.020700950175523758,
-0.05248567834496498,
-0.008205144666135311,
-0.04144055396318436,
0.02868380770087242,
0.03159588947892189,
0.009482614696025848,
-0.059001702815294266,
0.07555936276912689,
-0.01746372878551483,
-0.11142945289611816,
0.0962703675031662,
-0.15968038141727448,
-0.14188013970851898,
-0.010838326066732407,
0.1389695703983307,
-0.023466376587748528,
0.06749363988637924,
-0.052638206630945206,
-0.005400174763053656,
0.04293496161699295,
-0.02973678521811962,
-0.051605794578790665,
-0.127696231007576,
0.0767015814781189,
-0.0755990594625473,
0.236514151096344,
-0.0324367955327034,
0.12624210119247437,
0.10700509697198868,
0.03140915557742119,
-0.09085901826620102,
0.12035630643367767,
0.03917442262172699,
-0.1436665952205658,
-0.0003976380976382643,
0.0872146487236023,
-0.052429888397455215,
0.0752221941947937,
0.01663416437804699,
-0.1661776900291443,
0.028236106038093567,
0.006344062741845846,
-0.0737876445055008,
-0.057386551052331924,
-0.06284373253583908,
-0.06381191313266754,
0.09128324687480927,
0.1598876714706421,
-0.03899944946169853,
0.04929110035300255,
-0.0702781155705452,
0.07627421617507935,
0.0825834795832634,
0.04035616293549538,
-0.028585845604538918,
-0.2825299799442291,
0.058108292520046234,
0.14923399686813354,
-0.056315988302230835,
-0.2234940379858017,
-0.07451563328504562,
0.01051599532365799,
-0.0654032900929451,
-0.09339899569749832,
0.06626714020967484,
0.11650731414556503,
0.05525080859661102,
-0.045404739677906036,
-0.1677107810974121,
-0.10386636853218079,
0.1699165403842926,
-0.14799459278583527,
-0.12518051266670227
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLM_RoBERTa-Offensive-Language-Detection-8-langs-new
This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8276
- Micro F1: 0.8721
- Macro F1: 0.8604
- Accuracy: 0.8721
## Model description
More information needed
## Intended uses & limitations
More information needed
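For inference, the checkpoint can presumably be loaded with the standard text-classification pipeline, as in the minimal sketch below. The id-to-label mapping of this fine-tune is not documented here, so the labels printed are whatever the checkpoint's config defines.

```python
# Minimal inference sketch; the example sentence and the exact label names
# are illustrative, not taken from this card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="christinacdl/XLM_RoBERTa-Offensive-Language-Detection-8-langs-new",
)

print(classifier("This is a perfectly polite sentence."))
```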
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
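The `gradient_accumulation_steps: 2` setting is what turns the per-device batch of 16 into the reported effective batch of 32: gradients from two micro-batches are accumulated before each optimizer step. The sketch below only illustrates that mechanism; `model`, `loader`, and the loss interface are placeholders, not the actual training code.

```python
# Illustrative gradient-accumulation loop (placeholders, not the real script).
def train_one_epoch(model, loader, optimizer, accumulation_steps: int = 2):
    model.train()
    optimizer.zero_grad()
    for step, batch in enumerate(loader):
        # Scale the loss so the accumulated gradient matches one large-batch mean.
        loss = model(**batch).loss / accumulation_steps
        loss.backward()  # gradients add up across micro-batches
        if (step + 1) % accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
```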
### Training results
### Framework versions
- Transformers 4.36.1
- Pytorch 2.1.0+cu121
- Datasets 2.13.1
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "xlm-roberta-large", "model-index": [{"name": "XLM_RoBERTa-Offensive-Language-Detection-8-langs-new", "results": []}]} | text-classification | christinacdl/XLM_RoBERTa-Offensive-Language-Detection-8-langs-new | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"base_model:xlm-roberta-large",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:16:29+00:00 | [] | [] | TAGS
#transformers #safetensors #xlm-roberta #text-classification #generated_from_trainer #base_model-xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# XLM_RoBERTa-Offensive-Language-Detection-8-langs-new
This model is a fine-tuned version of xlm-roberta-large on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8276
- Micro F1: 0.8721
- Macro F1: 0.8604
- Accuracy: 0.8721
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- Transformers 4.36.1
- Pytorch 2.1.0+cu121
- Datasets 2.13.1
- Tokenizers 0.15.0
| [
"# XLM_RoBERTa-Offensive-Language-Detection-8-langs-new\n\nThis model is a fine-tuned version of xlm-roberta-large on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8276\n- Micro F1: 0.8721\n- Macro F1: 0.8604\n- Accuracy: 0.8721",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4",
"### Training results",
"### Framework versions\n\n- Transformers 4.36.1\n- Pytorch 2.1.0+cu121\n- Datasets 2.13.1\n- Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #safetensors #xlm-roberta #text-classification #generated_from_trainer #base_model-xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# XLM_RoBERTa-Offensive-Language-Detection-8-langs-new\n\nThis model is a fine-tuned version of xlm-roberta-large on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8276\n- Micro F1: 0.8721\n- Macro F1: 0.8604\n- Accuracy: 0.8721",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4",
"### Training results",
"### Framework versions\n\n- Transformers 4.36.1\n- Pytorch 2.1.0+cu121\n- Datasets 2.13.1\n- Tokenizers 0.15.0"
] | [
66,
94,
6,
12,
8,
3,
113,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #xlm-roberta #text-classification #generated_from_trainer #base_model-xlm-roberta-large #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# XLM_RoBERTa-Offensive-Language-Detection-8-langs-new\n\nThis model is a fine-tuned version of xlm-roberta-large on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.8276\n- Micro F1: 0.8721\n- Macro F1: 0.8604\n- Accuracy: 0.8721## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4### Training results### Framework versions\n\n- Transformers 4.36.1\n- Pytorch 2.1.0+cu121\n- Datasets 2.13.1\n- Tokenizers 0.15.0"
] | [
-0.10667676478624344,
0.15120282769203186,
-0.003704812377691269,
0.08586640655994415,
0.1265532225370407,
0.007819910533726215,
0.08808363229036331,
0.14329609274864197,
-0.06791725009679794,
0.12150802463293076,
0.07959048449993134,
0.05159479007124901,
0.062143031507730484,
0.14814980328083038,
-0.04991646856069565,
-0.17809119820594788,
0.019149912521243095,
-0.04210904613137245,
-0.06417851150035858,
0.08444051444530487,
0.1159922257065773,
-0.10031522810459137,
0.060327351093292236,
0.005912485998123884,
-0.13127553462982178,
0.014361128211021423,
-0.000830472563393414,
-0.062241144478321075,
0.061188243329524994,
0.04458106309175491,
0.07311636954545975,
0.004120738245546818,
0.09042201191186905,
-0.1690278798341751,
-0.008941051550209522,
0.06425654888153076,
0.024916592985391617,
0.07571384310722351,
0.10916635394096375,
0.004476319998502731,
0.053224675357341766,
-0.1554146260023117,
0.07655473053455353,
0.03473736345767975,
-0.08163128793239594,
-0.19025802612304688,
-0.09917979687452316,
0.07201965898275375,
0.10601489245891571,
0.10109315067529678,
-0.017353512346744537,
0.1435893476009369,
-0.08836283534765244,
0.06116745248436928,
0.14773428440093994,
-0.2670535743236542,
-0.05214432626962662,
0.04415532201528549,
0.009280640631914139,
0.06386784464120865,
-0.103324294090271,
-0.004635810386389494,
0.03682580217719078,
0.02065972611308098,
0.08420673757791519,
-0.014604060910642147,
0.006024472415447235,
0.005750230047851801,
-0.10871607810258865,
-0.041879717260599136,
0.14858199656009674,
0.05439183861017227,
-0.06703516840934753,
-0.16420435905456543,
-0.012731602415442467,
-0.07955682277679443,
-0.03762553632259369,
-0.024970462545752525,
0.021080436185002327,
-0.05131486430764198,
-0.051385391503572464,
-0.012554950080811977,
-0.05123131349682808,
-0.03950342908501625,
0.012081475928425789,
0.10556337237358093,
0.024706365540623665,
0.0169371385127306,
-0.006281284615397453,
0.07547833025455475,
-0.036281831562519073,
-0.13298188149929047,
-0.03137439861893654,
-0.003110673278570175,
-0.11399073898792267,
-0.05412061884999275,
-0.048098936676979065,
-0.01502303872257471,
0.005612529348582029,
0.1344609409570694,
-0.01228028628975153,
0.08133342862129211,
0.03745732083916664,
-0.013507019728422165,
-0.03255694732069969,
0.18100111186504364,
-0.05323438346385956,
-0.10385692864656448,
0.003168978728353977,
0.11141064018011093,
0.015763593837618828,
-0.019401686266064644,
-0.075691819190979,
-0.009178796783089638,
0.06852735579013824,
0.059182628989219666,
-0.03392057120800018,
0.016311002895236015,
-0.0493386946618557,
-0.0266795102506876,
0.02290397509932518,
-0.1440686583518982,
0.0632474273443222,
0.01282580941915512,
-0.08807089924812317,
-0.013793088495731354,
-0.001319766859523952,
-0.010232905857264996,
-0.05302254483103752,
0.08872421085834503,
-0.07451903820037842,
-0.019878694787621498,
-0.06476838141679764,
-0.07175654917955399,
-0.0035096376668661833,
-0.02392183430492878,
-0.0033549873623996973,
-0.06914519518613815,
-0.15946802496910095,
-0.05642860010266304,
0.015025109983980656,
-0.09091584384441376,
-0.052724678069353104,
-0.03152432665228844,
-0.06687622517347336,
0.049006734043359756,
0.00818299874663353,
0.11185820400714874,
-0.025263875722885132,
0.05673300847411156,
0.03492700308561325,
0.02684086374938488,
0.08311914652585983,
0.03329841047525406,
-0.07950868457555771,
0.05176205933094025,
-0.1081458255648613,
0.11656676977872849,
-0.0879068672657013,
0.03191620111465454,
-0.12082508206367493,
-0.08541225641965866,
0.015419560484588146,
-0.011872309260070324,
0.07447893917560577,
0.12633901834487915,
-0.12577645480632782,
-0.03941243141889572,
0.1525678038597107,
-0.05608043447136879,
-0.08741672337055206,
0.10433397442102432,
-0.03233487904071808,
-0.01253131777048111,
0.051039062440395355,
0.13280871510505676,
0.15511713922023773,
-0.1035032719373703,
-0.024485180154442787,
0.003700211877003312,
0.07735787332057953,
0.04491288587450981,
0.09182476252317429,
-0.028488000854849815,
0.01077356282621622,
0.008817410096526146,
-0.07092990726232529,
-0.002179184230044484,
-0.052346765995025635,
-0.07975177466869354,
-0.04516254737973213,
-0.07982578128576279,
0.034934353083372116,
0.014329803176224232,
0.04306727647781372,
-0.07689119875431061,
-0.11209730803966522,
0.027473442256450653,
0.12569968402385712,
-0.046496205031871796,
-0.01686468906700611,
-0.09357977658510208,
0.08893809467554092,
-0.05357275903224945,
-0.01339919026941061,
-0.19305478036403656,
-0.08720610290765762,
0.04978320747613907,
-0.10721561312675476,
0.013668451458215714,
-0.014913725666701794,
0.07017747312784195,
0.0676301047205925,
-0.03281637281179428,
-0.03065250627696514,
-0.04464989900588989,
-0.01979062147438526,
-0.09014154225587845,
-0.13253578543663025,
-0.044261399656534195,
-0.02597585879266262,
0.20032137632369995,
-0.24919700622558594,
-0.007146788295358419,
-0.004852120764553547,
0.11199876666069031,
0.0302999597042799,
-0.05663037300109863,
-0.007140599191188812,
0.0188965555280447,
0.0048700980842113495,
-0.10019159317016602,
0.027586841955780983,
0.009706253185868263,
-0.10585689544677734,
-0.02763674221932888,
-0.15244589745998383,
0.06207859888672829,
0.07260917127132416,
0.08708852529525757,
-0.11624673008918762,
-0.03618187457323074,
-0.03585829585790634,
-0.023727700114250183,
-0.07278858125209808,
-0.029670754447579384,
0.18700775504112244,
0.013222177512943745,
0.12629340589046478,
-0.063341423869133,
-0.07532326877117157,
0.010909996926784515,
0.000967568950727582,
-0.02778560109436512,
0.12765076756477356,
-0.007441891357302666,
-0.15128736197948456,
0.0918319970369339,
0.10543065518140793,
-0.04493416100740433,
0.09899961203336716,
-0.05968708172440529,
-0.09524138271808624,
-0.04567441716790199,
0.02308781072497368,
0.030071375891566277,
0.08181764930486679,
-0.023474760353565216,
0.0019793419633060694,
0.05201304703950882,
0.014745695516467094,
0.002775289351120591,
-0.1309053897857666,
-0.0008276619482785463,
0.06386047601699829,
-0.026519902050495148,
0.012935448437929153,
-0.022108502686023712,
0.029829375445842743,
0.09827502816915512,
0.020435402169823647,
-0.017049849033355713,
0.014013468287885189,
-0.04426593706011772,
-0.08745871484279633,
0.17978817224502563,
-0.09613513946533203,
-0.14280171692371368,
-0.14401255548000336,
0.012537388131022453,
-0.060087524354457855,
-0.017033249139785767,
-0.018422789871692657,
-0.06874936819076538,
-0.08397030830383301,
-0.1093447133898735,
-0.04636731371283531,
-0.02576100081205368,
-0.019410690292716026,
0.07267335802316666,
-0.0005485336878336966,
0.10359246283769608,
-0.10600995272397995,
-0.002295720623806119,
0.018184341490268707,
-0.04749416932463646,
-0.0020490912720561028,
0.05239129438996315,
0.0863587036728859,
0.10894152522087097,
-0.0016394881531596184,
0.026923401281237602,
-0.02386404760181904,
0.24585306644439697,
-0.09093578904867172,
-0.020740507170557976,
0.12296772003173828,
0.029494676738977432,
0.05512469261884689,
0.11712762713432312,
0.027445441111922264,
-0.08347942680120468,
0.030804583802819252,
0.04146195203065872,
-0.007186742499470711,
-0.2094271183013916,
-0.05564380809664726,
-0.0415634959936142,
-0.04968150705099106,
0.10603968799114227,
0.040497343987226486,
0.007491822354495525,
0.04141949862241745,
-0.013589196838438511,
0.04813368245959282,
0.011313111521303654,
0.07347644865512848,
0.0838705450296402,
0.07059343159198761,
0.10808763653039932,
-0.03165285289287567,
-0.018429618328809738,
0.06576547026634216,
-0.029277602210640907,
0.21091647446155548,
-0.027171077206730843,
0.16686756908893585,
-0.008020784705877304,
0.12615461647510529,
-0.011368745006620884,
0.040213920176029205,
0.01556744147092104,
-0.020846722647547722,
0.017487462610006332,
-0.07841258496046066,
-0.01813778467476368,
0.026707688346505165,
0.029964899644255638,
0.058121468871831894,
-0.10269574075937271,
0.01725955493748188,
0.030995884910225868,
0.20097573101520538,
0.06957435607910156,
-0.3190469443798065,
-0.06845973432064056,
0.03259873017668724,
-0.01929662749171257,
-0.06661215424537659,
-0.014604015275835991,
0.11929012089967728,
-0.14592207968235016,
0.05983591079711914,
-0.059441521763801575,
0.0844297930598259,
-0.009989268146455288,
-0.005037352908402681,
0.06915921717882156,
0.09101710468530655,
0.0030249704141169786,
0.08259084820747375,
-0.17300109565258026,
0.17639821767807007,
0.02801581658422947,
0.08525986969470978,
-0.06618249416351318,
0.04615430906414986,
0.018734632059931755,
0.02374373748898506,
0.12939222157001495,
0.00032272658427245915,
-0.09754032641649246,
-0.21103475987911224,
-0.09094477444887161,
0.006571207661181688,
0.10910487920045853,
-0.05932622775435448,
0.1018325686454773,
-0.06315885484218597,
-0.017236659303307533,
0.014601455070078373,
-0.011398155242204666,
-0.09005045145750046,
-0.15513892471790314,
0.028804922476410866,
0.023356392979621887,
-0.039347514510154724,
-0.0871305987238884,
-0.07856916636228561,
-0.05046305060386658,
0.1817716360092163,
-0.031025931239128113,
-0.04890264943242073,
-0.16349804401397705,
0.07357225567102432,
0.12331374734640121,
-0.0725393071770668,
0.022610826417803764,
0.008441700600087643,
0.14790090918540955,
0.036764632910490036,
-0.07289408892393112,
0.0468323715031147,
-0.063277468085289,
-0.17068427801132202,
-0.05037204548716545,
0.14532111585140228,
0.027121879160404205,
0.06592922657728195,
0.005279045086354017,
0.027984829619526863,
0.03167610242962837,
-0.09475083649158478,
0.012548393569886684,
0.058263469487428665,
0.09262149035930634,
0.07554122805595398,
-0.024297958239912987,
0.00819928664714098,
-0.05834033712744713,
-0.0038202821742743254,
0.12705709040164948,
0.26093217730522156,
-0.08778496086597443,
0.059341952204704285,
0.01839321292936802,
-0.05098963901400566,
-0.18798847496509552,
0.03023442067205906,
0.11356282979249954,
0.052949629724025726,
0.06633395701646805,
-0.11428111046552658,
0.09793717414140701,
0.09024015069007874,
-0.025971073657274246,
0.03894641250371933,
-0.32784369587898254,
-0.12851369380950928,
0.06778042018413544,
0.08262033760547638,
-0.017453668639063835,
-0.14716528356075287,
-0.06446220725774765,
-0.03840923309326172,
-0.08188357204198837,
0.07760998606681824,
-0.045850951224565506,
0.0924467146396637,
0.002145190257579088,
0.06018399819731712,
0.053361792117357254,
-0.03982997313141823,
0.18010196089744568,
0.0018308312864974141,
0.07321164757013321,
-0.05133388191461563,
0.02965344302356243,
0.043216243386268616,
-0.07316368818283081,
0.05053652822971344,
-0.060696497559547424,
0.07172627002000809,
-0.17124193906784058,
-0.01714984141290188,
-0.059857167303562164,
0.036945924162864685,
-0.054716210812330246,
-0.03419212996959686,
-0.05898084118962288,
0.06559771299362183,
0.07251685112714767,
-0.034048497676849365,
0.0601414255797863,
0.005169968120753765,
0.06560803204774857,
0.13023830950260162,
0.08113621920347214,
0.0491936095058918,
-0.1414482444524765,
0.005429447162896395,
-0.0009186331881210208,
0.04055144265294075,
-0.09492374956607819,
0.046674832701683044,
0.12777620553970337,
0.039154041558504105,
0.14717616140842438,
0.029173310846090317,
-0.06079898402094841,
-0.006447611842304468,
0.03727474808692932,
-0.08782359212636948,
-0.11643802374601364,
-0.028385858982801437,
0.024670276790857315,
-0.15117332339286804,
-0.05941443145275116,
0.10440623760223389,
-0.04039054363965988,
-0.01647862046957016,
-0.017015986144542694,
0.024701545014977455,
-0.012457765638828278,
0.2091919481754303,
0.02721691131591797,
0.07933099567890167,
-0.07220088690519333,
0.11770634353160858,
0.0977780744433403,
-0.0922040268778801,
0.03307749703526497,
0.05928325653076172,
-0.07495160400867462,
-0.01754836179316044,
0.08031821250915527,
0.1235155537724495,
-0.0017566930036991835,
-0.038741420954465866,
-0.05620459467172623,
-0.11976419389247894,
0.05414097383618355,
0.040346238762140274,
0.049119316041469574,
-0.01640119031071663,
-0.022812463343143463,
-0.021163107827305794,
-0.11936153471469879,
0.11611827462911606,
0.0772671177983284,
0.03988615795969963,
-0.14182670414447784,
0.09321609884500504,
-0.024658480659127235,
0.02144927717745304,
-0.013366603292524815,
0.017665114253759384,
-0.10541484504938126,
-0.02683347277343273,
-0.1169808879494667,
0.03954237699508667,
-0.019245648756623268,
-0.006063958164304495,
-0.01594865508377552,
-0.01988786645233631,
-0.02810700610280037,
0.03168044984340668,
-0.06503306329250336,
-0.08275386691093445,
0.007608882151544094,
0.05841240659356117,
-0.11149857938289642,
-0.025055747479200363,
0.0308229923248291,
-0.1071726530790329,
0.0696607157588005,
0.05375693738460541,
0.043816570192575455,
0.0018164117354899645,
-0.0614522323012352,
-0.005495346151292324,
0.010111797600984573,
0.027150042355060577,
0.06226196885108948,
-0.15456974506378174,
0.017935704439878464,
-0.028964636847376823,
-0.005752201192080975,
0.022152861580252647,
0.019560132175683975,
-0.11862847208976746,
-0.016638915985822678,
-0.058559779077768326,
-0.03348946571350098,
-0.07841596007347107,
0.05377357453107834,
0.09148678183555603,
0.025394514203071594,
0.158925861120224,
-0.09139439463615417,
0.051025036722421646,
-0.20828552544116974,
-0.03988773003220558,
-0.011890971101820469,
-0.02505512908101082,
-0.05449138581752777,
-0.018988026306033134,
0.08663472533226013,
-0.04227819666266441,
0.09184049069881439,
-0.014486351981759071,
0.07882755994796753,
0.042359672486782074,
-0.03247771039605141,
0.04561345651745796,
0.03340054675936699,
0.15714871883392334,
0.07222403585910797,
-0.026693128049373627,
0.0872548371553421,
-0.019491540268063545,
0.0628993883728981,
0.07674454897642136,
0.12302666157484055,
0.16843441128730774,
-0.02875038981437683,
0.059537023305892944,
0.023686878383159637,
-0.1258205771446228,
-0.11724585294723511,
0.09393204003572464,
-0.06435166299343109,
0.10418053716421127,
-0.0590338259935379,
0.13918520510196686,
0.09836802631616592,
-0.19775785505771637,
0.06155889108777046,
-0.08393809199333191,
-0.10543042421340942,
-0.10444407910108566,
-0.11676595360040665,
-0.09004522114992142,
-0.07819098979234695,
0.010858885012567043,
-0.11163686960935593,
0.05262748524546623,
0.06683773547410965,
0.013992232270538807,
-0.015444639138877392,
0.13457916676998138,
-0.10867474973201752,
-0.001526524662040174,
0.08604457974433899,
0.01854407601058483,
0.003971994388848543,
-0.006806082557886839,
-0.03168535232543945,
0.03911631181836128,
-0.00858816597610712,
0.0957169234752655,
-0.027582067996263504,
0.013457970693707466,
0.03667990863323212,
-0.010796450078487396,
-0.08565638959407806,
0.0189406368881464,
-0.004676589276641607,
0.027638224884867668,
0.04298214986920357,
0.04768352583050728,
0.014000741764903069,
-0.06339269876480103,
0.26436832547187805,
-0.07792793214321136,
-0.025871848687529564,
-0.1348189115524292,
0.1713951826095581,
0.028961192816495895,
0.01481087226420641,
0.05203702673316002,
-0.11783261597156525,
-0.003694098675623536,
0.16313371062278748,
0.12900197505950928,
-0.07855624705553055,
-0.02153821662068367,
-0.016537081450223923,
-0.017069213092327118,
-0.04595297947525978,
0.10836514830589294,
0.06368032097816467,
0.05407395586371422,
-0.04238654300570488,
0.0441783182322979,
-0.010016989894211292,
-0.034005723893642426,
-0.07333838194608688,
0.120519258081913,
0.006557885091751814,
0.018925270065665245,
-0.020913831889629364,
0.0587628148496151,
0.02081664465367794,
-0.1578831523656845,
0.06640849262475967,
-0.16101978719234467,
-0.1862543672323227,
-0.011082577519118786,
0.009165213443338871,
0.001043130410835147,
0.07443008571863174,
-0.003607975086197257,
-0.007151855155825615,
0.08433933556079865,
-0.015290784649550915,
-0.03344884142279625,
-0.08255903422832489,
0.058054663240909576,
-0.07598593831062317,
0.24338045716285706,
0.018497800454497337,
0.06461691111326218,
0.12068552523851395,
0.00398343987762928,
-0.14495834708213806,
0.03918304294347763,
0.08791530132293701,
-0.06515471637248993,
0.05989295244216919,
0.1870584934949875,
-0.034632161259651184,
0.11048369854688644,
0.053088098764419556,
-0.10158754140138626,
-0.00577888498082757,
-0.06316890567541122,
0.0016610523452982306,
-0.08313824981451035,
-0.013214321807026863,
-0.05904369428753853,
0.16841340065002441,
0.20803087949752808,
-0.051236461848020554,
-0.0002724121150095016,
-0.038460008800029755,
0.03662430867552757,
0.0351768434047699,
0.12735533714294434,
-0.0016906767850741744,
-0.18797282874584198,
0.04247370362281799,
-0.027339616790413857,
0.04155074805021286,
-0.2611156404018402,
-0.08367952704429626,
0.03266400098800659,
-0.053089689463377,
-0.04621908813714981,
0.1314576417207718,
0.05351608991622925,
0.014815782196819782,
-0.04928717389702797,
-0.13489940762519836,
-0.03718526288866997,
0.1445188969373703,
-0.11829037219285965,
-0.05044417083263397
] |
null | null | diffusers | This model is a segmoe merge of two models from CivitAI:
https://civitai.com/models/234898/vixons-fantasy-mix
https://civitai.com/models/43977?modelVersionId=113623
Merged using the great project at: https://github.com/segmind/segmoe
To do something similar, you can either follow the guide in the project readme or this blog post: https://huggingface.co/blog/segmoe
The settings I used:
```yaml
base_model: https://civitai.com/api/download/models/306781
num_experts: 4
moe_layers: all
num_experts_per_tok: 2
type: sd
experts:
  - source_model: https://civitai.com/api/download/models/306781
    positive_prompt: "cinematic, portrait, photograph, instagram, fashion, movie, macro shot, 8K, RAW, fantastic, ultra high quality"
    negative_prompt: " (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime), text, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck"
  - source_model: https://civitai.com/api/download/models/113623
    positive_prompt: "photo realistic scenes, fantastic view, impressive view, movie scene, 8K, RAW, hyperrealistic, ultra realistic"
    negative_prompt: "simple background, duplicate, retro style, low quality, lowest quality, 1980s, 1990s, 2000s, 2005 2006 2007 2008 2009 2010 2011 2012 2013, bad anatomy, bad proportions, extra digits, lowres, username, artist name, error, duplicate, watermark, signature, text, extra digit, fewer digits, worst quality, jpeg artifacts, blurry"
```
# Usage
!pip install -U segmoe diffusers transformers
from segmoe import SegMoEPipeline
pipeline = SegMoEPipeline("eren23/sd15-FantasyMix-filmGrain-segmoe", device="cuda")
prompt = "fantastic land canvas, knight cat standing next to a purple medieval village wall"
negative_prompt = "nsfw, bad quality, worse quality"
img = pipeline(
prompt=prompt,
negative_prompt=negative_prompt,
height=512,
width=512,
num_inference_steps=30,
guidance_scale=7.5,
).images[0]
img.save("image.png") | {"library_name": "diffusers", "tags": ["text-to-image", "stable-diffusion", "segmoe", "merge", "moe", "sd1.5"], "pipeline_tag": "text-to-image"} | text-to-image | eren23/sd15-FantasyMix-filmGrain-segmoe | [
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"segmoe",
"merge",
"moe",
"sd1.5",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-12T13:17:43+00:00 | [] | [] | TAGS
#diffusers #safetensors #text-to-image #stable-diffusion #segmoe #merge #moe #sd1.5 #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
| This model is a segmoe merge of 2 models from civitAI:
URL
URL
Merged using the great project at: URL
To do something similar you can either follow the guide in readme or you can follow this blogpost: URL
The setting I used:
base_model: URL
num_experts: 4
moe_layers: all
num_experts_per_tok: 2
type: sd
experts:
- source_model: URL
positive_prompt: "cinematic, portrait, photograph, instagram, fashion, movie, macro shot, 8K, RAW, fantastic, ultra high quality"
negative_prompt: " (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime), text, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck"
- source_model: URL
positive_prompt: "photo realistic scenes, fantastic view, impressive view, movie scene, 8K, RAW, hyperrealistic, ultra realistic"
negative_prompt: "simple background, duplicate, retro style, low quality, lowest quality, 1980s, 1990s, 2000s, 2005 2006 2007 2008 2009 2010 2011 2012 2013, bad anatomy, bad proportions, extra digits, lowres, username, artist name, error, duplicate, watermark, signature, text, extra digit, fewer digits, worst quality, jpeg artifacts, blurry"
# Useage
!pip install -U segmoe diffusers transformers
from segmoe import SegMoEPipeline
pipeline = SegMoEPipeline("eren23/sd15-FantasyMix-filmGrain-segmoe", device="cuda")
prompt = "fantastic land canvas, knight cat standing next to a purple medieval village wall"
negative_prompt = "nsfw, bad quality, worse quality"
img = pipeline(
prompt=prompt,
negative_prompt=negative_prompt,
height=512,
width=512,
num_inference_steps=30,
guidance_scale=7.5,
).images[0]
URL("URL") | [
"# Useage\n\n!pip install -U segmoe diffusers transformers\n\nfrom segmoe import SegMoEPipeline\n\npipeline = SegMoEPipeline(\"eren23/sd15-FantasyMix-filmGrain-segmoe\", device=\"cuda\")\n\nprompt = \"fantastic land canvas, knight cat standing next to a purple medieval village wall\"\nnegative_prompt = \"nsfw, bad quality, worse quality\"\nimg = pipeline(\n prompt=prompt,\n negative_prompt=negative_prompt,\n height=512,\n width=512,\n num_inference_steps=30,\n guidance_scale=7.5,\n).images[0]\nURL(\"URL\")"
] | [
"TAGS\n#diffusers #safetensors #text-to-image #stable-diffusion #segmoe #merge #moe #sd1.5 #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"# Useage\n\n!pip install -U segmoe diffusers transformers\n\nfrom segmoe import SegMoEPipeline\n\npipeline = SegMoEPipeline(\"eren23/sd15-FantasyMix-filmGrain-segmoe\", device=\"cuda\")\n\nprompt = \"fantastic land canvas, knight cat standing next to a purple medieval village wall\"\nnegative_prompt = \"nsfw, bad quality, worse quality\"\nimg = pipeline(\n prompt=prompt,\n negative_prompt=negative_prompt,\n height=512,\n width=512,\n num_inference_steps=30,\n guidance_scale=7.5,\n).images[0]\nURL(\"URL\")"
] | [
63,
165
] | [
"passage: TAGS\n#diffusers #safetensors #text-to-image #stable-diffusion #segmoe #merge #moe #sd1.5 #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n# Useage\n\n!pip install -U segmoe diffusers transformers\n\nfrom segmoe import SegMoEPipeline\n\npipeline = SegMoEPipeline(\"eren23/sd15-FantasyMix-filmGrain-segmoe\", device=\"cuda\")\n\nprompt = \"fantastic land canvas, knight cat standing next to a purple medieval village wall\"\nnegative_prompt = \"nsfw, bad quality, worse quality\"\nimg = pipeline(\n prompt=prompt,\n negative_prompt=negative_prompt,\n height=512,\n width=512,\n num_inference_steps=30,\n guidance_scale=7.5,\n).images[0]\nURL(\"URL\")"
] | [
-0.06290066987276077,
0.0967249646782875,
-0.005173361860215664,
-0.0018885968020185828,
0.07366861402988434,
-0.028837360441684723,
0.10451464354991913,
0.03808632493019104,
0.029084233567118645,
0.09398965537548065,
0.08687712252140045,
0.06855520606040955,
0.056011952459812164,
0.03336383402347565,
0.013082399033010006,
-0.09317595511674881,
0.09535279870033264,
-0.01705223135650158,
-0.08919178694486618,
0.06976447999477386,
0.046377070248126984,
-0.044412486255168915,
0.08439349383115768,
0.002582554705440998,
-0.07334471493959427,
0.06845977902412415,
0.04774235934019089,
0.005744640715420246,
0.04667237028479576,
-0.0004650716728065163,
0.011096739210188389,
0.05204251781105995,
0.022987231612205505,
-0.14252540469169617,
0.033981699496507645,
0.0634271502494812,
-0.023958098143339157,
-0.01991093158721924,
0.10929407179355621,
-0.025294525548815727,
-0.013627998530864716,
-0.14714056253433228,
0.0023287360090762377,
0.0316769964993,
-0.033258106559515,
-0.03754641115665436,
-0.04418209567666054,
0.07730518281459808,
0.13468313217163086,
0.07922500371932983,
0.04196246340870857,
0.026743613183498383,
-0.07706441730260849,
0.09570153057575226,
0.25786691904067993,
-0.15292003750801086,
-0.03043600171804428,
0.01031962689012289,
0.09320379793643951,
0.11622833460569382,
-0.1135091632604599,
0.0758131816983223,
-0.014840115793049335,
-0.029801368713378906,
-0.04974885657429695,
-0.05917008966207504,
0.11261482536792755,
-0.07656688243150711,
-0.09203655272722244,
-0.036266881972551346,
0.08567499369382858,
0.13725532591342926,
-0.037480901926755905,
-0.0018370953621342778,
-0.10130729526281357,
0.027569711208343506,
-0.03180107846856117,
0.061104997992515564,
-0.029616927728056908,
-0.010400958359241486,
-0.03675529733300209,
-0.045340873301029205,
-0.050235308706760406,
-0.07717811316251755,
-0.02456454187631607,
0.1039283499121666,
0.031300123780965805,
0.07888022810220718,
-0.0361720435321331,
0.05447709187865257,
-0.10542703419923782,
-0.0680295079946518,
0.017685655504465103,
-0.01891552098095417,
0.009534158743917942,
0.1498923897743225,
0.035801149904727936,
-0.06383759528398514,
0.05814531818032265,
0.13150054216384888,
0.04642913490533829,
0.0750848576426506,
0.02838514931499958,
0.08689627796411514,
-0.10698828101158142,
-0.013890848495066166,
-0.05744541436433792,
-0.16047750413417816,
0.04900166392326355,
0.08113566040992737,
0.11390871554613113,
-0.06576047837734222,
0.0041620791889727116,
-0.09719912707805634,
-0.045152999460697174,
-0.004678468685597181,
0.13912884891033173,
0.022332174703478813,
-0.0950050875544548,
0.0464731901884079,
0.16461625695228577,
0.0046945842914283276,
0.06531863659620285,
-0.014120657928287983,
0.01137087494134903,
0.13532982766628265,
0.09105690568685532,
-0.02002405747771263,
-0.017856983467936516,
0.09088613092899323,
-0.044372979551553726,
-0.023885926231741905,
0.009364275261759758,
-0.049989502876996994,
0.05254904553294182,
-0.08891471475362778,
-0.06499215960502625,
-0.0993049144744873,
-0.12076875567436218,
0.029277879744768143,
-0.017523525282740593,
0.035154201090335846,
0.03609316796064377,
0.016232937574386597,
-0.06222628429532051,
0.07028396427631378,
0.03265473246574402,
-0.15758438408374786,
-0.04323621466755867,
0.056316111236810684,
-0.03383960574865341,
0.08423613011837006,
-0.1552499383687973,
-0.0440693162381649,
-0.011300547979772091,
0.07179894298315048,
-0.2263718843460083,
0.029148690402507782,
-0.13920843601226807,
0.023869259282946587,
-0.044874951243400574,
-0.0382922925055027,
-0.07681557536125183,
0.003532623639330268,
0.04162684082984924,
0.19029006361961365,
-0.14701251685619354,
-0.04169510677456856,
0.08429925888776779,
-0.13138434290885925,
-0.02936599776148796,
0.14099836349487305,
0.07887806743383408,
-0.20436786115169525,
0.04836636409163475,
0.15791353583335876,
0.08356373757123947,
-0.21304851770401,
-0.06821785122156143,
0.09888575226068497,
-0.09046841412782669,
-0.08369706571102142,
0.05324328690767288,
-0.060574524104595184,
0.0004934216267429292,
0.014611396938562393,
-0.08768012374639511,
0.04796207323670387,
-0.00973326526582241,
0.015583725646138191,
-0.08397765457630157,
-0.025098752230405807,
0.021265732124447823,
-0.0433959923684597,
-0.07067113369703293,
-0.07670542597770691,
-0.08887948095798492,
-0.026932118460536003,
0.03815431892871857,
-0.028839420527219772,
0.10709220916032791,
-0.02507314272224903,
0.152692973613739,
-0.00455395458266139,
-0.013415822759270668,
-0.040408823639154434,
-0.09592803567647934,
-0.04356341436505318,
-0.03597568720579147,
-0.0112904729321599,
-0.08600825071334839,
0.050071872770786285,
0.032929811626672745,
0.017752191051840782,
-0.09597435593605042,
-0.01098724827170372,
0.029724227264523506,
-0.0523962564766407,
-0.14712224900722504,
0.07164150476455688,
0.027380647137761116,
0.021952249109745026,
-0.10338840633630753,
0.028045399114489555,
0.16879943013191223,
0.1762247383594513,
0.015213783830404282,
-0.10202226042747498,
-0.00955498218536377,
-0.04222138226032257,
-0.02257603220641613,
-0.0296867024153471,
0.02124497853219509,
0.0005743459914810956,
-0.08815202862024307,
0.05890617519617081,
-0.1651861071586609,
0.09258420765399933,
0.1246471181511879,
-0.09108025580644608,
-0.03006071411073208,
0.09840265661478043,
-0.01664811559021473,
-0.024479156360030174,
0.002280305838212371,
-0.1522398591041565,
-0.13965481519699097,
0.025035347789525986,
0.06684035807847977,
-0.03307947888970375,
0.03439215198159218,
0.035665761679410934,
-0.09278012067079544,
-0.04952533543109894,
0.1005852073431015,
0.06465135514736176,
0.08805162459611893,
0.04810294136404991,
0.1925075203180313,
-0.09075745195150375,
0.05428221449255943,
0.02978360280394554,
-0.04005720838904381,
-0.013195710256695747,
0.06573712080717087,
0.06223531812429428,
0.17014528810977936,
-0.06199251860380173,
0.023858502507209778,
0.01502205803990364,
-0.011567329987883568,
-0.024510370567440987,
-0.17267157137393951,
-0.018988240510225296,
0.0015374368522316217,
-0.048508595675230026,
-0.027762191370129585,
0.03763148933649063,
-0.03705234080553055,
0.08524464070796967,
-0.09087114036083221,
-0.06831064075231552,
0.014672301709651947,
-0.01988358050584793,
-0.03234585002064705,
0.0888444036245346,
-0.1268283724784851,
-0.23084253072738647,
-0.18631400167942047,
0.0046396986581385136,
-0.13179105520248413,
-0.011681756936013699,
0.0844089463353157,
-0.09224388003349304,
-0.0974821075797081,
-0.09888388216495514,
-0.042122673243284225,
-0.041227806359529495,
0.05684741958975792,
0.01775318756699562,
-0.01042536087334156,
0.09591087698936462,
-0.06829993426799774,
-0.024555252864956856,
0.004747111815959215,
0.0065012602135539055,
0.015601696446537971,
0.010573526844382286,
0.17299765348434448,
0.07208318263292313,
-0.01621275022625923,
-0.029408471658825874,
0.05836990848183632,
0.21277283132076263,
-0.05015134438872337,
0.09212642163038254,
0.1825307011604309,
-0.011332825757563114,
0.1434360146522522,
0.12535695731639862,
0.07416962832212448,
-0.04159969463944435,
-0.027371186763048172,
0.08969758450984955,
-0.08170681446790695,
-0.09591688215732574,
-0.02215559035539627,
-0.08041604608297348,
0.012891577556729317,
-0.011189884506165981,
0.06806112080812454,
-0.033943865448236465,
0.04015403613448143,
-0.06590218096971512,
-0.009054062888026237,
0.11338255554437637,
0.1374281793832779,
0.14745190739631653,
-0.03368062525987625,
0.03274479880928993,
-0.06402554363012314,
-0.005991149228066206,
0.07939324527978897,
0.02537449821829796,
0.08702754229307175,
0.017875783145427704,
0.059104252606630325,
0.07624633610248566,
0.08977418392896652,
0.056569043546915054,
-0.07026085257530212,
-0.021968627348542213,
-0.027992580085992813,
0.012958470731973648,
-0.06318177282810211,
0.05619969964027405,
0.011759749613702297,
-0.008437679149210453,
0.011273186653852463,
-0.043170735239982605,
0.08254674077033997,
0.08893421292304993,
-0.01593153551220894,
0.0456901490688324,
-0.1720561683177948,
0.06797928363084793,
0.026936450973153114,
0.03991350531578064,
-0.020220957696437836,
0.00578432809561491,
0.09157998859882355,
-0.02400936745107174,
0.11120281368494034,
-0.003048045327886939,
0.0818525031208992,
0.011656682938337326,
0.024531498551368713,
-0.00814917404204607,
0.06381195783615112,
-0.012165396474301815,
-0.02886687032878399,
-0.042780011892318726,
0.0979228988289833,
0.01970714144408703,
0.009488834999501705,
0.03353903442621231,
-0.03295908868312836,
-0.0037471968680620193,
0.15204522013664246,
0.1253834068775177,
0.05904438719153404,
0.0012641178909689188,
-0.13755907118320465,
-0.07709430158138275,
-0.03412962704896927,
0.12137573957443237,
0.041746653616428375,
0.014368993230164051,
0.0005815557087771595,
-0.04412908852100372,
0.023797165602445602,
-0.04992002621293068,
-0.14946480095386505,
-0.1462237387895584,
0.042080871760845184,
0.14232933521270752,
-0.059052154421806335,
-0.11201293766498566,
0.005177588667720556,
-0.06004412844777107,
0.26440978050231934,
-0.23886390030384064,
-0.05599803104996681,
-0.08490686118602753,
-0.091947540640831,
0.04068842530250549,
-0.04008626192808151,
0.07002811133861542,
-0.06739392131567001,
0.06974402070045471,
-0.015741797164082527,
-0.05964748188853264,
0.04118741676211357,
-0.10409241169691086,
-0.12387537211179733,
-0.12241484969854355,
0.15469864010810852,
-0.008194439113140106,
-0.005998707842081785,
-0.019174177199602127,
-0.05111166834831238,
0.025039151310920715,
-0.1312827616930008,
0.021796800196170807,
0.23774990439414978,
-0.06388790160417557,
0.0736355260014534,
0.09769615530967712,
-0.017438651993870735,
-0.017476793378591537,
-0.007193112745881081,
0.025294732302427292,
0.2945566475391388,
-0.046569082885980606,
0.029644761234521866,
0.042968764901161194,
-0.026254041120409966,
-0.2145511507987976,
0.007172566372901201,
0.03400358930230141,
-0.09028124809265137,
0.004583720117807388,
0.11125119775533676,
0.15220783650875092,
0.11679069697856903,
-0.016111623495817184,
0.04809119552373886,
-0.3915107846260071,
-0.08009570091962814,
-0.037942055612802505,
0.11008472740650177,
0.26704728603363037,
-0.16857242584228516,
-0.07908982783555984,
-0.06826446950435638,
-0.19939030706882477,
0.096724733710289,
0.05581305921077728,
0.02598872408270836,
-0.013588670641183853,
-0.0038963952101767063,
0.0054589794017374516,
-0.06455115973949432,
0.20861239731311798,
-0.01429473515599966,
0.05417754501104355,
-0.06717554479837418,
0.06836826354265213,
0.09640467911958694,
-0.09355632215738297,
-0.017842406406998634,
-0.09389016777276993,
0.003031242173165083,
-0.12272899597883224,
0.061336468905210495,
-0.04948091134428978,
0.09408355504274368,
-0.01274188607931137,
0.052871011197566986,
-0.07279561460018158,
-0.0504118986427784,
0.037494633346796036,
0.04277457296848297,
0.1265353113412857,
-0.002113844733685255,
-0.034860726445913315,
0.09549649804830551,
-0.04721926152706146,
-0.03807428106665611,
-0.2780265510082245,
-0.013926466926932335,
-0.05987914651632309,
0.08266996592283249,
-0.011354499496519566,
0.03624940663576126,
0.11730188876390457,
0.04750816896557808,
0.024094022810459137,
0.013320041820406914,
-0.01323485653847456,
0.030774550512433052,
0.1272183507680893,
-0.13189062476158142,
-0.08041612803936005,
0.014818461611866951,
0.1307598203420639,
0.0012229878921061754,
0.020596563816070557,
0.1202726662158966,
0.004063433036208153,
0.06799086928367615,
0.03677714243531227,
0.04277621582150459,
-0.01084878295660019,
0.12289762496948242,
0.040912531316280365,
0.03821071237325668,
-0.069845050573349,
0.09670687466859818,
-0.06158259138464928,
-0.11313322186470032,
-0.02539881505072117,
0.07811357080936432,
-0.10240409523248672,
0.016398949548602104,
0.022172924131155014,
-0.027027687057852745,
-0.06927751749753952,
-0.048520397394895554,
-0.07289440929889679,
-0.07931755483150482,
-0.0018620430491864681,
0.05226437747478485,
-0.004256769549101591,
0.03353797644376755,
0.04856932535767555,
0.017193162813782692,
-0.06761232763528824,
0.010374348610639572,
0.07001154124736786,
0.0910506546497345,
-0.21836718916893005,
-0.021609488874673843,
0.02222820185124874,
-0.020987585186958313,
-0.010712679475545883,
0.05038579925894737,
-0.12454719841480255,
-0.014557899907231331,
-0.10215769708156586,
0.07138937711715698,
-0.06516795605421066,
-0.0009386510937474668,
-0.015731530264019966,
-0.050041697919368744,
-0.032144032418727875,
-0.00037975262966938317,
-0.03300062194466591,
-0.10923494398593903,
0.06093868613243103,
-0.015173792839050293,
-0.13203135132789612,
-0.054542649537324905,
0.0727241188287735,
-0.0749298632144928,
-0.01143348403275013,
0.048553917557001114,
-0.03250834345817566,
-0.04483671858906746,
-0.22173799574375153,
-0.07926806807518005,
0.09035585075616837,
0.023082030937075615,
-0.057219043374061584,
0.17746388912200928,
0.02164769358932972,
0.014025856740772724,
-0.010332883335649967,
-0.023071663454174995,
0.06036937236785889,
-0.10802881419658661,
0.010068218223750591,
-0.12800894677639008,
-0.05323871970176697,
-0.061474673449993134,
0.007747413124889135,
0.17559827864170074,
0.049610577523708344,
0.11850278079509735,
-0.07065954804420471,
0.015036867000162601,
-0.10228808224201202,
0.023474376648664474,
0.048232611268758774,
-0.08331248164176941,
-0.0048369900323450565,
-0.007709880825132132,
0.06235509738326073,
-0.02457662671804428,
0.033463675528764725,
-0.09227197617292404,
0.02686580829322338,
0.03158235549926758,
0.012705630622804165,
0.035791151225566864,
-0.024220168590545654,
0.1249435767531395,
0.005803172010928392,
0.01674276776611805,
-0.012253857217729092,
-0.06624727696180344,
0.13637040555477142,
-0.08385580033063889,
0.0675303190946579,
0.07986608147621155,
-0.07994101941585541,
0.09191685914993286,
0.042994365096092224,
0.011595586314797401,
-0.16988758742809296,
0.039477817714214325,
-0.10369189828634262,
0.11208375543355942,
-0.012819726020097733,
0.07091545313596725,
0.1671641618013382,
-0.0529739148914814,
-0.05104853957891464,
0.11011402308940887,
-0.06256407499313354,
-0.10464280843734741,
-0.17355430126190186,
-0.03002120926976204,
-0.15400762856006622,
0.06488066166639328,
-0.0301143117249012,
0.09493140876293182,
0.01295643113553524,
0.02955889329314232,
0.02282297983765602,
0.07939378917217255,
-0.016421383246779442,
-0.03300861641764641,
0.04684871807694435,
0.018971607089042664,
-0.043208763003349304,
0.13216914236545563,
-0.043837085366249084,
0.11323476582765579,
0.014044887386262417,
0.0024036120157688856,
0.10235787183046341,
0.030637329444289207,
0.03556764870882034,
-0.012955112382769585,
-0.07527076452970505,
-0.020077023655176163,
0.007066518068313599,
0.009251908399164677,
0.14626444876194,
0.10877344012260437,
0.002788473619148135,
-0.04901289939880371,
0.1258563995361328,
-0.07450614869594574,
-0.0072098723612725735,
0.03875181823968887,
0.06787952780723572,
-0.05788902938365936,
0.06304741650819778,
-0.06444741785526276,
-0.16547024250030518,
-0.04013438895344734,
0.15768852829933167,
0.06294313073158264,
-0.09770964086055756,
0.01737324893474579,
-0.10208527743816376,
-0.0045356471091508865,
-0.0008818144560791552,
0.05589819699525833,
0.024758830666542053,
0.2355378419160843,
0.016237415373325348,
-0.03949446231126785,
-0.06140318512916565,
0.00263273436576128,
-0.15155920386314392,
-0.010683830827474594,
0.021608972921967506,
-0.03471943736076355,
-0.11424639821052551,
0.08498671650886536,
-0.06784277409315109,
-0.08699220418930054,
0.025603067129850388,
-0.06141098588705063,
-0.004964892752468586,
0.04004385322332382,
0.1467546820640564,
0.001356837572529912,
-0.017128873616456985,
-0.07603228837251663,
0.010219854302704334,
-0.016674386337399483,
-0.013089808635413647,
-0.12186979502439499,
-0.03229185566306114,
-0.051885612308979034,
-0.0374564491212368,
0.16261909902095795,
0.03914300724864006,
0.035290516912937164,
0.06783080101013184,
-0.01552417129278183,
-0.11748965829610825,
0.05506261810660362,
-0.01123698428273201,
-0.13689091801643372,
-0.0806802287697792,
0.10854268074035645,
0.01664862409234047,
0.05273405835032463,
0.05148004740476608,
0.036009788513183594,
-0.03929677978157997,
0.050589609891176224,
-0.003930543549358845,
-0.08498246967792511,
-0.004078707192093134,
-0.08370984345674515,
0.06602723151445389,
0.03249889612197876,
-0.03003726527094841,
0.004906570538878441,
-0.007850284688174725,
-0.03309651464223862,
0.017335297539830208,
-0.03935046121478081,
0.10532371699810028,
-0.14150986075401306,
0.04414140805602074,
-0.0015520218294113874,
0.05571599304676056,
0.049020685255527496,
-0.12213295698165894,
-0.12675537168979645,
0.042520537972450256,
-0.03224271163344383,
0.07607489079236984,
0.08537396043539047,
-0.023676935583353043,
-0.022848650813102722,
-0.1754269301891327,
0.03007940948009491,
0.14741627871990204,
-0.0008220492745749652,
-0.05449250712990761
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# bertbase-ten
This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 8.1691
- Validation Loss: 8.0864
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 0.001, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
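For reference, the serialized optimizer above corresponds to roughly the following Keras setup. This is a sketch reconstructed from the listed values, not the original training script; the surrounding model and compile code is assumed.
```python
import tensorflow as tf

# Adam optimizer matching the serialized config above
# (weight decay, gradient clipping and EMA were all disabled).
optimizer = tf.keras.optimizers.Adam(
    learning_rate=0.001,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

# training_precision was float32, which is Keras' default policy.
tf.keras.mixed_precision.set_global_policy("float32")
```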
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 8.1691 | 8.0864 | 0 |
### Framework versions
- Transformers 4.38.0.dev0
- TensorFlow 2.15.0
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"tags": ["generated_from_keras_callback"], "model-index": [{"name": "bertbase-ten", "results": []}]} | null | Shruthi-S/bertbase-ten | [
"transformers",
"tf",
"bert",
"pretraining",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:18:41+00:00 | [] | [] | TAGS
#transformers #tf #bert #pretraining #generated_from_keras_callback #endpoints_compatible #region-us
| bertbase-ten
============
This model is a fine-tuned version of [](URL on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 8.1691
* Validation Loss: 8.0864
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'weight\_decay': None, 'clipnorm': None, 'global\_clipnorm': None, 'clipvalue': None, 'use\_ema': False, 'ema\_momentum': 0.99, 'ema\_overwrite\_frequency': None, 'jit\_compile': True, 'is\_legacy\_optimizer': False, 'learning\_rate': 0.001, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.38.0.dev0
* TensorFlow 2.15.0
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': True, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': 0.001, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* TensorFlow 2.15.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tf #bert #pretraining #generated_from_keras_callback #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': True, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': 0.001, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* TensorFlow 2.15.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
36,
194,
4,
36
] | [
"passage: TAGS\n#transformers #tf #bert #pretraining #generated_from_keras_callback #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': True, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': 0.001, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* TensorFlow 2.15.0\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.06809735298156738,
0.02613092213869095,
-0.005282666068524122,
0.06779689341783524,
0.14163434505462646,
0.03730929270386696,
0.10177754610776901,
0.13001807034015656,
-0.11767443269491196,
0.13344913721084595,
0.11692953109741211,
0.1396089792251587,
0.05506879836320877,
0.10145393759012222,
-0.0871005430817604,
-0.1531224399805069,
0.053454913198947906,
-0.01713130995631218,
-0.0681193619966507,
0.07277827709913254,
0.060822535306215286,
-0.07405543327331543,
0.0720757246017456,
-0.024464592337608337,
-0.11686577647924423,
0.016838613897562027,
0.05013076588511467,
-0.051510803401470184,
0.08275202661752701,
0.07844776660203934,
0.05929156765341759,
-0.018662504851818085,
-0.010069324634969234,
-0.1942434012889862,
0.0063712806440889835,
0.11476946622133255,
-0.007399674504995346,
0.064923956990242,
0.0510132759809494,
-0.013765852898359299,
0.1167156919836998,
-0.10589392483234406,
0.04057549685239792,
0.0390779934823513,
-0.14965595304965973,
-0.2348458170890808,
-0.09664575010538101,
0.011485268361866474,
0.07261090725660324,
0.09336814284324646,
-0.02566860057413578,
0.1987701952457428,
-0.05035844072699547,
0.08777876943349838,
0.16045448184013367,
-0.25839895009994507,
-0.06309766322374344,
0.009467730298638344,
0.035299163311719894,
0.043976038694381714,
-0.07536643743515015,
0.012515055947005749,
0.02273171953856945,
0.03270062804222107,
0.010832207277417183,
-0.013894764706492424,
-0.04366500675678253,
-0.04800490662455559,
-0.07863687723875046,
-0.030392814427614212,
0.1396443396806717,
0.058893006294965744,
-0.05788414180278778,
-0.05264623463153839,
-0.05653353035449982,
-0.15856550633907318,
0.012654737569391727,
-0.031669825315475464,
0.01205115020275116,
0.0012425915338099003,
-0.039975494146347046,
0.005213287193328142,
-0.048551350831985474,
-0.047396741807460785,
-0.005172921344637871,
0.12204331159591675,
0.03460109606385231,
0.016908597201108932,
-0.008473895490169525,
0.061979565769433975,
-0.06665903329849243,
-0.13617128133773804,
-0.00914933905005455,
-0.007851134985685349,
-0.07898867875337601,
-0.033839061856269836,
-0.08093176037073135,
-0.05099713057279587,
0.05863887071609497,
0.14772272109985352,
-0.025065911933779716,
0.12530626356601715,
-0.05053752288222313,
0.018209494650363922,
-0.09424812346696854,
0.06353588402271271,
-0.05013745650649071,
-0.013890429399907589,
-0.012598280794918537,
0.05434788763523102,
0.03776516392827034,
-0.02935636043548584,
-0.02202625572681427,
0.016073783859610558,
0.10107733309268951,
0.016002988442778587,
-0.03896775841712952,
0.05669453367590904,
-0.08014168590307236,
-0.019002357497811317,
-0.0461546890437603,
-0.08890917152166367,
0.03930365666747093,
0.0134412981569767,
-0.08596949279308319,
0.031510382890701294,
0.07726076990365982,
-0.009310676716268063,
-0.06265576928853989,
0.05154423043131828,
-0.07485466450452805,
-0.0018717177445068955,
-0.08360269665718079,
-0.11808186769485474,
0.024983739480376244,
-0.1059587225317955,
-0.018181055784225464,
-0.03827957063913345,
-0.14522692561149597,
-0.051432110369205475,
0.07014978677034378,
-0.06596094369888306,
0.0018961307359859347,
-0.07130248099565506,
-0.17173287272453308,
0.045317474752664566,
0.007398082874715328,
0.13427922129631042,
-0.04725826531648636,
0.07130414247512817,
0.017715269699692726,
0.03676536679267883,
-0.0012584896758198738,
0.043418172746896744,
-0.049830980598926544,
0.03704537823796272,
-0.1748633235692978,
0.09467882663011551,
-0.06183871254324913,
0.01641913503408432,
-0.13634368777275085,
-0.07484553009271622,
0.016629096120595932,
0.007708588149398565,
0.12191961705684662,
0.12309157103300095,
-0.18418753147125244,
-0.05236930027604103,
0.12424570322036743,
-0.08556678891181946,
-0.07412968575954437,
0.0875210165977478,
-0.05129850655794144,
0.0028657286893576384,
0.07112559676170349,
0.07260327786207199,
0.010053463280200958,
-0.08276782929897308,
0.009596027433872223,
-0.0965438112616539,
0.016864610835909843,
0.08359944820404053,
0.025645572692155838,
-0.043698668479919434,
-0.07329592853784561,
-0.002420748583972454,
0.0013564183609560132,
-0.00779884634539485,
-0.0844939574599266,
-0.04817221313714981,
-0.04297000542283058,
-0.04220931977033615,
0.024213964119553566,
0.040177229791879654,
0.032308366149663925,
-0.11160299181938171,
-0.1736912876367569,
0.06267755478620529,
0.0374111570417881,
-0.060157354921102524,
0.014225251972675323,
-0.05927938595414162,
0.035385992377996445,
-0.009648019447922707,
-0.007938208058476448,
-0.17533782124519348,
-0.05305938050150871,
0.02870977111160755,
-0.01467655599117279,
0.04721298813819885,
0.0016955864848569036,
0.06829411536455154,
0.07177931070327759,
-0.06717905402183533,
-0.02017120271921158,
-0.02482132986187935,
0.023244455456733704,
-0.069168321788311,
-0.24257688224315643,
-0.0003906830388586968,
-0.006971374154090881,
0.05713040381669998,
-0.2820308208465576,
0.002326508052647114,
0.05451342836022377,
0.14822420477867126,
0.02514948509633541,
-0.043712370097637177,
-0.029245004057884216,
0.057076022028923035,
-0.036952242255210876,
-0.06788349151611328,
0.02716084010899067,
0.0012186126550659537,
-0.13572724163532257,
-0.05435962602496147,
-0.1980952024459839,
0.07533945888280869,
0.12862759828567505,
-0.09921568632125854,
-0.1568288505077362,
0.05719152092933655,
-0.018215803429484367,
-0.04411371052265167,
-0.02592512220144272,
-0.0018831960624083877,
0.16897588968276978,
0.03423381969332695,
0.11571226269006729,
-0.0451851487159729,
-0.029321076348423958,
0.03520108386874199,
-0.010020340792834759,
-0.024735694751143456,
0.12244464457035065,
-0.010467913933098316,
-0.14387284219264984,
0.10056159645318985,
0.06646160036325455,
-0.08916384726762772,
0.12036243826150894,
-0.03853830322623253,
-0.06198951229453087,
-0.09209272265434265,
0.07481063902378082,
0.06446395814418793,
0.08470217883586884,
-0.13318279385566711,
0.02995956689119339,
0.009570485912263393,
0.03636205196380615,
-0.01853777840733528,
-0.1660923808813095,
0.006096505094319582,
0.006630321033298969,
-0.04826245829463005,
0.03261828422546387,
0.03096446581184864,
0.029102817177772522,
0.12140350043773651,
0.005385961849242449,
0.02357405796647072,
0.0470496267080307,
-0.03564309701323509,
-0.08525072038173676,
0.21902135014533997,
-0.1388072818517685,
-0.0896616205573082,
-0.10445749014616013,
0.0021885319147258997,
-0.08044025301933289,
-0.003402846632525325,
0.02681082859635353,
-0.08906073868274689,
-0.05562682822346687,
-0.07137656956911087,
0.010186819359660149,
-0.017086084932088852,
0.027728648856282234,
-0.0073713818565011024,
-0.013238007202744484,
0.15613742172718048,
-0.10643655061721802,
-0.028752749785780907,
-0.010528748854994774,
-0.07707995176315308,
0.029947614297270775,
0.026392599567770958,
0.0187297984957695,
0.0732714906334877,
0.006389969494193792,
0.014352774247527122,
-0.02476901188492775,
0.22425352036952972,
-0.04832296445965767,
-0.01228881161659956,
0.10640005767345428,
-0.018649393692612648,
0.0777270719408989,
0.12323703616857529,
0.046738751232624054,
-0.1123085469007492,
0.029827499762177467,
0.10057266801595688,
-0.005977544002234936,
-0.23253484070301056,
-0.020825602114200592,
-0.042409494519233704,
-0.12619692087173462,
0.042451392859220505,
0.03378714621067047,
0.10770551860332489,
0.027801215648651123,
-0.022279422730207443,
0.06803122907876968,
0.025311866775155067,
0.08388480544090271,
0.13141101598739624,
0.07667689025402069,
0.10487992316484451,
-0.013795815408229828,
-0.00779162859544158,
0.03968023881316185,
-0.05347973853349686,
0.23903417587280273,
0.0029396330937743187,
0.062112655490636826,
0.09519890695810318,
0.05925397947430611,
-0.034695062786340714,
0.0073204655200243,
0.012955576181411743,
0.002567909425124526,
0.006446192506700754,
-0.06511735171079636,
-0.03636718913912773,
0.03423328697681427,
0.00008633283869130537,
0.10931113362312317,
-0.09889980405569077,
-0.014514083042740822,
0.08252990990877151,
0.19072940945625305,
0.09050597995519638,
-0.30908018350601196,
-0.08897261321544647,
0.003870643675327301,
-0.04013817757368088,
-0.0647301897406578,
-0.03799575939774513,
0.10256046801805496,
-0.08454854041337967,
0.11328114569187164,
-0.07287473231554031,
0.05534577742218971,
-0.018313609063625336,
0.04317714273929596,
0.08790889382362366,
0.09669524431228638,
0.0003230520524084568,
0.026578083634376526,
-0.301784485578537,
0.266653448343277,
0.029249265789985657,
0.13893648982048035,
-0.055165618658065796,
0.037186071276664734,
0.04441044479608536,
-0.06626579165458679,
0.08161371946334839,
-0.0163299348205328,
-0.049511637538671494,
-0.20248953998088837,
-0.043304216116666794,
-0.004041382111608982,
0.14853502810001373,
-0.008659621700644493,
0.12813396751880646,
-0.02233409881591797,
0.006991267669945955,
0.06234763562679291,
0.0025066027883440256,
-0.18713904917240143,
-0.05829179659485817,
0.05396437272429466,
-0.0014596892287954688,
-0.011206312105059624,
-0.05267956107854843,
-0.08248299360275269,
-0.05091243237257004,
0.17046819627285004,
-0.1430077850818634,
-0.03315890207886696,
-0.1261758655309677,
0.09054503589868546,
0.14057455956935883,
-0.06620192527770996,
0.02710796520113945,
-0.001752579351887107,
0.05874694883823395,
0.06475216150283813,
-0.0673450231552124,
0.13597701489925385,
-0.01903659477829933,
-0.22108055651187897,
-0.07637389004230499,
0.11137109994888306,
0.08207619190216064,
0.03368128463625908,
-0.03310539200901985,
0.09033989161252975,
0.03176039457321167,
-0.11058301478624344,
0.06790271401405334,
0.023200515657663345,
0.07509176433086395,
0.08737526834011078,
-0.03928663954138756,
0.005609854590147734,
-0.0463208332657814,
-0.005818287841975689,
0.06831635534763336,
0.3412279486656189,
-0.059398103505373,
0.00006901949382154271,
-0.011506992392241955,
-0.09676441550254822,
-0.1619727909564972,
0.07801991701126099,
0.10974648594856262,
0.015119168907403946,
-0.028307219967246056,
-0.1616426706314087,
0.07155204564332962,
0.11261092871427536,
0.01990359090268612,
0.08437571674585342,
-0.2924724221229553,
-0.15011364221572876,
0.0589103065431118,
0.1304532289505005,
0.094160296022892,
-0.1899954080581665,
-0.051671549677848816,
-0.02697863057255745,
-0.007248654030263424,
0.16523058712482452,
-0.09547603875398636,
0.1251264363527298,
0.01492593064904213,
-0.009778707288205624,
0.016635660082101822,
-0.03249715268611908,
0.1555454283952713,
-0.0021953736431896687,
0.09826963394880295,
-0.04832752048969269,
-0.02813381887972355,
0.1153516173362732,
-0.07231470942497253,
0.003450218355283141,
-0.04099694639444351,
0.007532928138971329,
-0.12785886228084564,
0.006922615692019463,
-0.09113047271966934,
0.05401839688420296,
-0.06470056623220444,
-0.010763848200440407,
-0.00037396393599919975,
0.046709779649972916,
0.09616561979055405,
-0.020888829603791237,
0.08580857515335083,
-0.02754928171634674,
0.1844858080148697,
0.14422526955604553,
0.07436813414096832,
0.026882706210017204,
-0.06494460254907608,
0.07651346921920776,
-0.004215974360704422,
0.06071624904870987,
-0.09074397385120392,
0.04997602105140686,
0.13881398737430573,
-0.0013384362682700157,
0.15051646530628204,
0.06923288851976395,
-0.06801443547010422,
0.028646325692534447,
0.04570315405726433,
-0.12593567371368408,
-0.057636164128780365,
0.015086377039551735,
-0.012692851014435291,
-0.06541815400123596,
0.0041327448561787605,
0.14550255239009857,
-0.041728854179382324,
0.023645922541618347,
0.016619276255369186,
0.040133990347385406,
-0.06266267597675323,
0.16436767578125,
-0.010464096441864967,
0.05423494428396225,
-0.08616925776004791,
0.1292148232460022,
0.046103790402412415,
-0.1263130158185959,
0.10699046403169632,
0.05578835308551788,
-0.05589454993605614,
-0.012567068450152874,
0.05706838518381119,
0.17119139432907104,
0.0209068413823843,
-0.03765777498483658,
-0.09532926976680756,
-0.14660939574241638,
0.08289067447185516,
0.21449896693229675,
0.044122785329818726,
0.05279102921485901,
-0.03645819053053856,
0.004012299235910177,
-0.08710981160402298,
0.08176427334547043,
0.11846444010734558,
0.05190826579928398,
-0.12942101061344147,
0.16608670353889465,
0.011471785604953766,
-0.04154203459620476,
-0.007619866635650396,
0.007736183237284422,
-0.18760119378566742,
-0.015521490946412086,
-0.13953351974487305,
0.020342882722616196,
0.010534217581152916,
-0.020242085680365562,
0.03022407367825508,
-0.05719078704714775,
-0.05526845157146454,
0.019268233329057693,
-0.0778525099158287,
-0.0561707429587841,
0.045819010585546494,
0.05381712689995766,
-0.12704028189182281,
-0.07119065523147583,
0.028408164158463478,
-0.1113075390458107,
0.02978779561817646,
0.0654662549495697,
0.00039889515028335154,
0.00983095820993185,
-0.09608940780162811,
0.004744683858007193,
0.04138943925499916,
-0.005872066598385572,
0.042006008327007294,
-0.1828560084104538,
0.028284966945648193,
-0.04811941087245941,
0.058064430952072144,
0.016794640570878983,
0.11699908971786499,
-0.10157755017280579,
-0.07865286618471146,
-0.024958185851573944,
-0.037015561014413834,
-0.04839009791612625,
0.045204322785139084,
0.15055902302265167,
-0.009623074904084206,
0.18000617623329163,
-0.1174396201968193,
0.04482528567314148,
-0.19892533123493195,
-0.006153213791549206,
0.020011253654956818,
-0.09018490463495255,
-0.06668153405189514,
-0.0038522393442690372,
0.11229538917541504,
-0.08314970880746841,
0.07215151935815811,
-0.06510311365127563,
0.10223552584648132,
0.038455355912446976,
-0.080424003303051,
-0.10455721616744995,
0.062263838946819305,
0.18654800951480865,
0.04479598626494408,
-0.018132777884602547,
0.027620933949947357,
-0.01786135882139206,
0.0806226134300232,
0.03185355290770531,
0.199728861451149,
0.126745343208313,
-0.02100856602191925,
0.0810471624135971,
0.05960458889603615,
-0.10144788026809692,
-0.09546341747045517,
0.16005481779575348,
-0.04839090630412102,
0.1468992978334427,
-0.03182641416788101,
0.09019598364830017,
0.03679582104086876,
-0.20486673712730408,
0.04343453422188759,
-0.0677540972828865,
-0.10592412203550339,
-0.10039608180522919,
-0.09995229542255402,
-0.08545739203691483,
-0.09682898223400116,
0.0068436055444180965,
-0.11124849319458008,
0.05199982970952988,
0.09029129892587662,
0.04404205456376076,
0.010715820826590061,
0.12570062279701233,
-0.0641036406159401,
0.00022850710956845433,
0.08186938613653183,
0.01420868281275034,
-0.011468770913779736,
-0.01813771016895771,
-0.07146219909191132,
0.052633173763751984,
0.007658346556127071,
0.040929164737463,
0.021721888333559036,
-0.006448919419199228,
0.04376889020204544,
-0.029653700068593025,
-0.10734421014785767,
0.05944221094250679,
0.024900471791625023,
0.010375159792602062,
0.06301306933164597,
0.03467514365911484,
-0.01644444838166237,
-0.034939609467983246,
0.14598241448402405,
-0.10427894443273544,
-0.018779948353767395,
-0.13343723118305206,
0.2520241439342499,
0.009511199779808521,
0.0396222285926342,
0.016934093087911606,
-0.0835951715707779,
-0.020943447947502136,
0.15406249463558197,
0.11599627137184143,
-0.04288718104362488,
-0.024462418630719185,
0.0766802653670311,
-0.02123592048883438,
-0.01384014543145895,
0.08092394471168518,
0.07426860183477402,
-0.035379808396101,
-0.0670129656791687,
0.0026614556554704905,
-0.009917720220983028,
-0.023674920201301575,
-0.04416726157069206,
0.07387909293174744,
0.0200274009257555,
-0.00584979634732008,
-0.020481867715716362,
0.06231613829731941,
-0.058455757796764374,
-0.15888461470603943,
0.11913798749446869,
-0.20162098109722137,
-0.17583076655864716,
-0.027262825518846512,
0.008711756207048893,
0.0010837753070518374,
0.05510209873318672,
-0.010118234902620316,
-0.017534025013446808,
0.11750969290733337,
-0.02317318320274353,
-0.013746233657002449,
-0.12930387258529663,
0.034073662012815475,
-0.053417056798934937,
0.15722855925559998,
-0.013060368597507477,
0.0329337902367115,
0.1349588930606842,
0.014041909016668797,
-0.06985429674386978,
0.05666991323232651,
0.06400573253631592,
-0.1077243834733963,
0.009949454106390476,
0.13586652278900146,
-0.028111858293414116,
0.14337748289108276,
0.05262666568160057,
-0.12146598100662231,
0.02224123664200306,
-0.07446202635765076,
-0.07888361066579819,
-0.023312561213970184,
-0.021777192130684853,
-0.07850176095962524,
0.1434871405363083,
0.24883925914764404,
-0.03705275431275368,
0.021656006574630737,
-0.053515251725912094,
-0.0010821667965501547,
0.05787618085741997,
0.060785938054323196,
-0.026328423991799355,
-0.25184065103530884,
0.08194204419851303,
0.06854419410228729,
0.060119595378637314,
-0.17611904442310333,
-0.103110171854496,
0.03214796259999275,
-0.023691749200224876,
-0.08018733561038971,
0.10245010256767273,
0.027820061892271042,
0.04932469129562378,
-0.06821923702955246,
-0.14630338549613953,
-0.038792457431554794,
0.19590578973293304,
-0.0905086100101471,
-0.07294842600822449
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mhGPT
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2
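
The training script itself is not included in this card, so the snippet below is only a sketch of how the hyperparameters above could be expressed with the `transformers` `Trainer` API. The output directory, dataset objects, and the `trust_remote_code` flag (needed for phi-2 on older Transformers releases such as the 4.31.0 listed below) are assumptions, not part of the original run.

```python
# Hypothetical sketch: the listed hyperparameters mapped onto TrainingArguments.
# train_dataset / eval_dataset are placeholders; the actual dataset is not documented here.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(base, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base, trust_remote_code=True)

args = TrainingArguments(
    output_dir="mhGPT",                  # assumed output directory
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,      # effective train batch size: 2 * 16 = 32
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    seed=42,
    adam_beta1=0.9,                      # Adam betas / epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,
    args=args,
    tokenizer=tokenizer,
    train_dataset=None,                  # placeholder: training data unknown
    eval_dataset=None,                   # placeholder: evaluation data unknown
)
# trainer.train()  # would start fine-tuning once real datasets are supplied
```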
### Training results
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/phi-2", "model-index": [{"name": "mhGPT", "results": []}]} | null | mango278/mhGPT | [
"transformers",
"tensorboard",
"GPTConfig",
"generated_from_trainer",
"base_model:microsoft/phi-2",
"license:mit",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:22:00+00:00 | [] | [] | TAGS
#transformers #tensorboard #GPTConfig #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #endpoints_compatible #region-us
|
# mhGPT
This model is a fine-tuned version of microsoft/phi-2 on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2
### Training results
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3
| [
"# mhGPT\n\nThis model is a fine-tuned version of microsoft/phi-2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 2",
"### Training results",
"### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #tensorboard #GPTConfig #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #endpoints_compatible #region-us \n",
"# mhGPT\n\nThis model is a fine-tuned version of microsoft/phi-2 on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 2",
"### Training results",
"### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
47,
27,
6,
12,
8,
3,
129,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #GPTConfig #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #endpoints_compatible #region-us \n# mhGPT\n\nThis model is a fine-tuned version of microsoft/phi-2 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.05\n- num_epochs: 2### Training results### Framework versions\n\n- Transformers 4.31.0\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
-0.12036493420600891,
0.13107770681381226,
-0.0021770314779132605,
0.09154581278562546,
0.10181756317615509,
0.034498196095228195,
0.08304666727781296,
0.1384894996881485,
-0.036766063421964645,
0.11041513085365295,
0.10465376824140549,
0.05413668602705002,
0.06312083452939987,
0.1429239809513092,
-0.004608279559761286,
-0.25660431385040283,
0.019557084888219833,
-0.03616634011268616,
-0.06087523698806763,
0.0819203183054924,
0.08693792670965195,
-0.08871869742870331,
0.09678864479064941,
0.014922178350389004,
-0.1400834023952484,
-0.03267514705657959,
-0.050106119364500046,
-0.03549918532371521,
0.09166070818901062,
0.0025940900668501854,
0.0495540015399456,
0.019020067527890205,
0.11956749856472015,
-0.19794848561286926,
-0.00024410152400378138,
0.055337145924568176,
0.04497060924768448,
0.10773774236440659,
0.06783273816108704,
0.032489996403455734,
0.13021034002304077,
-0.1300460249185562,
0.08678822219371796,
0.03766920790076256,
-0.07884427160024643,
-0.08758852630853653,
-0.10075995326042175,
0.07132162153720856,
0.06126052141189575,
0.0840093269944191,
0.015143834985792637,
0.16262908279895782,
-0.09617523849010468,
0.046840693801641464,
0.21833346784114838,
-0.2744799852371216,
-0.049312617629766464,
0.08677653223276138,
0.05782858282327652,
0.04006163403391838,
-0.12189339101314545,
-0.003980722278356552,
0.020353958010673523,
0.01575619727373123,
0.07585292309522629,
0.010483847931027412,
0.008299823850393295,
0.0008895745268091559,
-0.11965573579072952,
-0.020593775436282158,
0.08108104020357132,
0.07144318521022797,
-0.03607599809765816,
-0.13648375868797302,
-0.044535521417856216,
-0.14263568818569183,
-0.005066617392003536,
-0.008474272675812244,
0.03603408858180046,
-0.05286726355552673,
-0.06384093314409256,
-0.05313713848590851,
-0.052600741386413574,
-0.08082401007413864,
-0.00628708116710186,
0.13311947882175446,
0.030790580436587334,
0.026755107566714287,
0.01349556352943182,
0.12083762139081955,
-0.01093615498393774,
-0.11634206026792526,
-0.0367916040122509,
-0.015542068518698215,
-0.11549071967601776,
-0.03498877212405205,
-0.038997069001197815,
0.03846437484025955,
0.008662170730531216,
0.130504310131073,
-0.02916673570871353,
0.06746431440114975,
0.08155378699302673,
0.004135033115744591,
-0.009113239124417305,
0.14852803945541382,
-0.07507693022489548,
-0.04753706231713295,
-0.013268311507999897,
0.08895158767700195,
0.032870493829250336,
-0.012784010730683804,
-0.10452058166265488,
-0.04316432401537895,
0.07816793024539948,
0.04750657454133034,
-0.031039442867040634,
0.02915802411735058,
-0.05349572002887726,
-0.04500459507107735,
0.08598306030035019,
-0.09198109060525894,
0.050488270819187164,
0.0008456175564788282,
-0.09223349392414093,
-0.010180149227380753,
0.027398981153964996,
0.012336013838648796,
-0.01683066412806511,
0.07091367244720459,
-0.10582275688648224,
-0.013643322512507439,
-0.06381384283304214,
-0.029588045552372932,
0.030718522146344185,
-0.13522456586360931,
-0.0006432097288779914,
-0.06452738493680954,
-0.1729663461446762,
-0.029254091903567314,
0.036385126411914825,
-0.10231878608465195,
-0.06573114544153214,
-0.02594861388206482,
-0.05811164900660515,
0.031185045838356018,
-0.010572556406259537,
0.1579640805721283,
-0.05449040234088898,
0.05958951264619827,
-0.007745190057903528,
0.05004064366221428,
0.005010426510125399,
0.03367587551474571,
-0.06256978958845139,
0.03437455743551254,
-0.1506333351135254,
0.06518271565437317,
-0.08760613948106766,
0.021029261872172356,
-0.11408681422472,
-0.08446791768074036,
-0.013823055662214756,
-0.03536100685596466,
0.06359479576349258,
0.13726148009300232,
-0.13009105622768402,
-0.018003806471824646,
0.10269933193922043,
-0.06301712244749069,
-0.08157271891832352,
0.07277684658765793,
-0.017196109518408775,
0.00006450778164435178,
0.029081877321004868,
0.14959540963172913,
0.044981956481933594,
-0.1776617169380188,
-0.018044859170913696,
0.04348745942115784,
0.04415817931294441,
-0.020235782489180565,
0.08893740922212601,
-0.023173684254288673,
0.06752147525548935,
0.021058067679405212,
-0.027750909328460693,
0.0011082187993451953,
-0.07894005626440048,
-0.06917645037174225,
-0.060953203588724136,
-0.10021759569644928,
0.014397657476365566,
0.01857377588748932,
0.033281147480010986,
-0.0444810725748539,
-0.12015511095523834,
0.07768838107585907,
0.13565513491630554,
-0.0347568616271019,
0.010430085472762585,
-0.07543657720088959,
0.029614120721817017,
-0.037740979343652725,
-0.04586450755596161,
-0.17026041448116302,
-0.10818030685186386,
0.047421958297491074,
-0.07186321914196014,
0.03854206204414368,
0.03765081241726875,
0.07059924304485321,
0.07266013324260712,
-0.043940331786870956,
-0.012685764580965042,
-0.08826038241386414,
0.006507047917693853,
-0.12151020765304565,
-0.1910346895456314,
-0.0684572234749794,
-0.01999816857278347,
0.18850252032279968,
-0.20425055921077728,
-0.011607354506850243,
-0.0003726576396729797,
0.16237375140190125,
0.033263444900512695,
-0.08241026848554611,
0.03146953135728836,
0.009345635771751404,
-0.0030897539108991623,
-0.10664574801921844,
0.036132778972387314,
0.009713465347886086,
-0.1108742356300354,
-0.040139127522706985,
-0.13745367527008057,
0.017766183242201805,
0.07966659218072891,
0.10280954837799072,
-0.07082854211330414,
-0.06900671869516373,
-0.07167220115661621,
-0.061085499823093414,
-0.06178079545497894,
0.001802934450097382,
0.17749957740306854,
0.03528129681944847,
0.10041332989931107,
-0.06756290048360825,
-0.08268444240093231,
0.021511171013116837,
0.01360669918358326,
-0.016969354823231697,
0.07914134114980698,
0.1098463162779808,
-0.08269395679235458,
0.07590629160404205,
0.06925228238105774,
-0.03933817893266678,
0.11985699087381363,
-0.0333150252699852,
-0.1080164909362793,
-0.030648430809378624,
-0.011773670092225075,
-0.003990567289292812,
0.15681828558444977,
-0.08454250544309616,
0.029069814831018448,
0.047842007130384445,
0.040171150118112564,
0.047362376004457474,
-0.1526370644569397,
-0.009146054275333881,
0.010539975948631763,
-0.039839453995227814,
-0.014850183390080929,
-0.010955803096294403,
0.010926171205937862,
0.07542318850755692,
0.04646953567862511,
-0.01161526795476675,
0.01159153413027525,
-0.010311671532690525,
-0.07315404713153839,
0.1728329062461853,
-0.08840934187173843,
-0.12858611345291138,
-0.11579291522502899,
0.07626011967658997,
-0.06108852103352547,
-0.02401147224009037,
-0.019279586151242256,
-0.051223039627075195,
-0.049573369324207306,
-0.09676461666822433,
0.0031902985647320747,
-0.04185833781957626,
0.02573012001812458,
0.02642856352031231,
0.019438855350017548,
0.0679064467549324,
-0.1142294704914093,
0.004952420946210623,
-0.029804110527038574,
-0.055388085544109344,
0.0012598450994119048,
0.03374059498310089,
0.08500522375106812,
0.11113540083169937,
0.013295923359692097,
0.021930165588855743,
-0.02404322288930416,
0.2170870155096054,
-0.08265767246484756,
0.01241065002977848,
0.11821923404932022,
0.03244218975305557,
0.04241529852151871,
0.09176712483167648,
0.023290203884243965,
-0.0911911278963089,
0.023271217942237854,
0.05349624529480934,
-0.019112421199679375,
-0.21991920471191406,
-0.04969591647386551,
-0.025889988988637924,
-0.048393614590168,
0.09389244019985199,
0.05057515203952789,
-0.017509877681732178,
0.04789814352989197,
-0.00009649959247326478,
0.006163702812045813,
-0.04627794399857521,
0.0724296122789383,
0.05493597686290741,
0.056578416377305984,
0.08616995811462402,
-0.024070505052804947,
0.014723469503223896,
0.07947281748056412,
0.042615775018930435,
0.2219240367412567,
-0.030148938298225403,
0.14335288107395172,
0.00008105046435957775,
0.14016160368919373,
-0.025274721905589104,
0.04523739963769913,
0.029040079563856125,
-0.0197309423238039,
0.0017017773352563381,
-0.060118868947029114,
-0.04402969777584076,
0.04687877371907234,
0.04419487342238426,
0.028476575389504433,
-0.0945184975862503,
0.06596201658248901,
0.00915399007499218,
0.23557999730110168,
0.0676959976553917,
-0.2974846363067627,
-0.0873362123966217,
-0.0019217533990740776,
-0.03388085588812828,
-0.05956718698143959,
0.003694037441164255,
0.1269192397594452,
-0.14070716500282288,
0.05799408257007599,
-0.053115129470825195,
0.07652902603149414,
-0.0799574926495552,
-0.014040140435099602,
0.04460872709751129,
0.1282731145620346,
0.0007826338987797499,
0.07811101526021957,
-0.15874451398849487,
0.20477773249149323,
0.020392723381519318,
0.09358875453472137,
-0.07837098091840744,
0.01719125546514988,
-0.001616347348317504,
0.055300723761320114,
0.10052333772182465,
0.0016745468601584435,
-0.020920859649777412,
-0.14256000518798828,
-0.15000300109386444,
0.02624381333589554,
0.11129909753799438,
-0.042980462312698364,
0.07326630502939224,
-0.03292825445532799,
0.004174103494733572,
0.034510642290115356,
-0.05662219971418381,
-0.12976379692554474,
-0.12065543234348297,
0.03862810134887695,
-0.02204587310552597,
-0.03456055745482445,
-0.06997022032737732,
-0.10476464778184891,
-0.0402766689658165,
0.16842228174209595,
-0.012052066624164581,
-0.04535510018467903,
-0.16594338417053223,
0.0755360871553421,
0.1519545316696167,
-0.05943048372864723,
0.02354019694030285,
-0.0053047724068164825,
0.13025285303592682,
0.03504553064703941,
-0.06723114103078842,
0.09336773306131363,
-0.060779016464948654,
-0.20489735901355743,
-0.05244668573141098,
0.12652897834777832,
0.06168857589364052,
0.06613808870315552,
-0.009919863194227219,
0.022675538435578346,
-0.011747755110263824,
-0.09828634560108185,
0.056389179080724716,
0.08766878396272659,
0.06670980900526047,
0.036631010472774506,
-0.026059620082378387,
0.12176568061113358,
0.010275009088218212,
-0.023883860558271408,
0.11575423181056976,
0.2402704358100891,
-0.0967511534690857,
0.09989235550165176,
0.04525848478078842,
-0.04094905033707619,
-0.16398067772388458,
0.04716641083359718,
0.11324474215507507,
0.0353192575275898,
0.03828051686286926,
-0.17731593549251556,
0.11322956532239914,
0.13248732686042786,
-0.03286082297563553,
0.05811648070812225,
-0.29400718212127686,
-0.12875372171401978,
0.05954224616289139,
0.08720454573631287,
0.011769362725317478,
-0.12297040224075317,
-0.04612527787685394,
-0.025037499144673347,
-0.1457190215587616,
0.10404422879219055,
-0.06493130326271057,
0.12744086980819702,
-0.02786611206829548,
0.10737808048725128,
0.025360476225614548,
-0.0477302260696888,
0.16035638749599457,
0.03876673802733421,
0.05184053257107735,
-0.041050612926483154,
0.02440181002020836,
0.02127576805651188,
-0.07310602813959122,
0.06400071084499359,
-0.051498159766197205,
0.06732257455587387,
-0.15290729701519012,
-0.022245856001973152,
-0.07092632353305817,
0.06621958315372467,
-0.04725632816553116,
-0.07018712908029556,
-0.037350092083215714,
0.05784246698021889,
0.039710164070129395,
-0.02893674001097679,
0.07186174392700195,
0.0022707583848387003,
0.038687657564878464,
0.12556372582912445,
0.09668600559234619,
-0.036251459270715714,
-0.11423424631357193,
-0.008279490284621716,
-0.009299688041210175,
0.04038040712475777,
-0.09950782358646393,
-0.001276440336368978,
0.12498631328344345,
0.044399067759513855,
0.09481964260339737,
0.024179356172680855,
-0.0746234804391861,
-0.007218152284622192,
0.03555912896990776,
-0.10575953871011734,
-0.13232088088989258,
0.006454308517277241,
0.0016588690923526883,
-0.1377408653497696,
0.03336723521351814,
0.10207393020391464,
-0.04273359850049019,
-0.009637774899601936,
-0.008036660961806774,
0.027650250121951103,
-0.012858780100941658,
0.16926273703575134,
0.05974828824400902,
0.06601634621620178,
-0.08660585433244705,
0.1479509174823761,
0.04051543399691582,
-0.08720112591981888,
0.040412481874227524,
0.06553862243890762,
-0.10367115586996078,
-0.005215180106461048,
0.06079299375414848,
0.09126316756010056,
-0.06501656770706177,
-0.0336640402674675,
-0.07926330715417862,
-0.08265785127878189,
0.05735142529010773,
0.06153547391295433,
0.05412384122610092,
0.0073898350819945335,
-0.025269895792007446,
0.02621571719646454,
-0.11879345774650574,
0.0647270604968071,
0.017853951081633568,
0.06905847042798996,
-0.1600904017686844,
0.07364195585250854,
-0.014175589196383953,
0.03985876590013504,
-0.023372290655970573,
0.02741025574505329,
-0.0962466225028038,
-0.04215337708592415,
-0.09832126647233963,
-0.0038148744497448206,
-0.04872142896056175,
0.0030973635148257017,
-0.007411284372210503,
-0.07226262241601944,
-0.02855871431529522,
0.03063776157796383,
-0.0743073970079422,
-0.06090822070837021,
-0.011416067369282246,
0.04220491275191307,
-0.10443513840436935,
0.004165972117334604,
0.026763856410980225,
-0.09152545779943466,
0.11205314099788666,
0.05749908462166786,
0.04834357649087906,
0.010666495189070702,
-0.05682898312807083,
0.016764448955655098,
0.01726958528161049,
0.019055450335144997,
0.07921241968870163,
-0.09409172832965851,
-0.029751429334282875,
-0.04509685933589935,
0.04282096028327942,
0.013410740531980991,
0.07150857150554657,
-0.13555198907852173,
-0.037056535482406616,
-0.05707405507564545,
-0.040020279586315155,
-0.06903041154146194,
0.038425687700510025,
0.09094632416963577,
0.024327319115400314,
0.13180425763130188,
-0.053637199103832245,
0.03428897634148598,
-0.19079725444316864,
-0.013973883353173733,
0.00097996077965945,
0.008054347708821297,
-0.03991284966468811,
-0.007663857191801071,
0.09866161644458771,
-0.02651239186525345,
0.14341291785240173,
-0.02767804078757763,
0.05271719768643379,
0.034430213272571564,
-0.01831938698887825,
0.014413794502615929,
0.0033173146657645702,
0.16696882247924805,
0.07346700131893158,
0.010668531991541386,
0.12689559161663055,
-0.008162046782672405,
0.051726121455430984,
0.05198052525520325,
0.22008877992630005,
0.1360907256603241,
-0.0183101873844862,
0.10291952639818192,
0.08655137568712234,
-0.12147800624370575,
-0.15886424481868744,
0.1280103325843811,
-0.04586538299918175,
0.1019640862941742,
-0.06618611514568329,
0.15388119220733643,
0.07096567004919052,
-0.17978806793689728,
0.014395304955542088,
-0.04433952271938324,
-0.10885392129421234,
-0.10427694022655487,
-0.06259305775165558,
-0.07149624824523926,
-0.13442890346050262,
0.020074328407645226,
-0.09737767279148102,
0.02174621820449829,
0.0919634997844696,
0.02294197306036949,
0.014622103422880173,
0.13451841473579407,
-0.0442134365439415,
0.018041497096419334,
0.04438536614179611,
0.03617584705352783,
-0.008343753404915333,
-0.06507387012243271,
-0.06952326744794846,
0.026556346565485,
0.01771491952240467,
0.08891811966896057,
-0.05316288396716118,
-0.007521184626966715,
0.04009641706943512,
0.033039435744285583,
-0.07388639450073242,
0.01278491597622633,
0.005979873705655336,
0.01883576437830925,
0.055361658334732056,
0.03585166484117508,
0.01670738123357296,
-0.061967600136995316,
0.25361719727516174,
-0.08287028223276138,
-0.04832269251346588,
-0.12859278917312622,
0.19139236211776733,
-0.008971828036010265,
-0.009943819604814053,
0.05355765298008919,
-0.11723989248275757,
-0.039363663643598557,
0.16115395724773407,
0.14662371575832367,
-0.09261509776115417,
-0.03025028482079506,
0.009865268133580685,
-0.016186753287911415,
-0.048117078840732574,
0.09853760153055191,
0.07705598324537277,
0.04105382785201073,
-0.06771299242973328,
-0.02479761280119419,
-0.0003103234339505434,
-0.05790318176150322,
-0.0677497610449791,
0.047005560249090195,
0.010104021988809109,
0.013492780737578869,
-0.050025295466184616,
0.05930308252573013,
-0.024718331173062325,
-0.18129868805408478,
0.04360588267445564,
-0.15363925695419312,
-0.18312062323093414,
-0.025025712326169014,
0.04208673536777496,
-0.02082115225493908,
0.07724230736494064,
-0.007549884729087353,
0.00009129919635597616,
0.11406087130308151,
-0.008792855776846409,
-0.09294284135103226,
-0.07612200081348419,
0.08156580477952957,
-0.012382136657834053,
0.21346963942050934,
-0.011662876233458519,
0.07180435955524445,
0.10796292126178741,
0.0327257364988327,
-0.1717706322669983,
0.02033539116382599,
0.07808630913496017,
-0.10577665269374847,
0.031049443408846855,
0.1509345918893814,
-0.02828940562903881,
0.06721711903810501,
0.03281346336007118,
-0.11219705641269684,
-0.029324624687433243,
-0.05552768334746361,
-0.008115623146295547,
-0.0668177604675293,
-0.005009843967854977,
-0.05271478369832039,
0.18021440505981445,
0.18098300695419312,
-0.04538905248045921,
-0.025576721876859665,
-0.05800421163439751,
0.03629200533032417,
0.03624428063631058,
0.04451942443847656,
-0.022566217929124832,
-0.2032075971364975,
0.028467275202274323,
0.008589088916778564,
0.015365616418421268,
-0.18359681963920593,
-0.08366449922323227,
0.03152351826429367,
-0.04842829704284668,
-0.049350302666425705,
0.10970248281955719,
0.018441732972860336,
0.021371779963374138,
-0.03648073971271515,
-0.08537331223487854,
-0.03685654699802399,
0.12524551153182983,
-0.17818577587604523,
-0.06707359105348587
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b5_1
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3428
- Mean Iou: 0.4792
- Mean Accuracy: 0.5000
- Overall Accuracy: 0.9583
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0001
- Accuracy Undropoff: 0.9999
- Iou Unlabeled: nan
- Iou Dropoff: 0.0001
- Iou Undropoff: 0.9583
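
As a reading aid for the numbers above, the snippet below shows how per-class accuracy, per-class IoU, their means, and overall accuracy relate to a two-class confusion matrix. The pixel counts are invented purely to mimic the pattern of this evaluation (almost no pixels predicted as dropoff, so a near-zero dropoff IoU can coexist with a high overall accuracy); this is not the actual evaluation code.

```python
# Illustrative only: metric definitions behind the numbers above, on made-up counts.
import numpy as np

# rows = ground-truth class, cols = predicted class; class order: [dropoff, undropoff]
conf = np.array([
    [10, 99_990],        # true dropoff pixels, almost all missed
    [200, 2_300_000],    # true undropoff pixels, almost all correct
], dtype=float)

tp = np.diag(conf)
fp = conf.sum(axis=0) - tp                   # predicted as the class but wrong
fn = conf.sum(axis=1) - tp                   # belonging to the class but missed

per_class_accuracy = tp / conf.sum(axis=1)   # recall per class
per_class_iou = tp / (tp + fp + fn)

print("accuracy (dropoff, undropoff):", per_class_accuracy)
print("IoU      (dropoff, undropoff):", per_class_iou)
print("mean accuracy:", per_class_accuracy.mean())
print("mean IoU:", per_class_iou.mean())
print("overall accuracy:", tp.sum() / conf.sum())
```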
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.8047 | 5.0 | 10 | 0.9867 | 0.2744 | 0.6315 | 0.7475 | nan | 0.5049 | 0.7581 | 0.0 | 0.0812 | 0.7422 |
| 0.7528 | 10.0 | 20 | 0.8526 | 0.3461 | 0.5957 | 0.9213 | nan | 0.2406 | 0.9508 | 0.0 | 0.1178 | 0.9205 |
| 0.7087 | 15.0 | 30 | 0.7023 | 0.3450 | 0.5533 | 0.9467 | nan | 0.1243 | 0.9824 | 0.0 | 0.0887 | 0.9464 |
| 0.6601 | 20.0 | 40 | 0.6251 | 0.3381 | 0.5390 | 0.9462 | nan | 0.0948 | 0.9832 | 0.0 | 0.0684 | 0.9460 |
| 0.6274 | 25.0 | 50 | 0.5828 | 0.3286 | 0.5178 | 0.9486 | nan | 0.0479 | 0.9876 | 0.0 | 0.0374 | 0.9485 |
| 0.5929 | 30.0 | 60 | 0.5478 | 0.3257 | 0.5122 | 0.9488 | nan | 0.0359 | 0.9884 | 0.0 | 0.0284 | 0.9487 |
| 0.5672 | 35.0 | 70 | 0.5237 | 0.3240 | 0.5088 | 0.9494 | nan | 0.0283 | 0.9893 | 0.0 | 0.0227 | 0.9493 |
| 0.5454 | 40.0 | 80 | 0.4966 | 0.4856 | 0.5072 | 0.9529 | nan | 0.0212 | 0.9933 | nan | 0.0183 | 0.9528 |
| 0.5261 | 45.0 | 90 | 0.4700 | 0.3234 | 0.5062 | 0.9553 | nan | 0.0163 | 0.9960 | 0.0 | 0.0149 | 0.9552 |
| 0.5012 | 50.0 | 100 | 0.4576 | 0.4832 | 0.5041 | 0.9563 | nan | 0.0107 | 0.9974 | nan | 0.0101 | 0.9563 |
| 0.4875 | 55.0 | 110 | 0.4430 | 0.4811 | 0.5018 | 0.9566 | nan | 0.0058 | 0.9978 | nan | 0.0056 | 0.9565 |
| 0.4622 | 60.0 | 120 | 0.4328 | 0.4800 | 0.5007 | 0.9570 | nan | 0.0031 | 0.9983 | nan | 0.0030 | 0.9570 |
| 0.4394 | 65.0 | 130 | 0.4179 | 0.4796 | 0.5004 | 0.9572 | nan | 0.0021 | 0.9986 | nan | 0.0021 | 0.9572 |
| 0.4352 | 70.0 | 140 | 0.4048 | 0.4795 | 0.5002 | 0.9573 | nan | 0.0016 | 0.9988 | nan | 0.0016 | 0.9573 |
| 0.426 | 75.0 | 150 | 0.3881 | 0.4796 | 0.5003 | 0.9577 | nan | 0.0015 | 0.9992 | nan | 0.0014 | 0.9577 |
| 0.4175 | 80.0 | 160 | 0.3794 | 0.4797 | 0.5004 | 0.9579 | nan | 0.0014 | 0.9994 | nan | 0.0014 | 0.9579 |
| 0.4087 | 85.0 | 170 | 0.3742 | 0.3196 | 0.5002 | 0.9577 | nan | 0.0012 | 0.9992 | 0.0 | 0.0012 | 0.9577 |
| 0.3887 | 90.0 | 180 | 0.3645 | 0.4792 | 0.4999 | 0.9581 | nan | 0.0003 | 0.9996 | nan | 0.0003 | 0.9581 |
| 0.3799 | 95.0 | 190 | 0.3540 | 0.4791 | 0.4999 | 0.9581 | nan | 0.0001 | 0.9997 | nan | 0.0001 | 0.9581 |
| 0.376 | 100.0 | 200 | 0.3511 | 0.4792 | 0.4999 | 0.9582 | nan | 0.0001 | 0.9998 | nan | 0.0001 | 0.9582 |
| 0.3677 | 105.0 | 210 | 0.3452 | 0.4792 | 0.4999 | 0.9582 | nan | 0.0001 | 0.9998 | nan | 0.0001 | 0.9582 |
| 0.358 | 110.0 | 220 | 0.3437 | 0.4792 | 0.4999 | 0.9582 | nan | 0.0001 | 0.9998 | nan | 0.0001 | 0.9582 |
| 0.3997 | 115.0 | 230 | 0.3434 | 0.4792 | 0.5000 | 0.9583 | nan | 0.0001 | 0.9999 | nan | 0.0001 | 0.9583 |
| 0.3769 | 120.0 | 240 | 0.3428 | 0.4792 | 0.5000 | 0.9583 | nan | 0.0001 | 0.9999 | nan | 0.0001 | 0.9583 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b5_1", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b5_1 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:23:03+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b5\_1
====================================
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3428
* Mean Iou: 0.4792
* Mean Accuracy: 0.5000
* Overall Accuracy: 0.9583
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.0001
* Accuracy Undropoff: 0.9999
* Iou Unlabeled: nan
* Iou Dropoff: 0.0001
* Iou Undropoff: 0.9583
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10557883977890015,
0.034187328070402145,
-0.0019901443738490343,
0.11674907058477402,
0.17250721156597137,
0.028266573324799538,
0.11677761375904083,
0.11515800654888153,
-0.11076497286558151,
0.030467601493000984,
0.10493253916501999,
0.14289754629135132,
0.016209183260798454,
0.09414276480674744,
-0.019785547628998756,
-0.3066781461238861,
-0.024805068969726562,
0.03122584894299507,
-0.08515045046806335,
0.12522883713245392,
0.06489083170890808,
-0.16202843189239502,
0.09213489294052124,
-0.004245896823704243,
-0.22006916999816895,
0.01643534190952778,
-0.005308887455612421,
-0.03076137974858284,
0.15966826677322388,
0.023163942620158195,
0.11566684395074844,
0.01014185231178999,
0.11430707573890686,
-0.20255975425243378,
0.017674153670668602,
0.05575472116470337,
-0.004255280364304781,
0.06708282232284546,
0.06548759341239929,
0.001845747814513743,
0.15122108161449432,
-0.10577747225761414,
0.06791786849498749,
0.0019613862968981266,
-0.1448063999414444,
-0.20931032299995422,
-0.07496614009141922,
0.021809814497828484,
0.07784435153007507,
0.0962395891547203,
-0.004843216389417648,
0.11356692016124725,
-0.09220285713672638,
0.11295092105865479,
0.27044105529785156,
-0.24249306321144104,
-0.08665169030427933,
0.04047643393278122,
0.0022885515354573727,
0.06739672273397446,
-0.13200946152210236,
0.008813029155135155,
0.03283306211233139,
0.04779564589262009,
0.11669105291366577,
-0.03323691338300705,
-0.09926760196685791,
0.027138322591781616,
-0.13909652829170227,
-0.03300708904862404,
0.05520184710621834,
0.05266767740249634,
-0.020856624469161034,
-0.03250373527407646,
-0.06735199689865112,
-0.18192920088768005,
-0.06580018997192383,
0.012254275381565094,
0.06649476289749146,
-0.060215212404727936,
-0.11591386795043945,
-0.0156877338886261,
-0.11015670746564865,
-0.08556342869997025,
-0.04943095147609711,
0.1278548687696457,
0.033900871872901917,
0.0197321567684412,
-0.03398292139172554,
0.12723195552825928,
-0.02609758824110031,
-0.13898511230945587,
0.016865676268935204,
0.02944379858672619,
-0.042024627327919006,
-0.032124146819114685,
-0.049195557832717896,
-0.06429355591535568,
-0.012849350459873676,
0.10805224627256393,
-0.06067505106329918,
0.06797967106103897,
0.03601311147212982,
0.050818443298339844,
-0.11417543143033981,
0.1898730993270874,
-0.06714387983083725,
-0.006918976083397865,
-0.03749854490160942,
0.0586886927485466,
0.0042533003725111485,
-0.022498751059174538,
-0.10583323240280151,
0.005006760358810425,
0.07019483298063278,
-0.007628555875271559,
-0.08782503753900528,
0.07032912969589233,
-0.039655063301324844,
-0.012201575562357903,
0.0005086447345092893,
-0.07575441151857376,
0.04547577723860741,
-0.0010075063910335302,
-0.08349371701478958,
-0.028324881568551064,
0.05134165659546852,
0.014651176519691944,
0.013948056846857071,
0.1647414267063141,
-0.0863439217209816,
0.06349951028823853,
-0.11277085542678833,
-0.09952069073915482,
0.00021089577057864517,
-0.08592642843723297,
0.03570837900042534,
-0.07842851430177689,
-0.149947851896286,
-0.008920291438698769,
0.07201188057661057,
-0.04115975648164749,
0.0032315736170858145,
-0.05288741737604141,
-0.09080400317907333,
0.0028974462766200304,
-0.008626701310276985,
0.16304175555706024,
-0.06510613113641739,
0.12235667556524277,
0.038192491978406906,
0.07196608185768127,
-0.06543108075857162,
0.03920016437768936,
-0.08516132831573486,
0.019887734204530716,
-0.2218235284090042,
0.04296228289604187,
-0.05097716301679611,
0.06706228107213974,
-0.060027673840522766,
-0.12265059351921082,
0.008072027936577797,
0.0022015005815774202,
0.09168221056461334,
0.10648906975984573,
-0.22373521327972412,
-0.07567963749170303,
0.14744757115840912,
-0.07318980246782303,
-0.09910444170236588,
0.11305659264326096,
-0.06330794095993042,
0.01182395126670599,
0.06026928499341011,
0.19730639457702637,
0.05301862582564354,
-0.13805104792118073,
0.022822992876172066,
-0.015711341053247452,
0.04939395934343338,
-0.02755020558834076,
0.05096901208162308,
0.022230805829167366,
0.08758749067783356,
0.019314642995595932,
-0.06681390106678009,
0.06740259379148483,
-0.1243194192647934,
-0.09696054458618164,
-0.025357024744153023,
-0.08639120310544968,
0.042028557509183884,
0.08948050439357758,
0.06082841008901596,
-0.10515575110912323,
-0.07833276689052582,
0.09117019176483154,
0.07570064067840576,
-0.06889970600605011,
0.0399651937186718,
-0.06547331064939499,
0.04378560557961464,
-0.017811039462685585,
-0.036942265927791595,
-0.17467278242111206,
-0.026289593428373337,
-0.02218959666788578,
0.035954736173152924,
0.030067069455981255,
0.023233452811837196,
0.09187982231378555,
0.08886323124170303,
-0.07184918224811554,
-0.024934813380241394,
-0.06577154994010925,
0.002350378315895796,
-0.12084527313709259,
-0.22771242260932922,
-0.04396141320466995,
-0.008301522582769394,
0.0881694033741951,
-0.21088562905788422,
0.024129388853907585,
0.02320845052599907,
0.08749295026063919,
0.025224748998880386,
-0.03125136345624924,
-0.05304056033492088,
0.0771867111325264,
-0.010920940898358822,
-0.0659063532948494,
0.07027813047170639,
-0.005228137131780386,
-0.06762945652008057,
-0.054891929030418396,
-0.11368777602910995,
0.16318227350711823,
0.13312464952468872,
-0.1468105912208557,
-0.09270289540290833,
-0.012721171602606773,
-0.06419620662927628,
-0.03275556117296219,
-0.043460339307785034,
0.03881530091166496,
0.1797332614660263,
0.00040515922592021525,
0.13235510885715485,
-0.06064087897539139,
-0.034873008728027344,
0.028913231566548347,
-0.027805930003523827,
0.02617153897881508,
0.12811902165412903,
0.1261185258626938,
-0.06215166673064232,
0.12404162436723709,
0.12433139979839325,
-0.08060192316770554,
0.14881719648838043,
-0.0330965481698513,
-0.07978598028421402,
-0.01830931194126606,
-0.014239633455872536,
-0.007592339534312487,
0.1764323115348816,
-0.1478242129087448,
-0.016885461285710335,
-0.00498069915920496,
0.014767898246645927,
0.01537844818085432,
-0.25040221214294434,
-0.056994251906871796,
0.03938956931233406,
-0.04511456936597824,
-0.00835786946117878,
-0.024216320365667343,
-0.004326513968408108,
0.10449087619781494,
-0.0067689549177885056,
-0.07567106187343597,
0.0015425763558596373,
-0.007483184337615967,
-0.04965208098292351,
0.20717757940292358,
-0.058443594723939896,
-0.1191512867808342,
-0.09026763588190079,
-0.07873692363500595,
-0.03747443109750748,
0.0031157031189650297,
0.057938121259212494,
-0.10871084779500961,
-0.018645042553544044,
-0.060029834508895874,
0.01890450157225132,
0.006153430789709091,
0.03593037277460098,
-0.00013498582120519131,
-0.008356097154319286,
0.05645011365413666,
-0.09703078866004944,
-0.010265159420669079,
-0.06589375436306,
-0.05402236431837082,
0.055339280515909195,
0.05885810777544975,
0.14800463616847992,
0.1350613385438919,
-0.0252097025513649,
0.02032640390098095,
-0.03288876637816429,
0.25847429037094116,
-0.09418849647045135,
-0.027980154380202293,
0.11878024786710739,
-0.011601071804761887,
0.05644715577363968,
0.1070157065987587,
0.08243507146835327,
-0.10943987220525742,
-0.0012706380803138018,
0.0638323575258255,
-0.052788861095905304,
-0.1554594337940216,
-0.014808814972639084,
-0.057834237813949585,
-0.02958020754158497,
0.07608988136053085,
0.0275740884244442,
-0.003803137456998229,
0.0558692067861557,
0.0484766848385334,
0.041750479489564896,
-0.02394390106201172,
0.050296854227781296,
0.08774850517511368,
0.0323515348136425,
0.10944294184446335,
-0.044346217066049576,
-0.06670702993869781,
0.03185335919260979,
0.004098225384950638,
0.24376392364501953,
-0.015871742740273476,
0.09617166966199875,
0.07409161329269409,
0.16115093231201172,
-0.013026690110564232,
0.04848093166947365,
-0.015628619119524956,
-0.06816532462835312,
-0.01938595063984394,
-0.0446120984852314,
-0.01713716797530651,
0.009417246095836163,
-0.05140825733542442,
0.039699211716651917,
-0.12616375088691711,
0.011237387545406818,
0.06795448064804077,
0.2505929172039032,
0.027969086542725563,
-0.318435937166214,
-0.0648145005106926,
-0.0056916032917797565,
-0.010323254391551018,
-0.007860505022108555,
0.0067171440459787846,
0.1521758884191513,
-0.08197707682847977,
0.05642534792423248,
-0.08542340993881226,
0.08512335270643234,
-0.03678588941693306,
0.050826966762542725,
0.07847865670919418,
0.07381583750247955,
-0.003930340055376291,
0.055623605847358704,
-0.2865166664123535,
0.3010620176792145,
0.0013582718092948198,
0.0842340812087059,
-0.06373330950737,
-0.03179372102022171,
0.03312888741493225,
0.08105762302875519,
0.08677160739898682,
-0.01542341336607933,
-0.020947394892573357,
-0.21500270068645477,
-0.021702256053686142,
0.030829697847366333,
0.12977011501789093,
-0.017499467357993126,
0.10437894612550735,
-0.01049244124442339,
-0.004997258074581623,
0.07412558794021606,
0.000869431474711746,
-0.03422078117728233,
-0.09081001579761505,
-0.026144569739699364,
-0.02574739046394825,
-0.05106571689248085,
-0.0584954135119915,
-0.1069105714559555,
-0.11540292203426361,
0.11326166242361069,
0.020023254677653313,
-0.01410345733165741,
-0.12087692320346832,
0.09719489514827728,
0.0781511440873146,
-0.07541235536336899,
0.04032374918460846,
0.03159649297595024,
0.05776335671544075,
0.032085128128528595,
-0.05866282060742378,
0.11728952080011368,
-0.060786571353673935,
-0.15957698225975037,
-0.05626324191689491,
0.09120677411556244,
0.05111150071024895,
0.05709457024931908,
-0.02444664016366005,
0.01582995057106018,
-0.016656162217259407,
-0.0924004539847374,
0.054959069937467575,
-0.04305969551205635,
0.06330195814371109,
0.010525475256145,
-0.020076531916856766,
0.05312363803386688,
-0.05608445033431053,
-0.01219671219587326,
0.1470005214214325,
0.28592565655708313,
-0.08886907249689102,
0.013339540921151638,
0.01575865037739277,
-0.06526566296815872,
-0.19104531407356262,
0.07912170141935349,
0.058098483830690384,
-0.00030742250964976847,
0.08794169127941132,
-0.16657491028308868,
0.0973358228802681,
0.10324571281671524,
0.0004211888590361923,
0.11347323656082153,
-0.36749038100242615,
-0.1278308928012848,
0.07948783040046692,
0.1915132701396942,
0.07599451392889023,
-0.15517963469028473,
0.0014148812042549253,
-0.00221142265945673,
-0.14896652102470398,
0.09196633845567703,
-0.07709485292434692,
0.13596509397029877,
-0.02011549472808838,
0.08762302249670029,
0.016174856573343277,
-0.06133335828781128,
0.12217944860458374,
-0.004067434463649988,
0.14025096595287323,
-0.06988464295864105,
-0.03937462344765663,
0.054275933653116226,
-0.03809354081749916,
-0.012674293480813503,
-0.04718635231256485,
0.027415649965405464,
-0.060148317366838455,
-0.011945867910981178,
-0.10485587269067764,
0.012402334250509739,
-0.038981691002845764,
-0.06742627918720245,
-0.045674532651901245,
0.04384404048323631,
0.04485847428441048,
-0.0043662539683282375,
0.15200991928577423,
-0.010281546041369438,
0.11400621384382248,
0.049780283123254776,
0.06048602983355522,
-0.06203019991517067,
-0.10667672008275986,
-0.01840597577393055,
0.008667556568980217,
0.04750975966453552,
-0.13267864286899567,
0.01502790953963995,
0.15251323580741882,
0.04939693585038185,
0.12232688814401627,
0.08694817125797272,
-0.03260713815689087,
0.0322934053838253,
0.06962883472442627,
-0.1570555865764618,
-0.11307168751955032,
0.0019518918124958873,
-0.06596354395151138,
-0.07327287644147873,
0.052913110703229904,
0.07758498936891556,
-0.07489292323589325,
0.012352039106190205,
-0.006148058455437422,
0.006712825503200293,
-0.0681714415550232,
0.20486009120941162,
0.055837541818618774,
0.04113905131816864,
-0.1032596006989479,
0.07346359640359879,
0.017390236258506775,
-0.08551391959190369,
-0.0014298330061137676,
0.09207916259765625,
-0.06887996941804886,
-0.024761458858847618,
0.08199238032102585,
0.190627783536911,
-0.07825762778520584,
-0.02196286991238594,
-0.15053527057170868,
-0.10689305514097214,
0.06940051913261414,
0.18378853797912598,
0.10010828822851181,
-0.00700749596580863,
-0.05259443446993828,
0.04726144298911095,
-0.11694090068340302,
0.07697933167219162,
0.023197611793875694,
0.08149649947881699,
-0.14993496239185333,
0.1809249222278595,
0.011050499975681305,
0.0561111681163311,
-0.026171647012233734,
0.03273617476224899,
-0.11898921430110931,
0.04080826789140701,
-0.11451904475688934,
-0.03699637949466705,
-0.015641704201698303,
0.004636476282030344,
-0.013475734740495682,
-0.06216071546077728,
-0.0631694570183754,
0.005423322785645723,
-0.12738019227981567,
-0.02230103313922882,
0.04552564397454262,
0.02260253019630909,
-0.12645217776298523,
-0.038885023444890976,
0.027624761685729027,
-0.06389036774635315,
0.05589533969759941,
0.03571045398712158,
0.013845653273165226,
0.06603901088237762,
-0.1727924793958664,
-0.02265363186597824,
0.0699658915400505,
-0.007268066518008709,
0.06292551755905151,
-0.03660331666469574,
-0.026389017701148987,
-0.029277319088578224,
0.08761150389909744,
0.013258527033030987,
0.06256784498691559,
-0.13680429756641388,
0.006700072903186083,
-0.03322767838835716,
-0.09281384944915771,
-0.05853622406721115,
0.05357194319367409,
0.06272212415933609,
0.03734251856803894,
0.16280010342597961,
-0.08367099612951279,
0.04496293142437935,
-0.21855942904949188,
-0.016505692154169083,
0.0022946991957724094,
-0.10675408691167831,
-0.0832897201180458,
-0.07183435559272766,
0.08263169229030609,
-0.07520152628421783,
0.1101706475019455,
0.03703104704618454,
0.06440240144729614,
0.031303610652685165,
-0.032821785658597946,
-0.0021630744449794292,
0.03405728191137314,
0.2101021409034729,
0.011069083586335182,
-0.03273952379822731,
0.08936287462711334,
0.07853960245847702,
0.10012693703174591,
0.13596142828464508,
0.2286241054534912,
0.15446825325489044,
-0.02608575113117695,
0.08964192867279053,
0.05257187783718109,
-0.06457601487636566,
-0.17196445167064667,
0.03510070964694023,
-0.051511745899915695,
0.09740171581506729,
-0.06140376999974251,
0.2018471509218216,
0.08685792982578278,
-0.18311743438243866,
0.065960593521595,
-0.04626616835594177,
-0.10128027200698853,
-0.0809164047241211,
-0.037211135029792786,
-0.06956791132688522,
-0.14843709766864777,
0.02596873790025711,
-0.10320080071687698,
0.04253184050321579,
0.15164893865585327,
0.01070908922702074,
-0.012106250040233135,
0.2135184407234192,
0.03405368700623512,
0.03638409450650215,
0.05690782144665718,
0.014051002450287342,
-0.02916412241756916,
-0.0916077196598053,
-0.06028614193201065,
0.017825167626142502,
-0.029668118804693222,
0.018953057006001472,
-0.06867857277393341,
-0.0773639902472496,
0.02685466781258583,
0.004720157943665981,
-0.09396736323833466,
0.023428473621606827,
0.020553920418024063,
0.09036771953105927,
0.02744337171316147,
0.006041247397661209,
0.01641482673585415,
-0.028132805600762367,
0.24545060098171234,
-0.09323710203170776,
-0.081886425614357,
-0.0821460708975792,
0.2183823138475418,
0.03093087114393711,
0.0015594690339639783,
0.008237261325120926,
-0.08114650100469589,
0.008446210995316505,
0.2288217395544052,
0.17346376180648804,
-0.1333674192428589,
-0.010178035125136375,
0.00007573953189421445,
0.001718746847473085,
-0.029354896396398544,
0.11893689632415771,
0.12110992521047592,
0.05222265422344208,
-0.11468545347452164,
-0.05246816575527191,
-0.052555251866579056,
-0.018287165090441704,
-0.026253914460539818,
0.049028877168893814,
0.06731091439723969,
0.022756949067115784,
-0.06994714587926865,
0.07555589079856873,
-0.05931716784834862,
-0.14070741832256317,
0.10515697300434113,
-0.22745609283447266,
-0.1561095416545868,
-0.007017879281193018,
0.12098873406648636,
0.0034001765307039022,
0.06081113964319229,
-0.04221653938293457,
0.0010740956058725715,
0.04967864975333214,
-0.0054148598574101925,
-0.07894907891750336,
-0.10290852934122086,
0.08529814332723618,
-0.11389221996068954,
0.2169359177350998,
-0.05933253839612007,
0.03468981757760048,
0.11319968104362488,
0.06352267414331436,
-0.050419121980667114,
0.05671790614724159,
0.04163617268204689,
-0.12375028431415558,
-0.004887319169938564,
0.12322105467319489,
-0.03880440071225166,
0.05456853285431862,
0.03287462517619133,
-0.13193590939044952,
0.0327320396900177,
-0.054803576320409775,
-0.04047049209475517,
-0.027591973543167114,
-0.050643499940633774,
-0.06421016156673431,
0.11567234247922897,
0.20865952968597412,
-0.008241011761128902,
0.024139219895005226,
-0.08683662861585617,
0.01646733470261097,
0.0653618723154068,
0.04798787832260132,
-0.07811658084392548,
-0.21575382351875305,
0.006673632189631462,
0.07048676908016205,
-0.04172134771943092,
-0.2065606713294983,
-0.11201705038547516,
0.037444498389959335,
-0.05394796282052994,
-0.07203050702810287,
0.09082940220832825,
0.08981955051422119,
0.05658827722072601,
-0.055002838373184204,
-0.10248950868844986,
-0.059226326644420624,
0.17028172314167023,
-0.14703744649887085,
-0.0772395133972168
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b5_2
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4198
- Mean Iou: 0.3194
- Mean Accuracy: 0.4998
- Overall Accuracy: 0.9558
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0023
- Accuracy Undropoff: 0.9972
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.0022
- Iou Undropoff: 0.9558
## Model description
More information needed
## Intended uses & limitations
More information needed
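
Until the intended use is documented, the snippet below is only a minimal sketch of loading this checkpoint for semantic segmentation. It assumes a plain three-channel image fed through the default SegFormer image processor; how the depth channel of the RGBD training data was actually handled is not described in this card, so treat the preprocessing as an assumption rather than the pipeline used in training.

```python
# Hedged sketch: forward pass with the fine-tuned checkpoint on a plain RGB image.
# The RGBD preprocessing used during training is undocumented, so this is an assumption.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

ckpt = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_2"
processor = SegformerImageProcessor()               # default SegFormer preprocessing (assumed)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("frame.png").convert("RGB")      # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits                 # logits at 1/4 of the processed resolution

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]              # (H, W) tensor of class indices
```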
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
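
The warmup setting above is a ratio rather than a step count; the sketch below shows how such a ratio is commonly converted into warmup steps for a linear schedule using the `transformers` helper. The steps-per-epoch figure is inferred from the results table below (10 steps per 5 epochs, i.e. 240 steps in total), and the optimizer acts on stand-in parameters, so treat this as illustrative rather than the actual training loop.

```python
# Illustrative only: converting warmup_ratio into warmup steps for a linear schedule.
import torch
from transformers import get_linear_schedule_with_warmup

steps_per_epoch = 2                                  # inferred: the log shows 10 steps per 5 epochs
total_steps = steps_per_epoch * 120                  # 240 steps, matching the last row of the table
warmup_steps = int(0.05 * total_steps)               # warmup_ratio 0.05 -> 12 warmup steps

params = [torch.nn.Parameter(torch.zeros(1))]        # stand-in parameters, not the real model
optimizer = torch.optim.Adam(params, lr=4e-6, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)

lrs = []
for _ in range(total_steps):
    optimizer.step()                                 # stand-in for a real training step
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
# lrs ramps roughly linearly up to 4e-6 over the first 12 steps, then decays linearly to 0.
```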
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.989 | 5.0 | 10 | 1.0190 | 0.2162 | 0.5831 | 0.5879 | nan | 0.5779 | 0.5883 | 0.0 | 0.0657 | 0.5829 |
| 0.9092 | 10.0 | 20 | 0.8686 | 0.3164 | 0.5199 | 0.8922 | nan | 0.1137 | 0.9260 | 0.0 | 0.0539 | 0.8953 |
| 0.8483 | 15.0 | 30 | 0.7438 | 0.3256 | 0.5234 | 0.9219 | nan | 0.0888 | 0.9581 | 0.0 | 0.0545 | 0.9224 |
| 0.7856 | 20.0 | 40 | 0.6571 | 0.3182 | 0.5013 | 0.9336 | nan | 0.0297 | 0.9728 | 0.0 | 0.0210 | 0.9335 |
| 0.7459 | 25.0 | 50 | 0.6144 | 0.3164 | 0.4980 | 0.9324 | nan | 0.0242 | 0.9718 | 0.0 | 0.0168 | 0.9324 |
| 0.7027 | 30.0 | 60 | 0.5861 | 0.3168 | 0.4975 | 0.9351 | nan | 0.0202 | 0.9748 | 0.0 | 0.0151 | 0.9353 |
| 0.6827 | 35.0 | 70 | 0.5568 | 0.3171 | 0.4975 | 0.9391 | nan | 0.0159 | 0.9791 | 0.0 | 0.0122 | 0.9391 |
| 0.6362 | 40.0 | 80 | 0.5405 | 0.3179 | 0.4982 | 0.9424 | nan | 0.0138 | 0.9827 | 0.0 | 0.0112 | 0.9425 |
| 0.6098 | 45.0 | 90 | 0.5192 | 0.3174 | 0.4971 | 0.9449 | nan | 0.0087 | 0.9855 | 0.0 | 0.0073 | 0.9449 |
| 0.5946 | 50.0 | 100 | 0.5025 | 0.3179 | 0.4978 | 0.9475 | nan | 0.0072 | 0.9883 | 0.0 | 0.0062 | 0.9477 |
| 0.5868 | 55.0 | 110 | 0.4943 | 0.3179 | 0.4976 | 0.9490 | nan | 0.0052 | 0.9900 | 0.0 | 0.0046 | 0.9491 |
| 0.5557 | 60.0 | 120 | 0.4798 | 0.3184 | 0.4983 | 0.9505 | nan | 0.0051 | 0.9915 | 0.0 | 0.0045 | 0.9506 |
| 0.5327 | 65.0 | 130 | 0.4736 | 0.3184 | 0.4983 | 0.9514 | nan | 0.0041 | 0.9925 | 0.0 | 0.0038 | 0.9514 |
| 0.525 | 70.0 | 140 | 0.4657 | 0.3187 | 0.4987 | 0.9526 | nan | 0.0038 | 0.9937 | 0.0 | 0.0035 | 0.9526 |
| 0.5266 | 75.0 | 150 | 0.4528 | 0.3190 | 0.4992 | 0.9534 | nan | 0.0037 | 0.9946 | 0.0 | 0.0034 | 0.9535 |
| 0.5139 | 80.0 | 160 | 0.4538 | 0.3189 | 0.4991 | 0.9533 | nan | 0.0037 | 0.9945 | 0.0 | 0.0035 | 0.9534 |
| 0.5128 | 85.0 | 170 | 0.4460 | 0.3192 | 0.4995 | 0.9543 | nan | 0.0033 | 0.9956 | 0.0 | 0.0031 | 0.9543 |
| 0.4901 | 90.0 | 180 | 0.4371 | 0.3192 | 0.4995 | 0.9548 | nan | 0.0029 | 0.9961 | 0.0 | 0.0027 | 0.9548 |
| 0.4767 | 95.0 | 190 | 0.4325 | 0.3193 | 0.4997 | 0.9552 | nan | 0.0029 | 0.9965 | 0.0 | 0.0027 | 0.9552 |
| 0.4692 | 100.0 | 200 | 0.4272 | 0.3193 | 0.4997 | 0.9556 | nan | 0.0024 | 0.9970 | 0.0 | 0.0023 | 0.9556 |
| 0.4632 | 105.0 | 210 | 0.4251 | 0.3193 | 0.4996 | 0.9556 | nan | 0.0023 | 0.9969 | 0.0 | 0.0023 | 0.9556 |
| 0.4626 | 110.0 | 220 | 0.4236 | 0.3193 | 0.4997 | 0.9556 | nan | 0.0024 | 0.9970 | 0.0 | 0.0024 | 0.9556 |
| 0.4837 | 115.0 | 230 | 0.4216 | 0.3194 | 0.4998 | 0.9558 | nan | 0.0023 | 0.9972 | 0.0 | 0.0023 | 0.9558 |
| 0.4809 | 120.0 | 240 | 0.4198 | 0.3194 | 0.4998 | 0.9558 | nan | 0.0023 | 0.9972 | 0.0 | 0.0022 | 0.9558 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b5_2", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b5_2 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:23:34+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b5\_2
====================================
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4198
* Mean Iou: 0.3194
* Mean Accuracy: 0.4998
* Overall Accuracy: 0.9558
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.0023
* Accuracy Undropoff: 0.9972
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.0022
* Iou Undropoff: 0.9558
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 4e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10645467787981033,
0.03549440577626228,
-0.0019894561264663935,
0.11752888560295105,
0.17239142954349518,
0.028765182942152023,
0.11628658324480057,
0.11495105177164078,
-0.10961394011974335,
0.030972762033343315,
0.10459419339895248,
0.1416487693786621,
0.016113338991999626,
0.09334846585988998,
-0.019717268645763397,
-0.3065769076347351,
-0.025647716596722603,
0.03148462995886803,
-0.0859055444598198,
0.12469952553510666,
0.06395023316144943,
-0.16260577738285065,
0.09234622120857239,
-0.004153305198997259,
-0.22135910391807556,
0.01641453616321087,
-0.0049131582491099834,
-0.03042782098054886,
0.16022798418998718,
0.02333923615515232,
0.11599082499742508,
0.009599044919013977,
0.11429212242364883,
-0.20107313990592957,
0.017695993185043335,
0.0556800551712513,
-0.0045519922859966755,
0.06757128983736038,
0.06523606926202774,
0.002027882030233741,
0.1509314924478531,
-0.10636400431394577,
0.0670810341835022,
0.0016819033771753311,
-0.14493833482265472,
-0.21270783245563507,
-0.07475170493125916,
0.020741110667586327,
0.07699872553348541,
0.0965762808918953,
-0.0053629581816494465,
0.11220142245292664,
-0.09177467226982117,
0.1127576157450676,
0.267737478017807,
-0.24488116800785065,
-0.08604063093662262,
0.038578107953071594,
0.00190632080193609,
0.06764357537031174,
-0.13343720138072968,
0.008954931981861591,
0.033482976257801056,
0.04753899574279785,
0.1159764900803566,
-0.0331973135471344,
-0.09878786653280258,
0.02736218459904194,
-0.13830196857452393,
-0.0325172021985054,
0.0562511570751667,
0.053064968436956406,
-0.02077104151248932,
-0.031305186450481415,
-0.06789511442184448,
-0.18216079473495483,
-0.06614596396684647,
0.01323644071817398,
0.06690515577793121,
-0.06026866286993027,
-0.11564785987138748,
-0.014790885150432587,
-0.11018315702676773,
-0.08470869064331055,
-0.04977115988731384,
0.1272638589143753,
0.034169796854257584,
0.019876716658473015,
-0.033830177038908005,
0.12667599320411682,
-0.0265678521245718,
-0.13946960866451263,
0.017729688435792923,
0.030323045328259468,
-0.04210963100194931,
-0.031279295682907104,
-0.0493755117058754,
-0.06159085035324097,
-0.012572284787893295,
0.10793985426425934,
-0.06125389412045479,
0.06789875775575638,
0.03564045578241348,
0.050442587584257126,
-0.1146913692355156,
0.18945670127868652,
-0.06697286665439606,
-0.008087542839348316,
-0.036932382732629776,
0.05790580436587334,
0.0038325435016304255,
-0.02256491407752037,
-0.10615450143814087,
0.004646112211048603,
0.06964331865310669,
-0.008260689675807953,
-0.08835919946432114,
0.07089193910360336,
-0.03890223428606987,
-0.011155642569065094,
-0.00022475946752820164,
-0.07622074335813522,
0.046170566231012344,
-0.0004161580291111022,
-0.08361644297838211,
-0.028772728517651558,
0.051251061260700226,
0.01392409112304449,
0.013646918348968029,
0.1640656739473343,
-0.08687426149845123,
0.06304077059030533,
-0.11255604028701782,
-0.10038492828607559,
0.000005053904260421405,
-0.08535294234752655,
0.03644756227731705,
-0.07803448289632797,
-0.14873576164245605,
-0.009416691027581692,
0.07173692435026169,
-0.04108810797333717,
0.002765945391729474,
-0.05283200368285179,
-0.09118574112653732,
0.0030624002683907747,
-0.008197766728699207,
0.16342270374298096,
-0.06522787362337112,
0.12239515036344528,
0.03861893340945244,
0.07293567806482315,
-0.06540898978710175,
0.039012566208839417,
-0.08479136228561401,
0.01976170763373375,
-0.22152431309223175,
0.04245636984705925,
-0.050750743597745895,
0.06743131577968597,
-0.059846729040145874,
-0.12263311445713043,
0.009024965576827526,
0.0023447624407708645,
0.09129508584737778,
0.1063806340098381,
-0.2233811616897583,
-0.07605550438165665,
0.14826567471027374,
-0.07279576361179352,
-0.09847637265920639,
0.11344340443611145,
-0.06374383717775345,
0.011949700303375721,
0.06088237091898918,
0.19839169085025787,
0.05342784523963928,
-0.1378502994775772,
0.022322295233607292,
-0.016007911413908005,
0.049086570739746094,
-0.02665541134774685,
0.050036318600177765,
0.02287379652261734,
0.08675699681043625,
0.019471876323223114,
-0.0651109516620636,
0.06717219203710556,
-0.12421663850545883,
-0.09683408588171005,
-0.025652438402175903,
-0.08608803153038025,
0.04294900968670845,
0.08948921412229538,
0.061217986047267914,
-0.10542458295822144,
-0.07876261323690414,
0.08998437225818634,
0.07573439925909042,
-0.06901959329843521,
0.039398614317178726,
-0.06573182344436646,
0.04398604482412338,
-0.018164459615945816,
-0.036450713872909546,
-0.1745569407939911,
-0.026372788473963737,
-0.02217327430844307,
0.036117952316999435,
0.03002355620265007,
0.024471435695886612,
0.09217200428247452,
0.08923450857400894,
-0.0719524472951889,
-0.024867918342351913,
-0.06502968817949295,
0.0023006880655884743,
-0.1218622550368309,
-0.2284659743309021,
-0.04353959485888481,
-0.007800485473126173,
0.08937118202447891,
-0.21349094808101654,
0.02386532723903656,
0.024261044338345528,
0.08736936002969742,
0.025278694927692413,
-0.03047209605574608,
-0.053478822112083435,
0.07668041437864304,
-0.0104594137519598,
-0.06555622816085815,
0.07022886723279953,
-0.004920400679111481,
-0.06732498854398727,
-0.05509662628173828,
-0.11323491483926773,
0.16368286311626434,
0.13394232094287872,
-0.14716890454292297,
-0.09276054054498672,
-0.011483496055006981,
-0.06413038820028305,
-0.03321034833788872,
-0.04300127178430557,
0.0388919897377491,
0.18043121695518494,
-0.00043485971400514245,
0.13279183208942413,
-0.06004500016570091,
-0.034383486956357956,
0.029603350907564163,
-0.027281925082206726,
0.027968324720859528,
0.12892355024814606,
0.12423139810562134,
-0.06043965741991997,
0.12383166700601578,
0.12368139624595642,
-0.08046963810920715,
0.14836305379867554,
-0.03367164731025696,
-0.07989601045846939,
-0.01869826391339302,
-0.014778371900320053,
-0.008074665442109108,
0.17722241580486298,
-0.14890390634536743,
-0.017195822671055794,
-0.004505519289523363,
0.014145521447062492,
0.015024593099951744,
-0.2507716119289398,
-0.05621974915266037,
0.038987934589385986,
-0.04495035856962204,
-0.009762858971953392,
-0.024578509852290154,
-0.004175929352641106,
0.10472983866930008,
-0.006838222499936819,
-0.07559406012296677,
0.0014720249455422163,
-0.007263994310051203,
-0.04906705766916275,
0.20668402314186096,
-0.058608636260032654,
-0.11799104511737823,
-0.09034152328968048,
-0.07861156761646271,
-0.03646349534392357,
0.003358695423230529,
0.05818289518356323,
-0.10918679088354111,
-0.01875029318034649,
-0.05911076441407204,
0.018494341522455215,
0.006001806352287531,
0.036409035325050354,
-0.0007691751234233379,
-0.008336002938449383,
0.05672286078333855,
-0.09651154279708862,
-0.009910210967063904,
-0.06600496917963028,
-0.053951408714056015,
0.05566030740737915,
0.05986762419342995,
0.14779333770275116,
0.1346147358417511,
-0.0253940187394619,
0.020075440406799316,
-0.032353129237890244,
0.2587055563926697,
-0.09519686549901962,
-0.026904337108135223,
0.11813291162252426,
-0.01202543918043375,
0.0562250018119812,
0.10732518881559372,
0.08279003202915192,
-0.10942653566598892,
-0.0020300415344536304,
0.06382972747087479,
-0.05198628455400467,
-0.1556749939918518,
-0.014573412016034126,
-0.057845644652843475,
-0.029581867158412933,
0.0766868069767952,
0.027410561218857765,
-0.004033652599900961,
0.05567706748843193,
0.04888365790247917,
0.04381426051259041,
-0.02505796030163765,
0.049698151648044586,
0.08959633857011795,
0.031986698508262634,
0.1090121939778328,
-0.04502400383353233,
-0.06693879514932632,
0.031747546046972275,
0.003660444635897875,
0.24556462466716766,
-0.015796232968568802,
0.09784109145402908,
0.07348457723855972,
0.16242653131484985,
-0.012219002470374107,
0.04803235083818436,
-0.01564599573612213,
-0.06799059361219406,
-0.01907340995967388,
-0.044339511543512344,
-0.015962211415171623,
0.010059015825390816,
-0.05229263752698898,
0.03902992233633995,
-0.12638065218925476,
0.01008702628314495,
0.0675162822008133,
0.2496398687362671,
0.028711361810564995,
-0.31895872950553894,
-0.06590895354747772,
-0.005903983023017645,
-0.01033161859959364,
-0.008553633466362953,
0.006839428097009659,
0.15332528948783875,
-0.0815042033791542,
0.055848728865385056,
-0.08530396968126297,
0.08507559448480606,
-0.03766774758696556,
0.05135248228907585,
0.07839367538690567,
0.07372046262025833,
-0.0041855014860630035,
0.055943090468645096,
-0.2848871946334839,
0.3007214367389679,
0.0018568799132481217,
0.08426950126886368,
-0.06370364129543304,
-0.03191357105970383,
0.033258624374866486,
0.08194421976804733,
0.08662831783294678,
-0.015122613869607449,
-0.022832322865724564,
-0.2143436074256897,
-0.021257147192955017,
0.031523093581199646,
0.1294616460800171,
-0.017603358253836632,
0.10479064285755157,
-0.009977518580853939,
-0.005672742146998644,
0.07390152662992477,
-0.0013763908063992858,
-0.03351758420467377,
-0.09085075557231903,
-0.026255333796143532,
-0.024759657680988312,
-0.04987579211592674,
-0.05812210962176323,
-0.10678534209728241,
-0.11510380357503891,
0.11311542987823486,
0.0193545650690794,
-0.014276168309152126,
-0.12073586136102676,
0.09767226874828339,
0.07851475477218628,
-0.07547194510698318,
0.04071405529975891,
0.032087888568639755,
0.05777658522129059,
0.03285352513194084,
-0.05820035934448242,
0.1178136095404625,
-0.060052502900362015,
-0.15988034009933472,
-0.056330785155296326,
0.09147214144468307,
0.05073469877243042,
0.05669025331735611,
-0.02444225177168846,
0.016377177089452744,
-0.01788281463086605,
-0.0921831727027893,
0.05595136061310768,
-0.04493480920791626,
0.06428442895412445,
0.0093407416716218,
-0.02062670700252056,
0.053645018488168716,
-0.055465828627347946,
-0.012063504196703434,
0.1460232436656952,
0.285040944814682,
-0.08919622749090195,
0.012748324312269688,
0.015989331528544426,
-0.06543833017349243,
-0.190673366189003,
0.07916895300149918,
0.057650890201330185,
0.00007080454815877602,
0.08606767654418945,
-0.167779803276062,
0.09834159910678864,
0.10234241932630539,
0.000778562854975462,
0.11731700599193573,
-0.36549586057662964,
-0.12770944833755493,
0.07912580668926239,
0.19051295518875122,
0.07794856280088425,
-0.15524452924728394,
0.0007823081687092781,
-0.0022413842380046844,
-0.14897899329662323,
0.09074757248163223,
-0.07937273383140564,
0.13576993346214294,
-0.019925523549318314,
0.08893294632434845,
0.016168413683772087,
-0.06092334911227226,
0.12240763008594513,
-0.002805601805448532,
0.1403646469116211,
-0.0697253867983818,
-0.04001297801733017,
0.05406046286225319,
-0.03775382786989212,
-0.013346805237233639,
-0.04673207923769951,
0.027169907465577126,
-0.061201903969049454,
-0.011458905413746834,
-0.10478471964597702,
0.012362256646156311,
-0.038975004106760025,
-0.06667368113994598,
-0.04562864080071449,
0.04325044900178909,
0.04434077814221382,
-0.004186018370091915,
0.1508554220199585,
-0.009778941050171852,
0.11333291977643967,
0.049667563289403915,
0.05886968597769737,
-0.061432208865880966,
-0.10747721791267395,
-0.017476335167884827,
0.008791729807853699,
0.048251666128635406,
-0.13475820422172546,
0.015046105720102787,
0.15296024084091187,
0.04960563778877258,
0.12213842570781708,
0.08683425933122635,
-0.03205673024058342,
0.03234707936644554,
0.06957677006721497,
-0.1579504758119583,
-0.11389602720737457,
0.0021929466165602207,
-0.06897783279418945,
-0.07269660383462906,
0.05266629904508591,
0.0780910775065422,
-0.07507123798131943,
0.012594709172844887,
-0.0062615228816866875,
0.006315930280834436,
-0.06801296025514603,
0.20484000444412231,
0.05618663877248764,
0.04116050526499748,
-0.10399443656206131,
0.07304475456476212,
0.01865335740149021,
-0.0871361717581749,
-0.0015379019314423203,
0.09169291704893112,
-0.06949954479932785,
-0.025104494765400887,
0.08114895224571228,
0.19243213534355164,
-0.07658086717128754,
-0.022713856771588326,
-0.15076731145381927,
-0.106988824903965,
0.0695074200630188,
0.1827487200498581,
0.09953391551971436,
-0.006815826985985041,
-0.05254752188920975,
0.0471976213157177,
-0.11768919974565506,
0.07769598811864853,
0.023487616330385208,
0.08155255019664764,
-0.14964446425437927,
0.18171627819538116,
0.01071421429514885,
0.05652603879570961,
-0.026187658309936523,
0.03273303061723709,
-0.11897500604391098,
0.04025048390030861,
-0.11505827307701111,
-0.03603046014904976,
-0.014626987278461456,
0.004858328495174646,
-0.013677160255610943,
-0.062129247933626175,
-0.06312460452318192,
0.005508847534656525,
-0.12769348919391632,
-0.0221107117831707,
0.046304941177368164,
0.02300187386572361,
-0.1259208619594574,
-0.03896871209144592,
0.027591420337557793,
-0.06414736062288284,
0.05614398419857025,
0.036410003900527954,
0.014112655073404312,
0.06592270731925964,
-0.1701734960079193,
-0.022437352687120438,
0.06952561438083649,
-0.007474852725863457,
0.06373673677444458,
-0.036649931222200394,
-0.026365652680397034,
-0.030115826055407524,
0.08787032961845398,
0.012785504572093487,
0.06296513974666595,
-0.13695582747459412,
0.006546925753355026,
-0.03284335881471634,
-0.09253928810358047,
-0.05889536067843437,
0.05330406874418259,
0.062132541090250015,
0.03654617816209793,
0.1625298708677292,
-0.0834943950176239,
0.04461115598678589,
-0.21891038119792938,
-0.016711145639419556,
0.0020935116335749626,
-0.10775449872016907,
-0.08129590004682541,
-0.07239338755607605,
0.08299613744020462,
-0.07581038773059845,
0.11016344279050827,
0.03691704571247101,
0.06457903236150742,
0.030824029818177223,
-0.03310227766633034,
-0.004639619495719671,
0.03437498211860657,
0.20998850464820862,
0.01107510644942522,
-0.033157046884298325,
0.08834434300661087,
0.07868461310863495,
0.09991360455751419,
0.1372518092393875,
0.22679507732391357,
0.15364183485507965,
-0.02456810511648655,
0.08946958184242249,
0.052169833332300186,
-0.0650557428598404,
-0.1729613095521927,
0.037424247711896896,
-0.05259924754500389,
0.09864875674247742,
-0.06155087426304817,
0.20155687630176544,
0.08658217638731003,
-0.18225568532943726,
0.06633610278367996,
-0.04674891382455826,
-0.10165897011756897,
-0.07999345660209656,
-0.03585449233651161,
-0.07003724575042725,
-0.1488531082868576,
0.02581186406314373,
-0.10281489044427872,
0.04298600181937218,
0.15136416256427765,
0.01020808331668377,
-0.012660548090934753,
0.21385036408901215,
0.034511417150497437,
0.036387503147125244,
0.05728518217802048,
0.014456420205533504,
-0.02972065843641758,
-0.09064330160617828,
-0.06035810708999634,
0.017924262210726738,
-0.03022041544318199,
0.01832186058163643,
-0.06853306293487549,
-0.07686514407396317,
0.02717045694589615,
0.004998005926609039,
-0.09357250481843948,
0.023185301572084427,
0.02117464318871498,
0.09076129645109177,
0.026184339076280594,
0.005695232655853033,
0.016210613772273064,
-0.0285439882427454,
0.24586156010627747,
-0.09328481554985046,
-0.08106804639101028,
-0.0820724293589592,
0.2174108475446701,
0.030564475804567337,
0.000933788891416043,
0.008055564016103745,
-0.08130227774381638,
0.009747414849698544,
0.2300957888364792,
0.17322005331516266,
-0.1329042911529541,
-0.010538358241319656,
0.00044154157512821257,
0.0015264940448105335,
-0.028923898935317993,
0.11909337341785431,
0.1208348348736763,
0.05190243944525719,
-0.11467701941728592,
-0.05350208654999733,
-0.05307093635201454,
-0.018299298360943794,
-0.027189942076802254,
0.050728730857372284,
0.0666707381606102,
0.02250974252820015,
-0.0698627233505249,
0.0753372386097908,
-0.05815022066235542,
-0.14319398999214172,
0.1068287268280983,
-0.22700728476047516,
-0.1564132124185562,
-0.00703990226611495,
0.1218322366476059,
0.004089253023266792,
0.060577332973480225,
-0.04248756170272827,
0.0006349264876917005,
0.05172162130475044,
-0.0057282946072518826,
-0.07891079038381577,
-0.1027054488658905,
0.08472800999879837,
-0.11649352312088013,
0.21673883497714996,
-0.05936650186777115,
0.03490997105836868,
0.11324001848697662,
0.06379811465740204,
-0.05071869492530823,
0.05613761395215988,
0.04134894534945488,
-0.12473081797361374,
-0.004584678448736668,
0.12382537126541138,
-0.03864980489015579,
0.05305786058306694,
0.03270053490996361,
-0.1309548169374466,
0.03233545273542404,
-0.054752521216869354,
-0.04094216227531433,
-0.02742963843047619,
-0.05162636563181877,
-0.06412235647439957,
0.1152728796005249,
0.20922206342220306,
-0.008453384041786194,
0.024568362161517143,
-0.08674848079681396,
0.015763629227876663,
0.06609011441469193,
0.04711932688951492,
-0.07806653529405594,
-0.21542683243751526,
0.00674476008862257,
0.07231094688177109,
-0.04130934551358223,
-0.20649492740631104,
-0.1113503947854042,
0.03701882064342499,
-0.05374825745820999,
-0.07155512273311615,
0.09098676592111588,
0.09037945419549942,
0.05648452788591385,
-0.0551205538213253,
-0.10326382517814636,
-0.05867664888501167,
0.17025257647037506,
-0.14683909714221954,
-0.07810017466545105
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b5_3
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2768
- Mean Iou: 0.3194
- Mean Accuracy: 0.4999
- Overall Accuracy: 0.9578
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0006
- Accuracy Undropoff: 0.9993
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.0006
- Iou Undropoff: 0.9578
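A minimal inference sketch for this checkpoint; the input image path is a placeholder, default SegFormer preprocessing is assumed (the repository's own preprocessing config may differ), and post-processing is kept to a plain argmax:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_3"
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
processor = SegformerImageProcessor()  # default preprocessing; assumption, not from the card

image = Image.open("example.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, height/4, width/4)

# Upsample the logits to the input resolution and take a per-pixel argmax
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]
```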
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
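As a concrete reading of the schedule: the results table below shows 240 optimizer steps over the 120 epochs (2 steps per epoch), so a warmup ratio of 0.05 corresponds to roughly 0.05 × 240 = 12 linear warmup steps before the learning rate decays.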
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0992 | 5.0 | 10 | 1.0599 | 0.1938 | 0.4241 | 0.5281 | nan | 0.3106 | 0.5376 | 0.0 | 0.0540 | 0.5273 |
| 1.0188 | 10.0 | 20 | 0.9493 | 0.2781 | 0.4808 | 0.7846 | nan | 0.1494 | 0.8122 | 0.0 | 0.0476 | 0.7868 |
| 0.9218 | 15.0 | 30 | 0.8130 | 0.3074 | 0.4913 | 0.8851 | nan | 0.0618 | 0.9209 | 0.0 | 0.0364 | 0.8858 |
| 0.8411 | 20.0 | 40 | 0.7253 | 0.3089 | 0.4866 | 0.9038 | nan | 0.0315 | 0.9416 | 0.0 | 0.0221 | 0.9047 |
| 0.7583 | 25.0 | 50 | 0.6719 | 0.3097 | 0.4890 | 0.9069 | nan | 0.0331 | 0.9448 | 0.0 | 0.0216 | 0.9076 |
| 0.688 | 30.0 | 60 | 0.6303 | 0.3109 | 0.4883 | 0.9170 | nan | 0.0207 | 0.9559 | 0.0 | 0.0149 | 0.9179 |
| 0.6279 | 35.0 | 70 | 0.5919 | 0.3139 | 0.4918 | 0.9276 | nan | 0.0164 | 0.9671 | 0.0 | 0.0133 | 0.9283 |
| 0.5533 | 40.0 | 80 | 0.5375 | 0.3168 | 0.4961 | 0.9377 | nan | 0.0144 | 0.9777 | 0.0 | 0.0125 | 0.9380 |
| 0.5116 | 45.0 | 90 | 0.5111 | 0.3176 | 0.4970 | 0.9442 | nan | 0.0093 | 0.9847 | 0.0 | 0.0083 | 0.9445 |
| 0.4801 | 50.0 | 100 | 0.4696 | 0.3183 | 0.4981 | 0.9492 | nan | 0.0062 | 0.9901 | 0.0 | 0.0057 | 0.9492 |
| 0.4744 | 55.0 | 110 | 0.4317 | 0.3187 | 0.4987 | 0.9543 | nan | 0.0018 | 0.9956 | 0.0 | 0.0017 | 0.9543 |
| 0.4494 | 60.0 | 120 | 0.3991 | 0.3189 | 0.4991 | 0.9555 | nan | 0.0013 | 0.9969 | 0.0 | 0.0012 | 0.9555 |
| 0.386 | 65.0 | 130 | 0.3737 | 0.3189 | 0.4990 | 0.9565 | nan | 0.0000 | 0.9980 | 0.0 | 0.0000 | 0.9565 |
| 0.3674 | 70.0 | 140 | 0.3538 | 0.3191 | 0.4994 | 0.9567 | nan | 0.0007 | 0.9981 | 0.0 | 0.0007 | 0.9567 |
| 0.3601 | 75.0 | 150 | 0.3413 | 0.3192 | 0.4995 | 0.9573 | nan | 0.0002 | 0.9988 | 0.0 | 0.0002 | 0.9573 |
| 0.3626 | 80.0 | 160 | 0.3225 | 0.3193 | 0.4996 | 0.9569 | nan | 0.0009 | 0.9984 | 0.0 | 0.0009 | 0.9569 |
| 0.3331 | 85.0 | 170 | 0.3163 | 0.3195 | 0.5000 | 0.9576 | nan | 0.0009 | 0.9991 | 0.0 | 0.0009 | 0.9576 |
| 0.319 | 90.0 | 180 | 0.3004 | 0.3200 | 0.5008 | 0.9577 | nan | 0.0024 | 0.9991 | 0.0 | 0.0024 | 0.9577 |
| 0.3163 | 95.0 | 190 | 0.2931 | 0.3198 | 0.5004 | 0.9575 | nan | 0.0020 | 0.9989 | 0.0 | 0.0020 | 0.9575 |
| 0.3185 | 100.0 | 200 | 0.2920 | 0.3194 | 0.4999 | 0.9577 | nan | 0.0006 | 0.9992 | 0.0 | 0.0006 | 0.9577 |
| 0.3122 | 105.0 | 210 | 0.2831 | 0.3194 | 0.4999 | 0.9578 | nan | 0.0005 | 0.9994 | 0.0 | 0.0005 | 0.9578 |
| 0.3218 | 110.0 | 220 | 0.2788 | 0.3195 | 0.5000 | 0.9576 | nan | 0.0009 | 0.9991 | 0.0 | 0.0009 | 0.9576 |
| 0.3037 | 115.0 | 230 | 0.2752 | 0.3194 | 0.4999 | 0.9577 | nan | 0.0006 | 0.9992 | 0.0 | 0.0006 | 0.9577 |
| 0.3319 | 120.0 | 240 | 0.2768 | 0.3194 | 0.4999 | 0.9578 | nan | 0.0006 | 0.9993 | 0.0 | 0.0006 | 0.9578 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b5_3", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b5_3 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:23:45+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b5\_3
====================================
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2768
* Mean Iou: 0.3194
* Mean Accuracy: 0.4999
* Overall Accuracy: 0.9578
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.0006
* Accuracy Undropoff: 0.9993
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.0006
* Iou Undropoff: 0.9578
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-06
* train\_batch\_size: 15
* eval\_batch\_size: 15
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 15\n* eval\\_batch\\_size: 15\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 15\n* eval\\_batch\\_size: 15\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 15\n* eval\\_batch\\_size: 15\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10599785298109055,
0.03578374907374382,
-0.001980931730940938,
0.11470667272806168,
0.17304065823554993,
0.028470339253544807,
0.11612924188375473,
0.11376673728227615,
-0.10989533364772797,
0.030647696927189827,
0.10509989410638809,
0.1421472430229187,
0.015868188813328743,
0.09631344676017761,
-0.02112257480621338,
-0.3076513111591339,
-0.025059247389435768,
0.032038576900959015,
-0.08679423481225967,
0.12586866319179535,
0.06506085395812988,
-0.1631171852350235,
0.09098067879676819,
-0.004193675704300404,
-0.2214123159646988,
0.01642296090722084,
-0.004632070194929838,
-0.03144506737589836,
0.15938067436218262,
0.0233444906771183,
0.11589717864990234,
0.009730372577905655,
0.1131984144449234,
-0.20343546569347382,
0.01824348419904709,
0.055901363492012024,
-0.004015943501144648,
0.0677042305469513,
0.06474506855010986,
0.0020185213070362806,
0.15129311382770538,
-0.10766469687223434,
0.06800296157598495,
0.0015600386541336775,
-0.144645094871521,
-0.21063238382339478,
-0.07281782478094101,
0.023649126291275024,
0.07880499213933945,
0.09665782004594803,
-0.005898830480873585,
0.11329825967550278,
-0.09313972294330597,
0.11280244588851929,
0.27152836322784424,
-0.24320922791957855,
-0.08692778646945953,
0.0389886349439621,
0.0004270164354238659,
0.06728167086839676,
-0.1336653232574463,
0.008933739736676216,
0.03305128961801529,
0.04579981043934822,
0.11855509877204895,
-0.03252872824668884,
-0.09823975712060928,
0.027097314596176147,
-0.13888734579086304,
-0.03353056684136391,
0.05507707968354225,
0.054410625249147415,
-0.019917864352464676,
-0.03338659182190895,
-0.06681235134601593,
-0.18128535151481628,
-0.06692934036254883,
0.010828223079442978,
0.06689813733100891,
-0.060511939227581024,
-0.11441339552402496,
-0.014852439984679222,
-0.11089112609624863,
-0.08559652417898178,
-0.05033481493592262,
0.12773044407367706,
0.034442879259586334,
0.017918849363923073,
-0.03641127422451973,
0.12627984583377838,
-0.02956497110426426,
-0.13902948796749115,
0.017199037596583366,
0.028727643191814423,
-0.04338344931602478,
-0.03360677510499954,
-0.049244225025177,
-0.06622205674648285,
-0.012307954952120781,
0.10776039958000183,
-0.059319060295820236,
0.06751785427331924,
0.037370990961790085,
0.04947255179286003,
-0.11308366060256958,
0.1879674345254898,
-0.06672581285238266,
-0.009549976326525211,
-0.037771597504615784,
0.05821103975176811,
0.0026202190201729536,
-0.021870741620659828,
-0.10443688184022903,
0.004137836396694183,
0.07058446109294891,
-0.009228195995092392,
-0.08780770003795624,
0.06982551515102386,
-0.03891696035861969,
-0.011333907023072243,
-0.003219040110707283,
-0.0751807689666748,
0.04547804966568947,
-0.0003665561380330473,
-0.08356676995754242,
-0.029376085847616196,
0.050585485994815826,
0.014916598796844482,
0.013865283690392971,
0.16637727618217468,
-0.08772653341293335,
0.06352262198925018,
-0.11289113759994507,
-0.10051927715539932,
-0.00011677090515149757,
-0.08743171393871307,
0.03681189939379692,
-0.07790671288967133,
-0.14933116734027863,
-0.008599096909165382,
0.07227656245231628,
-0.03955072537064552,
0.0020466565620154142,
-0.05375794321298599,
-0.09046896547079086,
0.002416175790131092,
-0.008537172339856625,
0.1645904928445816,
-0.06427879631519318,
0.12204480916261673,
0.038110848516225815,
0.0713704451918602,
-0.06390289962291718,
0.03973497077822685,
-0.0849480852484703,
0.018623938784003258,
-0.2220117300748825,
0.04294414073228836,
-0.05082700774073601,
0.06849340349435806,
-0.06116573512554169,
-0.12229757010936737,
0.006748559419065714,
0.001828103675507009,
0.09097609668970108,
0.10762621462345123,
-0.22252173721790314,
-0.0768526941537857,
0.1503560096025467,
-0.07339048385620117,
-0.0991414487361908,
0.11120936274528503,
-0.06386711448431015,
0.013790122233331203,
0.061760466545820236,
0.19774408638477325,
0.05457749217748642,
-0.1398932784795761,
0.0230797678232193,
-0.0155402896925807,
0.04827088490128517,
-0.02755185402929783,
0.05038265511393547,
0.02211332879960537,
0.08714286983013153,
0.01861109957098961,
-0.06748761981725693,
0.06874023377895355,
-0.12472725659608841,
-0.09741034358739853,
-0.02563215233385563,
-0.08694913238286972,
0.041979920119047165,
0.09025444090366364,
0.0634593740105629,
-0.10294970124959946,
-0.0767761841416359,
0.09443460404872894,
0.07567331194877625,
-0.06844434887170792,
0.038684699684381485,
-0.0648253932595253,
0.0429084487259388,
-0.018464677035808563,
-0.03553847596049309,
-0.176044300198555,
-0.02498953975737095,
-0.022670019418001175,
0.029121726751327515,
0.02946845442056656,
0.022720670327544212,
0.09245879203081131,
0.08786077797412872,
-0.07124996930360794,
-0.02378864400088787,
-0.06533430516719818,
0.0032317773438990116,
-0.12247941642999649,
-0.229670912027359,
-0.04228634014725685,
-0.007864773273468018,
0.09070718288421631,
-0.21217358112335205,
0.02540314383804798,
0.021768825128674507,
0.0887717753648758,
0.0256603192538023,
-0.03159245848655701,
-0.05273760110139847,
0.07678700238466263,
-0.009631110355257988,
-0.06467398256063461,
0.0697820857167244,
-0.005868946202099323,
-0.06718899309635162,
-0.0570395290851593,
-0.11202783137559891,
0.16448910534381866,
0.1333935558795929,
-0.1459328830242157,
-0.09268268942832947,
-0.014589263126254082,
-0.06388746201992035,
-0.03310970962047577,
-0.04522298276424408,
0.039880361407995224,
0.1771155148744583,
0.0006628984701819718,
0.1324668973684311,
-0.06059924140572548,
-0.03455951437354088,
0.03021775186061859,
-0.026765713468194008,
0.02760317735373974,
0.12901726365089417,
0.12643298506736755,
-0.06459658592939377,
0.12457828968763351,
0.12447775155305862,
-0.08082269877195358,
0.1502513736486435,
-0.033638354390859604,
-0.08138604462146759,
-0.018850434571504593,
-0.014971688389778137,
-0.009357013739645481,
0.17700915038585663,
-0.15128134191036224,
-0.01754908449947834,
-0.0048764352686703205,
0.013768814504146576,
0.014543958939611912,
-0.24974000453948975,
-0.056868065148591995,
0.04026392847299576,
-0.04403680935502052,
-0.011993113905191422,
-0.02307782880961895,
-0.004774156957864761,
0.10380399972200394,
-0.007284258492290974,
-0.07395125180482864,
0.0006719219381920993,
-0.007716873195022345,
-0.04914311692118645,
0.20747289061546326,
-0.05689734220504761,
-0.11848733574151993,
-0.08721234649419785,
-0.0772201418876648,
-0.035656921565532684,
0.0013090170687064528,
0.05974764749407768,
-0.10834690183401108,
-0.01833771914243698,
-0.05830783769488335,
0.017892824485898018,
0.008130489848554134,
0.03745817020535469,
-0.0013912495924159884,
-0.008663981221616268,
0.05556797608733177,
-0.09650827199220657,
-0.009289025329053402,
-0.06706330180168152,
-0.051320064812898636,
0.05405036732554436,
0.05877078324556351,
0.14879165589809418,
0.13395704329013824,
-0.02510441094636917,
0.020358029752969742,
-0.03366491198539734,
0.2606906592845917,
-0.09616482257843018,
-0.03008382022380829,
0.11714302003383636,
-0.01119747944176197,
0.055539391934871674,
0.10767123848199844,
0.08285144716501236,
-0.10801872611045837,
-0.002340996405109763,
0.0648007020354271,
-0.050649553537368774,
-0.15523038804531097,
-0.014422369189560413,
-0.05839639902114868,
-0.031177399680018425,
0.07755106687545776,
0.026925062760710716,
-0.0011483526322990656,
0.05502734333276749,
0.0493292436003685,
0.04159614071249962,
-0.022975316271185875,
0.05057018995285034,
0.08771800249814987,
0.032943595200777054,
0.10893792659044266,
-0.04498551785945892,
-0.06705578416585922,
0.030204782262444496,
0.0015561466570943594,
0.24766495823860168,
-0.013738440349698067,
0.09518134593963623,
0.07376066595315933,
0.16152486205101013,
-0.011684861034154892,
0.048623226583004,
-0.015320337377488613,
-0.06895081698894501,
-0.01854110322892666,
-0.04433819651603699,
-0.014919765293598175,
0.010862322524189949,
-0.05097292363643646,
0.038516391068696976,
-0.12701155245304108,
0.01001956220716238,
0.0679720863699913,
0.25282642245292664,
0.027033785358071327,
-0.3151516616344452,
-0.0647297352552414,
-0.006671454757452011,
-0.012420369312167168,
-0.008383671753108501,
0.005709845572710037,
0.15088734030723572,
-0.08186236768960953,
0.0560678206384182,
-0.08595949411392212,
0.08599747717380524,
-0.03495463728904724,
0.051536925137043,
0.07647943496704102,
0.07389416545629501,
-0.004564838018268347,
0.054639894515275955,
-0.2891220450401306,
0.3037629723548889,
0.0017237879801541567,
0.08560161292552948,
-0.06337571144104004,
-0.03276994079351425,
0.033445291221141815,
0.08157234638929367,
0.08814430236816406,
-0.01630210503935814,
-0.01978442259132862,
-0.2127411812543869,
-0.020913364365696907,
0.03155696392059326,
0.12983617186546326,
-0.0182687658816576,
0.10304811596870422,
-0.010243640281260014,
-0.0060826134867966175,
0.07536187767982483,
-0.00028399695293046534,
-0.03623456880450249,
-0.09051825851202011,
-0.025629447773098946,
-0.02625398337841034,
-0.048811156302690506,
-0.05846918746829033,
-0.10728967189788818,
-0.11655055731534958,
0.10841504484415054,
0.016302254050970078,
-0.012801926583051682,
-0.12038274109363556,
0.10036090016365051,
0.07695183902978897,
-0.07569359987974167,
0.04271325841546059,
0.031349752098321915,
0.05769035965204239,
0.03278568759560585,
-0.05747028812766075,
0.11796698719263077,
-0.05968150123953819,
-0.15772737562656403,
-0.05620700120925903,
0.08866807818412781,
0.05169015750288963,
0.057039495557546616,
-0.024061396718025208,
0.014873940497636795,
-0.017082728445529938,
-0.09262511134147644,
0.05392412096261978,
-0.04304245859384537,
0.061844002455472946,
0.009207043796777725,
-0.0206108670681715,
0.04755889251828194,
-0.057326942682266235,
-0.013153646141290665,
0.14727257192134857,
0.2870364189147949,
-0.08980068564414978,
0.012585246004164219,
0.017213232815265656,
-0.06579350680112839,
-0.1891518235206604,
0.08100714534521103,
0.05930645763874054,
0.0002228871890110895,
0.08665474504232407,
-0.16720764338970184,
0.09824882447719574,
0.1034776121377945,
0.00021768733859062195,
0.11155156791210175,
-0.36765870451927185,
-0.12776276469230652,
0.08142917603254318,
0.19183656573295593,
0.07853881269693375,
-0.15398529171943665,
0.0011989205377176404,
-0.0013756308471783996,
-0.14915575087070465,
0.092704638838768,
-0.07864098250865936,
0.13499341905117035,
-0.019544167444109917,
0.08468633890151978,
0.01622251607477665,
-0.06210505589842796,
0.12136965245008469,
-0.0042747436091303825,
0.14148439466953278,
-0.0682516023516655,
-0.040891386568546295,
0.05329928174614906,
-0.037892796099185944,
-0.012739947997033596,
-0.04660214111208916,
0.026310306042432785,
-0.05761232227087021,
-0.012051764875650406,
-0.1056264340877533,
0.013595682568848133,
-0.03873981535434723,
-0.06727226078510284,
-0.044815246015787125,
0.043938472867012024,
0.04607657715678215,
-0.003600472118705511,
0.15385566651821136,
-0.011345542967319489,
0.11523479223251343,
0.045890145003795624,
0.059125564992427826,
-0.06096282973885536,
-0.10700159519910812,
-0.018439069390296936,
0.00875788927078247,
0.04848200082778931,
-0.13612695038318634,
0.015362225472927094,
0.1531652808189392,
0.05088818445801735,
0.12376942485570908,
0.08595848083496094,
-0.030759697780013084,
0.030633332207798958,
0.0695304274559021,
-0.15732689201831818,
-0.11688997596502304,
0.0018812973285093904,
-0.06663649529218674,
-0.07354524731636047,
0.05251266434788704,
0.0766068696975708,
-0.0757896825671196,
0.012031891383230686,
-0.0067054289393126965,
0.005041682161390781,
-0.06764774024486542,
0.20583468675613403,
0.05481047183275223,
0.04123679921030998,
-0.1038924977183342,
0.07295709103345871,
0.01801607385277748,
-0.08972930163145065,
-0.0019254449289292097,
0.09033973515033722,
-0.07028824090957642,
-0.02483285591006279,
0.08186651766300201,
0.1933574676513672,
-0.07763192802667618,
-0.02186586894094944,
-0.1496036797761917,
-0.10616438090801239,
0.0699567198753357,
0.1888570636510849,
0.0998668447136879,
-0.007991187274456024,
-0.05224577710032463,
0.04893102869391441,
-0.11673058569431305,
0.07781676203012466,
0.02493635192513466,
0.08096084743738174,
-0.14993394911289215,
0.18200404942035675,
0.01269743125885725,
0.053227558732032776,
-0.025968994945287704,
0.03218986093997955,
-0.11967368423938751,
0.039954718202352524,
-0.11289042234420776,
-0.03800061345100403,
-0.01616675779223442,
0.004554385785013437,
-0.013089364394545555,
-0.06262225657701492,
-0.06354238092899323,
0.005049035418778658,
-0.1285432130098343,
-0.021502336487174034,
0.046392858028411865,
0.02390572987496853,
-0.12624911963939667,
-0.03819253295660019,
0.026960773393511772,
-0.06340741366147995,
0.05564087629318237,
0.037264205515384674,
0.014254285953938961,
0.06623151898384094,
-0.17386221885681152,
-0.021297795698046684,
0.06955330818891525,
-0.008022291585803032,
0.06362996995449066,
-0.03716161102056503,
-0.025720521807670593,
-0.02875145524740219,
0.08824464678764343,
0.012661131098866463,
0.060562413185834885,
-0.1365700662136078,
0.0063349357806146145,
-0.03198300302028656,
-0.0920213833451271,
-0.05962100625038147,
0.05189242959022522,
0.06303446739912033,
0.036008838564157486,
0.16379819810390472,
-0.08233678340911865,
0.044915471225976944,
-0.21772152185440063,
-0.015558636747300625,
0.002178011927753687,
-0.10807916522026062,
-0.08314941078424454,
-0.07301981002092361,
0.08241410553455353,
-0.07647397369146347,
0.10973040014505386,
0.03664131090044975,
0.0661936029791832,
0.03216363862156868,
-0.03265561908483505,
-0.004617665428668261,
0.03464195132255554,
0.21221937239170074,
0.01298568956553936,
-0.03276941552758217,
0.0901692807674408,
0.08071177452802658,
0.09991906583309174,
0.13773974776268005,
0.22820544242858887,
0.15616349875926971,
-0.026854297146201134,
0.08934786170721054,
0.05135384574532509,
-0.06474989652633667,
-0.17186203598976135,
0.03889846056699753,
-0.05224683880805969,
0.099118672311306,
-0.06244885176420212,
0.20340517163276672,
0.08861441910266876,
-0.18354810774326324,
0.06669668108224869,
-0.04530829191207886,
-0.10087841749191284,
-0.0812583789229393,
-0.037305671721696854,
-0.06999137997627258,
-0.14920377731323242,
0.02645428292453289,
-0.10181604325771332,
0.04320421442389488,
0.14922599494457245,
0.01070871390402317,
-0.011801445856690407,
0.21418018639087677,
0.03570457920432091,
0.03550020977854729,
0.05908380448818207,
0.0128675801679492,
-0.030569670721888542,
-0.09461183100938797,
-0.05887708440423012,
0.018119877204298973,
-0.03218993544578552,
0.017130691558122635,
-0.07026086747646332,
-0.07863885164260864,
0.02655067667365074,
0.005168697331100702,
-0.09354639798402786,
0.02409343421459198,
0.020586226135492325,
0.08882627636194229,
0.02705151028931141,
0.005931399762630463,
0.0164275374263525,
-0.028490591794252396,
0.24819067120552063,
-0.09352385997772217,
-0.08324048668146133,
-0.08338063210248947,
0.21792460978031158,
0.03254571929574013,
0.0018683957168832421,
0.00878078956156969,
-0.0806100144982338,
0.010932651348412037,
0.22971895337104797,
0.1703924834728241,
-0.13070552051067352,
-0.009424680843949318,
0.0006825320306234062,
0.0012693514581769705,
-0.02992149442434311,
0.11615455150604248,
0.12083893269300461,
0.05206131935119629,
-0.11449939012527466,
-0.04928244650363922,
-0.05235479027032852,
-0.019191011786460876,
-0.024873929098248482,
0.050549473613500595,
0.06832662224769592,
0.02183319814503193,
-0.06875406950712204,
0.07692699879407883,
-0.05848773941397667,
-0.14061667025089264,
0.10345496237277985,
-0.22643421590328217,
-0.15574651956558228,
-0.007475243415683508,
0.12206802517175674,
0.004799033049494028,
0.05937625467777252,
-0.04112417250871658,
0.0005423026159405708,
0.04994497448205948,
-0.004666462074965239,
-0.07802581042051315,
-0.10789608210325241,
0.08630269765853882,
-0.11553428322076797,
0.21665872633457184,
-0.05909910053014755,
0.03331036493182182,
0.11367577314376831,
0.06366783380508423,
-0.049621328711509705,
0.055216234177351,
0.041512325406074524,
-0.12412609159946442,
-0.005244002677500248,
0.12472046911716461,
-0.03893563151359558,
0.05589040741324425,
0.03174397349357605,
-0.13107264041900635,
0.033217981457710266,
-0.057220641523599625,
-0.04109598323702812,
-0.02814982272684574,
-0.04958943650126457,
-0.062039416283369064,
0.11504364758729935,
0.20804281532764435,
-0.007590585853904486,
0.025024687871336937,
-0.0879189670085907,
0.016961975023150444,
0.06584127247333527,
0.04996630549430847,
-0.07944443821907043,
-0.21632152795791626,
0.007658934686332941,
0.0709305852651596,
-0.04062234237790108,
-0.20306815207004547,
-0.11251303553581238,
0.03686079382896423,
-0.0541410855948925,
-0.07096908986568451,
0.08896335959434509,
0.09363124519586563,
0.05647851899266243,
-0.05502629280090332,
-0.10803864151239395,
-0.05842435732483864,
0.1705980896949768,
-0.1462692767381668,
-0.07700590789318085
] |
null | null | null | This is a word2vec model trained on CC100 Georgian dataset. | {"language": ["ka"], "license": "mit"} | null | davmel/ka_word2vec | [
"ka",
"license:mit",
"region:us"
] | 2024-02-12T13:23:58+00:00 | [] | [
"ka"
] | TAGS
#ka #license-mit #region-us
| This is a word2vec model trained on CC100 Georgian dataset. | [] | [
"TAGS\n#ka #license-mit #region-us \n"
] | [
13
] | [
"passage: TAGS\n#ka #license-mit #region-us \n"
] | [
0.04662298783659935,
-0.1562400460243225,
-0.008024810813367367,
-0.05871446058154106,
0.07307320833206177,
0.05578330159187317,
0.13335871696472168,
0.03784003481268883,
0.21340487897396088,
-0.02969779632985592,
0.14734697341918945,
0.13976621627807617,
0.04501339793205261,
0.0539342500269413,
0.006310745142400265,
-0.17808355391025543,
0.059474699199199677,
-0.04938209801912308,
0.11440765857696533,
0.03029872104525566,
-0.016629088670015335,
-0.04740922525525093,
0.01185351237654686,
-0.03143475949764252,
-0.05654323473572731,
-0.0020379596389830112,
0.057623490691185,
-0.062112968415021896,
0.12371741980314255,
-0.00048435566714033484,
0.12562060356140137,
0.019429774954915047,
-0.016860920935869217,
-0.26509571075439453,
0.016934188082814217,
-0.08575866371393204,
-0.084803506731987,
-0.0207230132073164,
-0.03297685831785202,
-0.020426014438271523,
0.019425755366683006,
0.07102543115615845,
-0.03644143044948578,
0.047474026679992676,
-0.18575771152973175,
-0.19913266599178314,
-0.07001473009586334,
-0.06158751621842384,
0.09867946058511734,
0.022249430418014526,
0.030648911371827126,
0.10509175062179565,
-0.1861112117767334,
-0.0022120983339846134,
0.030292566865682602,
-0.3298361599445343,
0.07362022995948792,
0.1625571846961975,
0.04783118888735771,
0.08586452901363373,
-0.049112237989902496,
0.09486685693264008,
0.10148447751998901,
-0.06949936598539352,
-0.08658549934625626,
-0.06371773779392242,
0.0298402588814497,
0.14978300034999847,
-0.03640984371304512,
-0.07275397330522537,
0.2491542547941208,
-0.01533625926822424,
-0.027922818437218666,
0.06658071279525757,
0.014215238392353058,
-0.03311753645539284,
0.03340531513094902,
0.05748958885669708,
0.011408081278204918,
0.11034609377384186,
0.1643989533185959,
-0.03637859597802162,
-0.1606951653957367,
-0.033190786838531494,
-0.24347636103630066,
0.18153703212738037,
0.03897567465901375,
0.09786791354417801,
-0.16153380274772644,
-0.022604286670684814,
-0.15823008120059967,
-0.02614995464682579,
-0.02771672233939171,
-0.06605345755815506,
0.048739731311798096,
-0.006409234832972288,
0.006340658292174339,
0.09531746804714203,
0.10681278258562088,
0.11370629072189331,
0.07936083525419235,
0.011013476178050041,
-0.08283001184463501,
0.15148532390594482,
0.04052182659506798,
0.07662340998649597,
0.20211166143417358,
0.09297364950180054,
-0.03662088140845299,
-0.16256077587604523,
0.03924953192472458,
-0.053908150643110275,
-0.11216695606708527,
0.02558170258998871,
-0.21897916495800018,
0.18041135370731354,
-0.07343210279941559,
-0.0334116667509079,
-0.06486783176660538,
0.1052396148443222,
0.030196866020560265,
-0.019341453909873962,
-0.05105309188365936,
-0.020828714594244957,
0.046152323484420776,
0.02128533646464348,
-0.1546824425458908,
0.014864097349345684,
0.1042904257774353,
0.14682704210281372,
-0.12330707162618637,
0.0012524907942861319,
0.004715954419225454,
0.02641361951828003,
0.07672818005084991,
-0.12003809213638306,
0.05205646902322769,
-0.150715172290802,
-0.06420435011386871,
0.022420428693294525,
0.005484003573656082,
-0.0021154924761503935,
0.08533583581447601,
0.06612139940261841,
0.02957686223089695,
-0.025610506534576416,
-0.07070361822843552,
-0.13278380036354065,
-0.08874808996915817,
0.10776029527187347,
-0.029534291476011276,
0.03084704466164112,
-0.16649752855300903,
-0.01127874106168747,
-0.11822281032800674,
0.05476044490933418,
-0.06208899989724159,
-0.09048668295145035,
-0.13730455935001373,
0.19013965129852295,
0.009610038250684738,
0.04285002872347832,
-0.09511321783065796,
0.05537375062704086,
-0.0720856785774231,
0.07703039050102234,
-0.06810618937015533,
-0.07436838001012802,
0.1410922408103943,
-0.1171775683760643,
-0.1523454189300537,
0.008906285278499126,
0.005152746103703976,
0.14026989042758942,
0.08864759653806686,
0.46972694993019104,
-0.04961211606860161,
-0.1956491321325302,
0.08711335062980652,
0.16680602729320526,
-0.13645416498184204,
-0.24473877251148224,
0.1259588748216629,
-0.13636688888072968,
-0.2231345772743225,
0.02229580469429493,
-0.020669586956501007,
0.09762199968099594,
-0.03194300830364227,
-0.08270754665136337,
0.06380316615104675,
-0.0035247597843408585,
0.0304374061524868,
0.0058389827609062195,
0.06700140237808228,
-0.11007703095674515,
0.0864185243844986,
-0.0037152653094381094,
0.03524119406938553,
0.16234493255615234,
0.0073076821863651276,
-0.05221068859100342,
0.05160066485404968,
0.03377591818571091,
-0.001268032705411315,
0.010095265693962574,
-0.07617208361625671,
0.019413815811276436,
-0.055628564208745956,
0.06531307846307755,
0.0949072390794754,
-0.007754092570394278,
-0.05521652474999428,
0.03485782444477081,
0.013814606703817844,
0.08692586421966553,
0.08066953718662262,
0.012505792081356049,
-0.059720247983932495,
0.09733070433139801,
-0.01548276748508215,
-0.11469962447881699,
-0.00257401866838336,
-0.030185170471668243,
0.07108961045742035,
-0.11340492963790894,
-0.03584059700369835,
0.0924064889550209,
-0.037826020270586014,
0.0405447781085968,
0.052875105291604996,
0.04214194416999817,
0.1278877854347229,
-0.01997085101902485,
-0.0387677364051342,
0.23772747814655304,
-0.0026159812696278095,
0.10120764374732971,
0.17807088792324066,
-0.0350249707698822,
0.009818825870752335,
-0.05466485768556595,
-0.0024546703789383173,
-0.027799002826213837,
0.0343799963593483,
-0.035038724541664124,
-0.037955332547426224,
-0.015206382609903812,
0.0005491705960594118,
-0.01849917322397232,
0.004222479648888111,
-0.026551106944680214,
-0.09070175886154175,
-0.10245400667190552,
0.05374767631292343,
0.1866692155599594,
-0.19295577704906464,
0.1520773321390152,
0.4162115752696991,
0.12501372396945953,
0.2604975700378418,
-0.12704290449619293,
0.0013230008771643043,
-0.06963832676410675,
0.026546727865934372,
-0.023931870236992836,
0.18390384316444397,
-0.10703377425670624,
-0.04924760013818741,
0.019411971792578697,
0.00263756955973804,
0.04067317396402359,
-0.13538892567157745,
-0.16917599737644196,
-0.026097385212779045,
-0.003921163734048605,
-0.11944599449634552,
0.10792914777994156,
-0.09031855314970016,
0.029060542583465576,
0.04362604767084122,
-0.08909287303686142,
0.12802916765213013,
-0.016359493136405945,
-0.012170111760497093,
0.053331539034843445,
-0.16857455670833588,
-0.02357722446322441,
-0.13944639265537262,
-0.14567670226097107,
0.04979589581489563,
0.004141575191169977,
0.07505180686712265,
-0.09815600514411926,
-0.009674640372395515,
0.08841834962368011,
0.021899953484535217,
-0.157120943069458,
-0.06534849852323532,
-0.06187417358160019,
0.05643469840288162,
-0.15304601192474365,
-0.0465400405228138,
-0.08559395372867584,
-0.05762593820691109,
-0.01582508347928524,
0.08367624133825302,
-0.08240467309951782,
0.10539180785417557,
0.15669946372509003,
0.04044397175312042,
0.027634935453534126,
-0.08436001092195511,
0.07074693590402603,
-0.06919445097446442,
-0.07064402848482132,
0.06950391083955765,
0.015955360606312752,
0.06506650894880295,
0.2526175379753113,
0.06440968811511993,
-0.06242327764630318,
-0.014258993789553642,
-0.0552247054874897,
-0.16273096203804016,
-0.2662905156612396,
-0.04813802242279053,
-0.07373720407485962,
0.19943657517433167,
-0.059888459742069244,
0.06400750577449799,
0.09944232553243637,
0.05453398451209068,
0.10883748531341553,
-0.09562370181083679,
-0.03951362892985344,
0.028152279555797577,
0.24323095381259918,
-0.06641083210706711,
0.027158638462424278,
-0.14011737704277039,
-0.03741026297211647,
0.13885706663131714,
0.12826548516750336,
0.10129527747631073,
0.24994434416294098,
0.08070576936006546,
0.17606261372566223,
0.14491303265094757,
0.12203554809093475,
0.008583850227296352,
0.027705758810043335,
-0.013844184577465057,
-0.044469673186540604,
-0.004129877779632807,
0.023390021175146103,
0.07192996889352798,
0.05381879583001137,
-0.23209375143051147,
0.049397021532058716,
-0.22894997894763947,
0.0500335656106472,
-0.0768100693821907,
0.13372385501861572,
0.042365819215774536,
0.0898599699139595,
0.1350952833890915,
0.050768934190273285,
-0.011064651422202587,
0.17126548290252686,
0.008540913462638855,
-0.11335247755050659,
0.019865330308675766,
0.043534573167562485,
0.04539445787668228,
0.09942609816789627,
0.07397061586380005,
-0.05190841481089592,
-0.16148310899734497,
0.008289700374007225,
0.11535058170557022,
-0.2195344865322113,
0.28695833683013916,
0.01690981723368168,
-0.04509327933192253,
-0.03619441017508507,
-0.08063696324825287,
0.04025398567318916,
0.0672890916466713,
0.11260741204023361,
0.06982016563415527,
-0.3054905831813812,
-0.2245437651872635,
0.016780748963356018,
0.02231977880001068,
0.10612505674362183,
-0.014453417621552944,
-0.1432153284549713,
-0.06032539904117584,
0.06983280926942825,
-0.0492873378098011,
0.14001762866973877,
-0.04274846613407135,
-0.03432995453476906,
-0.013186998665332794,
0.08061697334051132,
-0.03320442885160446,
-0.032119303941726685,
0.047694213688373566,
-0.006697532255202532,
-0.020906619727611542,
-0.013139775022864342,
-0.004173729568719864,
-0.06140708923339844,
-0.31774622201919556,
-0.007554000709205866,
-0.08329081535339355,
-0.0070104519836604595,
-0.058060821145772934,
-0.17209002375602722,
-0.1149817630648613,
-0.13600492477416992,
0.0887032300233841,
-0.014043817296624184,
0.05898378789424896,
-0.035792771726846695,
0.15638935565948486,
-0.06329671293497086,
0.040435124188661575,
-0.0049284519627690315,
0.04266795516014099,
-0.011186651885509491,
-0.11485980451107025,
0.12265289574861526,
-0.13856947422027588,
0.021029213443398476,
0.03293056786060333,
-0.06990247964859009,
-0.05553796887397766,
-0.044010668992996216,
-0.13243737816810608,
0.19261854887008667,
0.34386056661605835,
-0.007603890728205442,
0.22745861113071442,
0.3652937114238739,
-0.1383175104856491,
-0.2550698220729828,
-0.16583268344402313,
-0.31094780564308167,
-0.05026507005095482,
0.17901752889156342,
-0.1588669717311859,
0.04944232106208801,
0.14749737083911896,
-0.12194661796092987,
0.20895758271217346,
-0.23748649656772614,
-0.05872934311628342,
0.18520456552505493,
-0.01760212890803814,
0.49894434213638306,
-0.10326229780912399,
-0.14446142315864563,
-0.03318008780479431,
-0.07978343963623047,
0.09731000661849976,
-0.03355099633336067,
0.08143935352563858,
0.0026973241474479437,
0.02893068827688694,
-0.004141704645007849,
-0.023039182648062706,
0.20507429540157318,
0.035500288009643555,
0.06805090606212616,
-0.1114174947142601,
-0.2114822119474411,
0.2206110954284668,
-0.027711518108844757,
0.0042048790492117405,
-0.08765134960412979,
-0.03357243537902832,
-0.10110847651958466,
0.02799224480986595,
-0.05669180676341057,
0.0547763854265213,
0.02570851892232895,
-0.11567216366529465,
-0.15343520045280457,
0.0581517368555069,
-0.11344702541828156,
-0.048551179468631744,
0.22290652990341187,
0.006600338965654373,
0.10718430578708649,
-0.01350021455436945,
-0.09173593670129776,
0.00897009763866663,
0.038134343922138214,
-0.0922432541847229,
-0.08203572034835815,
0.09509959816932678,
-0.15187154710292816,
-0.05887697637081146,
0.13781802356243134,
-0.032770030200481415,
0.06760214269161224,
0.04157891124486923,
-0.11871879547834396,
0.057420819997787476,
0.12756949663162231,
-0.06329713016748428,
-0.11925800889730453,
0.04553932696580887,
-0.013379893265664577,
0.21611225605010986,
-0.0014617942506447434,
0.019013963639736176,
0.054708000272512436,
0.03766358643770218,
-0.01618354581296444,
0.014922100119292736,
-0.09378862380981445,
-0.036007896065711975,
0.053738173097372055,
0.002145508537068963,
-0.06752073019742966,
0.11800877004861832,
0.04258142411708832,
0.032779447734355927,
-0.01832764968276024,
0.062428951263427734,
-0.04057243466377258,
-0.07833532243967056,
-0.13195344805717468,
0.0011149394558742642,
-0.07932478189468384,
-0.0850963220000267,
0.11770971864461899,
-0.11765411496162415,
-0.05636414885520935,
0.1522468775510788,
0.026395272463560104,
0.12526261806488037,
0.04359298199415207,
-0.010471262037754059,
0.13545271754264832,
-0.0256954338401556,
-0.2071976363658905,
0.0018794442294165492,
-0.07281870394945145,
-0.10932707786560059,
-0.001683413633145392,
0.13946057856082916,
-0.060803014785051346,
-0.10558038204908371,
-0.21434888243675232,
0.09456854313611984,
-0.11671408265829086,
-0.02262185700237751,
-0.11345913261175156,
-0.06088830530643463,
0.053902752697467804,
-0.029296576976776123,
-0.06753600388765335,
-0.05591321736574173,
-0.14819224178791046,
0.05963706225156784,
0.07755481451749802,
0.14627918601036072,
-0.052029699087142944,
-0.05755556374788284,
0.14158779382705688,
0.04044242948293686,
0.06918217241764069,
0.08294087648391724,
0.06133995205163956,
0.1946103721857071,
-0.0879577249288559,
0.007951115258038044,
0.09519247710704803,
-0.012682954780757427,
0.013884059153497219,
0.07774815708398819,
-0.05262492969632149,
0.020986657589673996,
-0.029455455020070076,
0.08449288457632065,
-0.1281556934118271,
-0.12229777872562408,
-0.05430274084210396,
0.03869778290390968,
-0.21957330405712128,
0.045270320028066635,
-0.16762800514698029,
0.1603700816631317,
-0.014946609735488892,
0.10077998787164688,
0.07024253904819489,
0.049473706632852554,
0.06741180270910263,
0.00811739917844534,
-0.009375246241688728,
-0.12910208106040955,
-0.0011619220022112131,
-0.08456599712371826,
-0.1068166047334671,
-0.0501597560942173,
0.35608524084091187,
0.03359551727771759,
-0.1915825605392456,
0.055092860013246536,
0.1149105355143547,
0.004826549906283617,
0.019675223156809807,
0.21449242532253265,
0.06574378907680511,
-0.02770112454891205,
-0.1952553391456604,
0.021136783063411713,
-0.1052313819527626,
-0.14545823633670807,
0.08525598794221878,
0.08024612814188004,
0.10421343892812729,
0.020119473338127136,
0.1067986711859703,
-0.041118472814559937,
-0.020526474341750145,
-0.17739218473434448,
0.011788375675678253,
-0.015412203967571259,
0.026930058375000954,
0.09183410555124283,
0.19951577484607697,
-0.03147312253713608,
0.03246663510799408,
-0.10490676760673523,
-0.032646600157022476,
-0.1703505963087082,
-0.17107164859771729,
0.04620024189352989,
-0.03671404719352722,
0.0790218859910965,
0.07413724064826965,
0.028472693637013435,
0.2923215925693512,
0.038452621549367905,
-0.040136877447366714,
0.06905336678028107,
-0.1120644211769104,
-0.07444192469120026,
0.016890626400709152,
-0.02786833979189396,
-0.013875806704163551,
-0.13527671992778778,
-0.08857232332229614,
-0.07544831931591034,
-0.1634308248758316,
-0.07016162574291229,
0.021698132157325745,
-0.05650779604911804,
-0.030325384810566902,
-0.16973388195037842,
-0.04892585799098015,
-0.05459575727581978,
0.09066718071699142,
-0.007489144802093506,
0.09268747270107269,
0.028526142239570618,
0.03241739794611931,
0.06702002882957458,
0.06898864358663559,
0.05344186723232269,
-0.01866883598268032,
0.06208580359816551,
0.06267309933900833,
-0.014811228029429913,
0.09518132358789444,
-0.06710624694824219,
0.025259604677557945,
-0.0280500166118145,
0.1817840039730072,
0.251807302236557,
0.00675539392977953,
0.0018201200291514397,
-0.06320375949144363,
0.026596253737807274,
0.08341284096240997,
0.2073134481906891,
-0.03757203742861748,
0.21948370337486267,
-0.05920792371034622,
0.045294519513845444,
-0.04035895690321922,
-0.013886122032999992,
-0.06631749868392944,
0.05558416247367859,
0.06643210351467133,
-0.088894784450531,
-0.09472814947366714,
0.13389001786708832,
-0.19369639456272125,
0.13624122738838196,
0.14099594950675964,
-0.08748848736286163,
0.003106537973508239,
-0.00846429355442524,
0.19025298953056335,
-0.021516166627407074,
0.09765060991048813,
-0.09632234275341034,
-0.11835373193025589,
-0.17244262993335724,
0.05555199086666107,
-0.31219059228897095,
-0.1493690013885498,
0.1055484414100647,
0.18635082244873047,
0.159046471118927,
-0.0020490491297096014,
0.04899170994758606,
0.006156246643513441,
0.0653718113899231,
-0.014633335173130035,
0.19702574610710144,
0.06269877403974533,
-0.005037630442529917,
-0.16190220415592194,
-0.15814577043056488,
-0.0009162997012026608,
-0.0855892077088356,
0.10137481242418289,
0.004857033956795931,
0.0327199250459671,
0.06542843580245972,
-0.07258812338113785,
-0.029283449053764343,
0.027490807697176933,
-0.16831561923027039,
0.05896387994289398,
-0.006096852011978626,
0.023798229172825813,
-0.054140083491802216,
-0.009344429709017277,
-0.06035307049751282,
0.07267584651708603,
-0.20074546337127686,
-0.07089012116193771,
0.14335359632968903,
-0.07033578306436539,
0.09208258241415024,
0.005340466741472483,
-0.09330546855926514,
0.0023167410399764776,
-0.06230110302567482,
0.12903550267219543,
-0.13336557149887085,
0.03529646247625351,
0.1112222820520401,
-0.009681382216513157,
0.019864268600940704,
-0.28267979621887207,
0.036728110164403915,
-0.033206433057785034,
-0.028169604018330574,
-0.04895680770277977
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
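Since the card leaves this section empty, the snippet below is only a hedged sketch of the standard `transformers` text-generation workflow. The repo id `anupk/akmixtral-v1` is taken from this card's metadata; whether the tokenizer ships a chat template, and whether the "4-bit" tag means a quantization config is stored with the weights, are assumptions.

```python
# Hedged sketch, not provided by the model author: a generic transformers workflow.
# Assumptions: the repo id from the card metadata, a tokenizer chat template, and
# (per the "4-bit" tag) a quantization config stored alongside the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "anupk/akmixtral-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Explain what a mixture-of-experts model is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```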
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | anupk/akmixtral-v1 | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | 2024-02-12T13:24:11+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mixtral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
63,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.04221513494849205,
0.184287428855896,
-0.00541354576125741,
0.01813345029950142,
0.09975318610668182,
0.006727275438606739,
0.05537417531013489,
0.11690112948417664,
-0.05280387029051781,
0.12401485443115234,
0.04123157635331154,
0.10999780148267746,
0.11695726215839386,
0.1439584195613861,
-0.0016006447840481997,
-0.2171592116355896,
0.04970335587859154,
-0.10745643824338913,
-0.012722083367407322,
0.12266865372657776,
0.14661581814289093,
-0.10084246844053268,
0.0690428614616394,
-0.035281796008348465,
-0.019668355584144592,
-0.03666619211435318,
-0.06200377643108368,
-0.04225201904773712,
0.04082702472805977,
0.05657738447189331,
0.06525418162345886,
0.0007988896104507148,
0.08292273432016373,
-0.27886417508125305,
0.01916610635817051,
0.0690983384847641,
-0.0026473698671907187,
0.0653543770313263,
0.06614711880683899,
-0.0642068162560463,
0.10732404887676239,
-0.053686823695898056,
0.13770683109760284,
0.08448706567287445,
-0.0933433324098587,
-0.1807156652212143,
-0.09203960746526718,
0.10945519059896469,
0.1761799156665802,
0.05038842186331749,
-0.027028262615203857,
0.10480218380689621,
-0.08191276341676712,
0.01856868714094162,
0.04882891848683357,
-0.0887085348367691,
-0.056460484862327576,
0.06749644875526428,
0.09106393158435822,
0.054793208837509155,
-0.126346156001091,
-0.03368553891777992,
0.0055172317661345005,
0.016711445525288582,
0.07341105490922928,
0.021553555503487587,
0.14834244549274445,
0.03507466986775398,
-0.1330583095550537,
-0.05186847597360611,
0.10549769550561905,
0.04068231210112572,
-0.040558721870183945,
-0.2435608059167862,
-0.027402730658650398,
-0.025094103068113327,
-0.034207601100206375,
-0.04436539113521576,
0.04032588005065918,
-0.002683435333892703,
0.08578626811504364,
-0.007898394018411636,
-0.07299083471298218,
-0.03298328444361687,
0.0659714788198471,
0.06374184787273407,
0.029039599001407623,
-0.019903002306818962,
0.023135339841246605,
0.10607591271400452,
0.09508808702230453,
-0.11632565408945084,
-0.058000531047582626,
-0.06565536558628082,
-0.07209265977144241,
-0.04211447387933731,
0.03442756459116936,
0.012441841885447502,
0.06999391317367554,
0.2545918822288513,
0.01630474627017975,
0.05792796239256859,
0.029032578691840172,
0.007418633438646793,
0.05208069086074829,
0.10441845655441284,
-0.06236622482538223,
-0.1142968013882637,
-0.01992841437458992,
0.08917800337076187,
0.020353268831968307,
-0.03652595356106758,
-0.04742337018251419,
0.06647280603647232,
0.04289766401052475,
0.11000671237707138,
0.09759560227394104,
0.02146090194582939,
-0.07399335503578186,
-0.06239839270710945,
0.1980348527431488,
-0.1560843288898468,
0.03896867483854294,
0.042537346482276917,
-0.035302191972732544,
-0.021802768111228943,
0.009326064959168434,
0.025014307349920273,
-0.033905092626810074,
0.086074098944664,
-0.054577551782131195,
-0.04565592110157013,
-0.1113087385892868,
-0.03316990286111832,
0.043823856860399246,
0.010432148352265358,
-0.0352739654481411,
-0.03673075884580612,
-0.07512396574020386,
-0.08497974276542664,
0.08706919848918915,
-0.06947311013936996,
-0.05725887417793274,
-0.026451431214809418,
-0.08210154622793198,
0.02368011139333248,
0.02095865271985531,
0.07270745187997818,
-0.025722766295075417,
0.05510387197136879,
-0.05191600322723389,
0.055096905678510666,
0.10204803943634033,
0.035042162984609604,
-0.05917169526219368,
0.05856769531965256,
-0.23029311001300812,
0.08601892739534378,
-0.06803197413682938,
0.06128280237317085,
-0.15735802054405212,
-0.02454591728746891,
0.034742921590805054,
0.004123088903725147,
-0.004979089833796024,
0.13348817825317383,
-0.20943224430084229,
-0.023575911298394203,
0.1675792634487152,
-0.09664381295442581,
-0.07183204591274261,
0.05180574581027031,
-0.047221288084983826,
0.10299770534038544,
0.033924151211977005,
0.0016977025661617517,
0.06376571953296661,
-0.10932669788599014,
-0.010861302725970745,
-0.0565115362405777,
-0.026496540755033493,
0.13796129822731018,
0.07464733719825745,
-0.07794016599655151,
0.06307749450206757,
0.022317413240671158,
-0.020236613228917122,
-0.0635935366153717,
-0.01833304576575756,
-0.10035751014947891,
0.016148116439580917,
-0.06738708168268204,
0.01297660730779171,
-0.017415180802345276,
-0.09428401291370392,
-0.030845720320940018,
-0.16816076636314392,
-0.02979908511042595,
0.08065250515937805,
-0.004283347632735968,
-0.01463167555630207,
-0.11198673397302628,
0.023750487715005875,
0.03302661329507828,
0.004127424210309982,
-0.13246795535087585,
-0.03694915026426315,
0.03416776657104492,
-0.1499258577823639,
0.037259262055158615,
-0.0717690959572792,
0.05251135677099228,
0.015435646288096905,
-0.029025115072727203,
-0.026191391050815582,
0.02241940051317215,
0.009188603609800339,
-0.0158093199133873,
-0.237104132771492,
-0.02543395571410656,
-0.029634976759552956,
0.16371376812458038,
-0.20647861063480377,
0.03469964861869812,
0.07984067499637604,
0.15713369846343994,
0.0041772471740841866,
-0.05103093013167381,
0.01886747032403946,
-0.0689789205789566,
-0.024611005559563637,
-0.055689942091703415,
0.0031509981490671635,
-0.017811978235840797,
-0.045525819063186646,
0.028546759858727455,
-0.17733623087406158,
-0.04698030278086662,
0.09794966131448746,
0.04618212953209877,
-0.1270289570093155,
-0.025627359747886658,
-0.036636319011449814,
-0.051987387239933014,
-0.04109248146414757,
-0.062059417366981506,
0.09966698288917542,
0.06159719452261925,
0.038986023515462875,
-0.06061495468020439,
-0.07946256548166275,
-0.004432924557477236,
-0.01510956883430481,
-0.02534763514995575,
0.09422753006219864,
0.07688393443822861,
-0.13348789513111115,
0.09380457550287247,
0.08497519791126251,
0.077485091984272,
0.08916220813989639,
-0.02106413058936596,
-0.07364203035831451,
-0.03753411024808884,
0.037760764360427856,
0.019331656396389008,
0.12313684076070786,
-0.0403442345559597,
0.04381752386689186,
0.04058803245425224,
-0.026602068915963173,
0.017947345972061157,
-0.07871748507022858,
0.0343768335878849,
0.022591181099414825,
-0.015913113951683044,
0.05186452716588974,
-0.03760656341910362,
0.020155183970928192,
0.08793376386165619,
0.058310605585575104,
0.0435023196041584,
0.01572067104279995,
-0.05249664559960365,
-0.11178376525640488,
0.15832744538784027,
-0.1254170835018158,
-0.21833674609661102,
-0.1318792849779129,
0.01056607160717249,
0.025977717712521553,
-0.014864957891404629,
0.004777961410582066,
-0.060105256736278534,
-0.10902713239192963,
-0.09231965243816376,
0.006923808250576258,
0.05533609911799431,
-0.08361107856035233,
-0.05992421507835388,
0.0466950386762619,
0.041529107838869095,
-0.14265857636928558,
0.02103171870112419,
0.04252464324235916,
-0.08993346989154816,
-0.01037062518298626,
0.07844703644514084,
0.0757763683795929,
0.18556450307369232,
0.020603220909833908,
-0.019793877378106117,
0.029633264988660812,
0.2210925817489624,
-0.1358625292778015,
0.11243002116680145,
0.1331499218940735,
-0.08656672388315201,
0.08076657354831696,
0.2098151296377182,
0.041653964668512344,
-0.09678150713443756,
0.031032631173729897,
0.029492780566215515,
-0.022830050438642502,
-0.23440127074718475,
-0.07088205218315125,
-0.0015115659916773438,
-0.06652486324310303,
0.07880941033363342,
0.09516941010951996,
0.07647950947284698,
0.017290078103542328,
-0.0953589528799057,
-0.09145430475473404,
0.060411155223846436,
0.10859961062669754,
0.01645820401608944,
-0.007976547814905643,
0.0887393057346344,
-0.03546225279569626,
0.014248249121010303,
0.08671107888221741,
0.003626003162935376,
0.16089561581611633,
0.04911671206355095,
0.17880931496620178,
0.08424027264118195,
0.07261724025011063,
0.0031940436456352472,
0.008434025570750237,
0.01300175953656435,
0.041327159851789474,
-0.005564264953136444,
-0.08350218832492828,
-0.026614299044013023,
0.1086716279387474,
0.06851482391357422,
0.01911756955087185,
0.01351115107536316,
-0.04823964461684227,
0.08722595870494843,
0.17843492329120636,
0.0011667930521070957,
-0.18133501708507538,
-0.05884885787963867,
0.07403749227523804,
-0.09703923761844635,
-0.10217505693435669,
-0.009492456912994385,
0.016711611300706863,
-0.1657785028219223,
0.03710712864995003,
-0.020444581285119057,
0.10862789303064346,
-0.13312694430351257,
-0.017684578895568848,
0.07561379671096802,
0.07036890089511871,
-0.002324859146028757,
0.05715097859501839,
-0.18029482662677765,
0.09946050494909286,
0.01222510077059269,
0.0713859349489212,
-0.09676310420036316,
0.09160122275352478,
-0.009009894914925098,
-0.028643371537327766,
0.14034296572208405,
-0.00393899017944932,
-0.0735570639371872,
-0.06331590563058853,
-0.09252014011144638,
-0.011413493193686008,
0.12664973735809326,
-0.12715506553649902,
0.09184347093105316,
-0.03371168300509453,
-0.03572249785065651,
-0.010392389260232449,
-0.0874352976679802,
-0.11128740757703781,
-0.18001364171504974,
0.05886504054069519,
-0.1267988234758377,
0.03583109751343727,
-0.10564197599887848,
-0.025834258645772934,
-0.02566644735634327,
0.18152424693107605,
-0.23926568031311035,
-0.0720348209142685,
-0.14428266882896423,
-0.09259714186191559,
0.1311977356672287,
-0.04692009091377258,
0.08983000367879868,
-0.016000131145119667,
0.16034725308418274,
0.019729461520910263,
-0.020031020045280457,
0.08633604645729065,
-0.08478595316410065,
-0.1944543421268463,
-0.06976234912872314,
0.16522546112537384,
0.11978457868099213,
0.03413308411836624,
0.0005305535742081702,
0.037522342056035995,
-0.02053103968501091,
-0.11814063787460327,
0.02148975431919098,
0.15172436833381653,
0.06668060272932053,
0.011740060523152351,
-0.02492174133658409,
-0.1116826981306076,
-0.07566066086292267,
-0.028715990483760834,
0.03360319137573242,
0.17093613743782043,
-0.07131344825029373,
0.1709417849779129,
0.1420736014842987,
-0.05928913876414299,
-0.2076146900653839,
-0.00025488235405646265,
0.027109205722808838,
-0.008449733257293701,
0.011266698129475117,
-0.18693651258945465,
0.08480579406023026,
-0.0034061591140925884,
-0.05366044491529465,
0.10366818308830261,
-0.16269411146640778,
-0.13774904608726501,
0.08198150992393494,
0.05098048225045204,
-0.18582311272621155,
-0.13527271151542664,
-0.09568615257740021,
-0.04171176627278328,
-0.16096943616867065,
0.0937938243150711,
0.020575474947690964,
0.012337464839220047,
0.03146562725305557,
0.013561155647039413,
0.02399645745754242,
-0.04811491072177887,
0.17489318549633026,
-0.018814681097865105,
0.02325345389544964,
-0.09529950469732285,
-0.0796426311135292,
0.018070273101329803,
-0.0504426434636116,
0.07173099368810654,
-0.01860329508781433,
0.011446788907051086,
-0.10249658674001694,
-0.036428485065698624,
-0.0424206368625164,
0.016606872901320457,
-0.09855442494153976,
-0.0854218602180481,
-0.0471508763730526,
0.09432252496480942,
0.09617727249860764,
-0.023518286645412445,
-0.027280345559120178,
-0.07908966392278671,
0.05642307922244072,
0.20724594593048096,
0.18812845647335052,
0.04271014779806137,
-0.061882440000772476,
-0.0007074945024214685,
-0.014857797883450985,
0.042021650820970535,
-0.19539381563663483,
0.05957552045583725,
0.05593441054224968,
0.020818915218114853,
0.10569853335618973,
-0.018852733075618744,
-0.15716201066970825,
-0.0767967477440834,
0.06931815296411514,
-0.06405216455459595,
-0.20057755708694458,
0.010168159380555153,
0.058348797261714935,
-0.17509421706199646,
-0.03968575969338417,
0.046998072415590286,
-0.0030537894926965237,
-0.038909200578927994,
0.02336076647043228,
0.0943564847111702,
0.004098820965737104,
0.07767555862665176,
0.0698731541633606,
0.08152256906032562,
-0.09939708560705185,
0.08329307287931442,
0.09752772748470306,
-0.07524976879358292,
0.028848174959421158,
0.10165023058652878,
-0.05759456008672714,
-0.03906384855508804,
0.03566185012459755,
0.08366306126117706,
0.028924236074090004,
-0.04381433501839638,
0.011427318677306175,
-0.09722886979579926,
0.06626088917255402,
0.10108906775712967,
0.02990528754889965,
0.01752258837223053,
0.043706659227609634,
0.04610227048397064,
-0.07610015571117401,
0.12442032247781754,
0.03222634270787239,
0.014953792095184326,
-0.04057202488183975,
-0.04249828681349754,
0.009985716082155704,
-0.031441830098629,
-0.004838535562157631,
-0.0224502794444561,
-0.08733663707971573,
-0.015615971758961678,
-0.1300002485513687,
-0.009049749933183193,
-0.0620594322681427,
0.013472124002873898,
0.028659816831350327,
-0.031356848776340485,
0.0050497776828706264,
0.00556915020570159,
-0.070175401866436,
-0.0676659420132637,
-0.013332669623196125,
0.0951605886220932,
-0.165130153298378,
0.02589079737663269,
0.08305386453866959,
-0.11233886331319809,
0.09956564009189606,
0.01177335437387228,
-0.006499884650111198,
0.02282649278640747,
-0.1472083181142807,
0.03567599132657051,
-0.037067241966724396,
0.00929268728941679,
0.02333861030638218,
-0.20213311910629272,
0.001025883830152452,
-0.033243946731090546,
-0.07056321948766708,
-0.008617796003818512,
-0.024491773918271065,
-0.11184563487768173,
0.1056598424911499,
-0.0007291476940736175,
-0.08149659633636475,
-0.029699476435780525,
0.03239017724990845,
0.0781724825501442,
-0.027899043634533882,
0.15244191884994507,
-0.014194467104971409,
0.06529146432876587,
-0.15931859612464905,
-0.012150387279689312,
-0.01064259372651577,
0.01446043886244297,
-0.038343921303749084,
-0.007628817111253738,
0.051220040768384933,
-0.0141983050853014,
0.17392969131469727,
-0.03533218801021576,
0.014612596482038498,
0.06651913374662399,
0.04345446452498436,
-0.034696172922849655,
0.09892062842845917,
0.05123850330710411,
0.016616137698292732,
0.008681890554726124,
0.01373197790235281,
-0.041807834059000015,
-0.036241836845874786,
-0.18902859091758728,
0.07016544789075851,
0.1871395707130432,
0.0983971655368805,
-0.021315084770321846,
0.07314375787973404,
-0.09926417469978333,
-0.09135987609624863,
0.1513443887233734,
-0.036852844059467316,
-0.006045074667781591,
-0.07252927869558334,
0.1307610124349594,
0.1448325663805008,
-0.18091213703155518,
0.06699461489915848,
-0.07125783711671829,
-0.04240879788994789,
-0.10921358317136765,
-0.19331879913806915,
-0.0623009018599987,
-0.04963294044137001,
-0.018874214962124825,
-0.04811594635248184,
0.06601831316947937,
0.05830875784158707,
0.0013103733072057366,
-0.0083082290366292,
0.06900402158498764,
-0.03446618840098381,
-0.003237725468352437,
0.028336303308606148,
0.06025578826665878,
0.009017042815685272,
-0.037391647696495056,
0.01728014461696148,
-0.012124460190534592,
0.053606923669576645,
0.07629187405109406,
0.050742074847221375,
-0.02711794711649418,
0.01936633139848709,
-0.040714479982852936,
-0.10713635385036469,
0.0497349351644516,
-0.027561530470848083,
-0.07333123683929443,
0.15451441705226898,
0.020698266103863716,
0.0043359557166695595,
-0.012706978246569633,
0.23796072602272034,
-0.06257869303226471,
-0.10354099422693253,
-0.14519017934799194,
0.07478044182062149,
-0.04043815657496452,
0.050089918076992035,
0.03861495852470398,
-0.11048015207052231,
0.025532208383083344,
0.14805902540683746,
0.15562273561954498,
-0.04159191995859146,
0.023179002106189728,
0.034206323325634,
0.008626899681985378,
-0.02300102449953556,
0.03865930438041687,
0.06402948498725891,
0.1511281132698059,
-0.04722335562109947,
0.07980680465698242,
-0.0010522230295464396,
-0.087273508310318,
-0.037062808871269226,
0.11390886455774307,
-0.010565071366727352,
0.01396904420107603,
-0.058574456721544266,
0.11875217407941818,
-0.07046420872211456,
-0.21688689291477203,
0.03972739353775978,
-0.07080025970935822,
-0.13209335505962372,
-0.023885875940322876,
0.07631940394639969,
-0.012364840134978294,
0.021499641239643097,
0.07817210257053375,
-0.07095219194889069,
0.1888955533504486,
0.03878142312169075,
-0.058479487895965576,
-0.0510299876332283,
0.07379616051912308,
-0.07671606540679932,
0.29275667667388916,
0.015718603506684303,
0.04095849022269249,
0.11097828298807144,
-0.014666162431240082,
-0.14088360965251923,
0.024412285536527634,
0.09559007734060287,
-0.09700141102075577,
0.05309328809380531,
0.17917801439762115,
0.002087496453896165,
0.12748242914676666,
0.0774056613445282,
-0.08899977058172226,
0.04657258838415146,
-0.07473798841238022,
-0.0706852450966835,
-0.09825796633958817,
0.10449032485485077,
-0.08841956406831741,
0.1456042230129242,
0.12363041937351227,
-0.054907187819480896,
0.011366641148924828,
-0.034876853227615356,
0.04667789116501808,
-0.003957353066653013,
0.12085358798503876,
0.010115955024957657,
-0.19124247133731842,
0.026088077574968338,
-0.02669909968972206,
0.10137506574392319,
-0.16934356093406677,
-0.08768579363822937,
0.0443950779736042,
0.009959801100194454,
-0.07176091521978378,
0.1259419173002243,
0.058785971254110336,
0.030002396553754807,
-0.048616923391819,
-0.024574855342507362,
-0.010527990758419037,
0.14097049832344055,
-0.10342929512262344,
-0.004433069843798876
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b5_4
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2351
- Mean Iou: 0.4792
- Mean Accuracy: 0.5
- Overall Accuracy: 0.9584
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0
- Accuracy Undropoff: 1.0
- Iou Unlabeled: nan
- Iou Dropoff: 0.0
- Iou Undropoff: 0.9584
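These numbers describe a degenerate but internally consistent outcome: the model predicts "undropoff" for every pixel, so dropoff accuracy and IoU are 0 while undropoff IoU equals the overall accuracy. The sketch below is not taken from the training code; it only illustrates, with assumed pixel counts, how the reported per-class values and their means follow from a confusion matrix.

```python
# Illustrative only, with assumed pixel counts (any 4.16% / 95.84% class split behaves
# the same): when "undropoff" is predicted everywhere, the dropoff column is empty.
import numpy as np

# rows = ground truth, columns = prediction, classes = [dropoff, undropoff]
cm = np.array([[0,  416],     # dropoff pixels, all misclassified as undropoff
               [0, 9584]])    # undropoff pixels, all correct

intersection = np.diag(cm)
union = cm.sum(axis=0) + cm.sum(axis=1) - intersection
iou = intersection / np.maximum(union, 1)
recall = np.diag(cm) / cm.sum(axis=1)

print(iou)                    # [0.     0.9584] -> Iou Dropoff, Iou Undropoff
print(iou.mean())             # 0.4792          -> Mean Iou
print(recall.mean())          # 0.5             -> Mean Accuracy
print(cm.trace() / cm.sum())  # 0.9584          -> Overall Accuracy
```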
## Model description
More information needed
## Intended uses & limitations
More information needed
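The card does not document usage, so the following is only a hedged sketch of the standard SegFormer inference path; it assumes the image processor configuration was pushed to the repo alongside the weights (the repo id comes from this card's metadata).

```python
# Hedged sketch, not documented by the model author: standard SegFormer inference.
# Assumes the image processor config is stored in the repo with the weights.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_4"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id).eval()

image = Image.open("frame.png").convert("RGB")   # placeholder input frame
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits              # (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)[0]                   # per-pixel class ids at reduced resolution
```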
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
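For reference, a roughly equivalent configuration can be expressed with the `transformers` `Trainer` API as below. This is a hedged reconstruction from the list above, not the author's actual training script; the number of labels is inferred from the reported classes, and dataset preparation is omitted.

```python
# Hedged reconstruction of the configuration listed above, not the author's script.
# Dataset preparation with a SegFormer image processor is omitted.
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b5",
    num_labels=3,                  # assumed: unlabeled / dropoff / undropoff
    ignore_mismatched_sizes=True,  # decode head is re-initialized for the new labels
)

args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b5_4",
    learning_rate=7e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,                # the listed Adam betas/epsilon are the defaults
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)

trainer = Trainer(model=model, args=args)  # plus train_dataset=..., eval_dataset=...
```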
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0114 | 5.0 | 10 | 1.0037 | 0.2459 | 0.4345 | 0.7074 | nan | 0.1368 | 0.7322 | 0.0 | 0.0286 | 0.7089 |
| 0.9088 | 10.0 | 20 | 0.8245 | 0.3119 | 0.5046 | 0.8887 | nan | 0.0857 | 0.9235 | 0.0 | 0.0460 | 0.8897 |
| 0.8029 | 15.0 | 30 | 0.6620 | 0.3157 | 0.4998 | 0.9214 | nan | 0.0399 | 0.9596 | 0.0 | 0.0253 | 0.9219 |
| 0.6935 | 20.0 | 40 | 0.5662 | 0.3154 | 0.4959 | 0.9309 | nan | 0.0214 | 0.9704 | 0.0 | 0.0151 | 0.9311 |
| 0.635 | 25.0 | 50 | 0.5018 | 0.3175 | 0.4978 | 0.9401 | nan | 0.0153 | 0.9803 | 0.0 | 0.0121 | 0.9404 |
| 0.5579 | 30.0 | 60 | 0.4701 | 0.3178 | 0.4978 | 0.9422 | nan | 0.0131 | 0.9825 | 0.0 | 0.0111 | 0.9423 |
| 0.5086 | 35.0 | 70 | 0.4403 | 0.3181 | 0.4977 | 0.9459 | nan | 0.0088 | 0.9866 | 0.0 | 0.0080 | 0.9461 |
| 0.472 | 40.0 | 80 | 0.4328 | 0.3177 | 0.4971 | 0.9471 | nan | 0.0063 | 0.9879 | 0.0 | 0.0059 | 0.9473 |
| 0.4484 | 45.0 | 90 | 0.4136 | 0.3184 | 0.4981 | 0.9506 | nan | 0.0046 | 0.9916 | 0.0 | 0.0044 | 0.9508 |
| 0.4026 | 50.0 | 100 | 0.4013 | 0.3186 | 0.4985 | 0.9516 | nan | 0.0043 | 0.9926 | 0.0 | 0.0042 | 0.9517 |
| 0.3873 | 55.0 | 110 | 0.3621 | 0.3189 | 0.4991 | 0.9557 | nan | 0.0010 | 0.9971 | 0.0 | 0.0009 | 0.9557 |
| 0.3549 | 60.0 | 120 | 0.3479 | 0.3189 | 0.4992 | 0.9564 | nan | 0.0004 | 0.9979 | 0.0 | 0.0004 | 0.9564 |
| 0.3358 | 65.0 | 130 | 0.3282 | 0.3191 | 0.4994 | 0.9571 | nan | 0.0001 | 0.9986 | 0.0 | 0.0001 | 0.9571 |
| 0.3146 | 70.0 | 140 | 0.3141 | 0.3193 | 0.4996 | 0.9577 | nan | 0.0000 | 0.9993 | 0.0 | 0.0000 | 0.9577 |
| 0.3116 | 75.0 | 150 | 0.2941 | 0.3194 | 0.4999 | 0.9582 | nan | 0.0 | 0.9998 | 0.0 | 0.0 | 0.9582 |
| 0.3151 | 80.0 | 160 | 0.2809 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9584 |
| 0.2778 | 85.0 | 170 | 0.2750 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 1.0000 | 0.0 | 0.0 | 0.9584 |
| 0.2753 | 90.0 | 180 | 0.2615 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 1.0000 | 0.0 | 0.0 | 0.9584 |
| 0.2809 | 95.0 | 190 | 0.2547 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2606 | 100.0 | 200 | 0.2464 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2563 | 105.0 | 210 | 0.2459 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2454 | 110.0 | 220 | 0.2393 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2707 | 115.0 | 230 | 0.2368 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2433 | 120.0 | 240 | 0.2351 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b5_4", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b5_4 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:24:40+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b5\_4
====================================
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2351
* Mean Iou: 0.4792
* Mean Accuracy: 0.5
* Overall Accuracy: 0.9584
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.0
* Accuracy Undropoff: 1.0
* Iou Unlabeled: nan
* Iou Dropoff: 0.0
* Iou Undropoff: 0.9584
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10655785351991653,
0.03439371660351753,
-0.0019419043092057109,
0.11647038161754608,
0.17142370343208313,
0.028207143768668175,
0.11584724485874176,
0.1156783401966095,
-0.10815674811601639,
0.030483907088637352,
0.104539655148983,
0.14309664070606232,
0.016284151002764702,
0.09365739673376083,
-0.020075228065252304,
-0.30752265453338623,
-0.025177782401442528,
0.032259900122880936,
-0.08498271554708481,
0.12484107166528702,
0.06451871991157532,
-0.1631065458059311,
0.0912153497338295,
-0.0041216216050088406,
-0.21991907060146332,
0.016335058957338333,
-0.004855596460402012,
-0.030940677970647812,
0.16011905670166016,
0.023652391508221626,
0.11567266285419464,
0.010063016787171364,
0.11419277638196945,
-0.20168451964855194,
0.017739634960889816,
0.05614887923002243,
-0.004454104695469141,
0.06771793216466904,
0.06573683768510818,
0.002863664645701647,
0.15237309038639069,
-0.10577593743801117,
0.06690772622823715,
0.001729099196381867,
-0.14488482475280762,
-0.21147213876247406,
-0.07507898658514023,
0.021204248070716858,
0.07770133018493652,
0.09665393084287643,
-0.005311892367899418,
0.11404626816511154,
-0.09282761812210083,
0.1135876402258873,
0.268428236246109,
-0.2434520125389099,
-0.0864938497543335,
0.040276724845170975,
0.002296437043696642,
0.06698580086231232,
-0.13410255312919617,
0.008745159022510052,
0.03286037594079971,
0.046143390238285065,
0.1157086193561554,
-0.03266897797584534,
-0.09886717051267624,
0.027298366650938988,
-0.1385115534067154,
-0.03307430073618889,
0.05458071082830429,
0.05366646498441696,
-0.020015768706798553,
-0.032231152057647705,
-0.06846322119235992,
-0.18220806121826172,
-0.06598435342311859,
0.012970462441444397,
0.06652937084436417,
-0.059989701956510544,
-0.11528854072093964,
-0.01545904390513897,
-0.10996679961681366,
-0.08571542799472809,
-0.0498616099357605,
0.12707598507404327,
0.03385087475180626,
0.01983794942498207,
-0.034173328429460526,
0.12694084644317627,
-0.027783220633864403,
-0.13988074660301208,
0.016288649290800095,
0.03066813014447689,
-0.042788226157426834,
-0.031850140541791916,
-0.04907780513167381,
-0.06402455270290375,
-0.012945506721735,
0.10796521604061127,
-0.060812514275312424,
0.06878747045993805,
0.03513380140066147,
0.05092594027519226,
-0.1154203787446022,
0.19138316810131073,
-0.06633422523736954,
-0.0074147069826722145,
-0.036858074367046356,
0.05833913013339043,
0.0038941865786910057,
-0.022488748654723167,
-0.10529760271310806,
0.00400175666436553,
0.07033026218414307,
-0.009039760567247868,
-0.08818159252405167,
0.07010415196418762,
-0.0390881709754467,
-0.011480316519737244,
0.0020251090172678232,
-0.0756145566701889,
0.04617195203900337,
-0.0009888473432511091,
-0.08433356881141663,
-0.029409244656562805,
0.05034353584051132,
0.014501574449241161,
0.014096632599830627,
0.16452893614768982,
-0.08693498373031616,
0.06325913220643997,
-0.11365720629692078,
-0.09951546788215637,
0.0005137284751981497,
-0.08633188903331757,
0.03637081757187843,
-0.07722365111112595,
-0.15081025660037994,
-0.009420786052942276,
0.07129019498825073,
-0.040994152426719666,
0.0034082066267728806,
-0.05251622200012207,
-0.09082376211881638,
0.003182404674589634,
-0.008306088857352734,
0.16302210092544556,
-0.06500169634819031,
0.12293682992458344,
0.03844435513019562,
0.07211542874574661,
-0.06644859164953232,
0.03897332400083542,
-0.08577434718608856,
0.019608434289693832,
-0.22291125357151031,
0.0425945483148098,
-0.051503270864486694,
0.06748142093420029,
-0.05975400283932686,
-0.12240885943174362,
0.006993331015110016,
0.0016796911368146539,
0.09235352277755737,
0.10573233664035797,
-0.22369422018527985,
-0.0755210593342781,
0.14812913537025452,
-0.07298964262008667,
-0.09880882501602173,
0.11347880959510803,
-0.06331167370080948,
0.011316175572574139,
0.060314908623695374,
0.1997022181749344,
0.05352737754583359,
-0.13612011075019836,
0.022617068141698837,
-0.015745118260383606,
0.049697767943143845,
-0.027215413749217987,
0.050328243523836136,
0.022695818915963173,
0.0876397043466568,
0.019419336691498756,
-0.0641174390912056,
0.06691133975982666,
-0.12434175610542297,
-0.09658738970756531,
-0.02562580816447735,
-0.08605615794658661,
0.042483314871788025,
0.09079793095588684,
0.06148774176836014,
-0.10512689501047134,
-0.0782778337597847,
0.09066124260425568,
0.0760594978928566,
-0.06889493018388748,
0.039631977677345276,
-0.0655430480837822,
0.04406265169382095,
-0.016970550641417503,
-0.03653893247246742,
-0.17524521052837372,
-0.02572304755449295,
-0.021444575861096382,
0.03602661192417145,
0.02988014928996563,
0.023306239396333694,
0.09192436188459396,
0.08817601948976517,
-0.07236892729997635,
-0.02498922124505043,
-0.06554409116506577,
0.00195695785805583,
-0.12232392281293869,
-0.2276451587677002,
-0.04402796924114227,
-0.007870622910559177,
0.09010160714387894,
-0.21428459882736206,
0.024035753682255745,
0.023377446457743645,
0.08857125788927078,
0.025000697001814842,
-0.030875617638230324,
-0.053667012602090836,
0.07695993036031723,
-0.010099534876644611,
-0.06556744873523712,
0.07007180154323578,
-0.005636604968458414,
-0.06785865873098373,
-0.05610000714659691,
-0.1128782108426094,
0.16328482329845428,
0.1333325058221817,
-0.14613056182861328,
-0.09203368425369263,
-0.013182863593101501,
-0.06400810182094574,
-0.033297572284936905,
-0.04319099336862564,
0.03918531909584999,
0.179500013589859,
-0.00021598715102300048,
0.13226386904716492,
-0.06127970665693283,
-0.034872349351644516,
0.029291631653904915,
-0.02734227478504181,
0.027958944439888,
0.1292913258075714,
0.12447617202997208,
-0.064796082675457,
0.12406503409147263,
0.12408094853162766,
-0.08070776611566544,
0.15003320574760437,
-0.03365969657897949,
-0.07991021126508713,
-0.01752924732863903,
-0.01445924025028944,
-0.008121333085000515,
0.17715951800346375,
-0.1496065855026245,
-0.01638961397111416,
-0.004780160263180733,
0.01417581457644701,
0.015375225804746151,
-0.2509234845638275,
-0.056553106755018234,
0.03839623183012009,
-0.04467190429568291,
-0.009688720107078552,
-0.023493610322475433,
-0.00397091917693615,
0.10473201423883438,
-0.007147649303078651,
-0.07598382234573364,
0.0014107826864346862,
-0.007306807674467564,
-0.048840854316949844,
0.2075139582157135,
-0.05845463648438454,
-0.11787601560354233,
-0.09032980352640152,
-0.07667361944913864,
-0.0369265042245388,
0.0029638831038028,
0.05814146623015404,
-0.10795105993747711,
-0.018405186012387276,
-0.05949919670820236,
0.017319368198513985,
0.0065062581561505795,
0.03535284847021103,
-0.0009675166220404208,
-0.008709326386451721,
0.056462984532117844,
-0.09729452431201935,
-0.009872367605566978,
-0.0664907917380333,
-0.05385076254606247,
0.055070001631975174,
0.05854441598057747,
0.14708560705184937,
0.13520096242427826,
-0.026131367310881615,
0.019924743101000786,
-0.03333013877272606,
0.25834745168685913,
-0.0960642620921135,
-0.026972517371177673,
0.11841947585344315,
-0.011005581356585026,
0.05642074719071388,
0.10694640874862671,
0.08243675529956818,
-0.10921422392129898,
-0.0015873879892751575,
0.06385644525289536,
-0.05226629227399826,
-0.15586163103580475,
-0.015294238924980164,
-0.05826826021075249,
-0.029144899919629097,
0.07601360976696014,
0.027871014550328255,
-0.0016913677100092173,
0.05530894175171852,
0.047622110694646835,
0.04204198718070984,
-0.023995686322450638,
0.050299812108278275,
0.08889541774988174,
0.0324765220284462,
0.10983635485172272,
-0.04483110457658768,
-0.06616087257862091,
0.030778637155890465,
0.0028702181298285723,
0.24377994239330292,
-0.015954382717609406,
0.09668251127004623,
0.07280968129634857,
0.16313408315181732,
-0.011920777149498463,
0.048626337200403214,
-0.01611848548054695,
-0.06883250921964645,
-0.019178291782736778,
-0.04418110474944115,
-0.0173158198595047,
0.009652365930378437,
-0.052238475531339645,
0.03944715857505798,
-0.1254812628030777,
0.009642758406698704,
0.06745787709951401,
0.2490721344947815,
0.02875838242471218,
-0.3177716135978699,
-0.06548649817705154,
-0.006361030973494053,
-0.011195100843906403,
-0.009119133464992046,
0.006431183312088251,
0.15195438265800476,
-0.08143394440412521,
0.05599069222807884,
-0.08508007228374481,
0.08586227148771286,
-0.036104388535022736,
0.05105825513601303,
0.07741276174783707,
0.0738375335931778,
-0.004101335536688566,
0.05687262862920761,
-0.28516700863838196,
0.30232998728752136,
0.001763241016305983,
0.08437072485685349,
-0.06442057341337204,
-0.03234527260065079,
0.032423604279756546,
0.0813775509595871,
0.08581230044364929,
-0.015342125669121742,
-0.021080272272229195,
-0.21459567546844482,
-0.02115500159561634,
0.0309151578694582,
0.12993404269218445,
-0.01727038063108921,
0.1034577339887619,
-0.009836672805249691,
-0.0057778614573180676,
0.07409024238586426,
0.00006670767470495775,
-0.03398491442203522,
-0.09045706689357758,
-0.026275768876075745,
-0.024513516575098038,
-0.04902320355176926,
-0.058453600853681564,
-0.106418676674366,
-0.11503050476312637,
0.11188827455043793,
0.020231155678629875,
-0.012833706103265285,
-0.12076574563980103,
0.09899020940065384,
0.07909516990184784,
-0.07568500936031342,
0.04100129008293152,
0.031808868050575256,
0.057038258761167526,
0.032662395387887955,
-0.05778760835528374,
0.11824706196784973,
-0.05958421900868416,
-0.16074101626873016,
-0.05580658093094826,
0.09117401391267776,
0.0511515736579895,
0.057268526405096054,
-0.023830506950616837,
0.0162728950381279,
-0.017078017815947533,
-0.09215141087770462,
0.054472923278808594,
-0.04371056333184242,
0.06308288127183914,
0.009868331253528595,
-0.019164465367794037,
0.05419205501675606,
-0.05578171834349632,
-0.012737590819597244,
0.1458025425672531,
0.28508689999580383,
-0.08882385492324829,
0.012595952488481998,
0.01662515103816986,
-0.06532689929008484,
-0.19074192643165588,
0.07934858649969101,
0.05827634036540985,
-0.00002044037501036655,
0.08630405366420746,
-0.16634730994701385,
0.09838809072971344,
0.10395929217338562,
0.00024270844005513936,
0.1143452525138855,
-0.3662281334400177,
-0.12796613574028015,
0.07930406183004379,
0.18980877101421356,
0.07616543024778366,
-0.15437094867229462,
0.001177114201709628,
-0.0019996133632957935,
-0.14790639281272888,
0.09125274419784546,
-0.078684002161026,
0.13601312041282654,
-0.0205882228910923,
0.08651898801326752,
0.016242267563939095,
-0.06157783046364784,
0.12217224389314651,
-0.004029902629554272,
0.14075084030628204,
-0.06917835026979446,
-0.039119698107242584,
0.05506245791912079,
-0.03783659264445305,
-0.01300020795315504,
-0.046359967440366745,
0.027763230726122856,
-0.061134908348321915,
-0.011547166854143143,
-0.10540271550416946,
0.01306911651045084,
-0.039088934659957886,
-0.06643501669168472,
-0.04614172875881195,
0.0434253066778183,
0.04405223950743675,
-0.004135769791901112,
0.15444247424602509,
-0.010101831518113613,
0.11540419608354568,
0.051089659333229065,
0.05997030436992645,
-0.06379367411136627,
-0.10517048090696335,
-0.018302928656339645,
0.009427312761545181,
0.048023175448179245,
-0.13348188996315002,
0.014971204102039337,
0.15337783098220825,
0.05037428438663483,
0.12198521196842194,
0.08696672320365906,
-0.032687585800886154,
0.0319390669465065,
0.06921397894620895,
-0.15700671076774597,
-0.11388019472360611,
0.001775260316208005,
-0.06430050730705261,
-0.07271434366703033,
0.05362175405025482,
0.07674501091241837,
-0.07544249296188354,
0.012483804486691952,
-0.005836455151438713,
0.005315045360475779,
-0.06793602555990219,
0.2054125964641571,
0.055473193526268005,
0.041644614189863205,
-0.10368243604898453,
0.07282178103923798,
0.018457740545272827,
-0.08791252970695496,
-0.0022180252708494663,
0.09214261174201965,
-0.06891580671072006,
-0.025142230093479156,
0.0816199854016304,
0.1903740018606186,
-0.07765546441078186,
-0.022338004782795906,
-0.14981213212013245,
-0.10622859746217728,
0.06920967251062393,
0.18646159768104553,
0.1001945286989212,
-0.007322062272578478,
-0.052430447190999985,
0.047040972858667374,
-0.11725176870822906,
0.07696860283613205,
0.02333701401948929,
0.08153577148914337,
-0.14965760707855225,
0.18029698729515076,
0.010873853228986263,
0.0554998405277729,
-0.026271076872944832,
0.03220634534955025,
-0.11924842745065689,
0.040565017610788345,
-0.11277999728918076,
-0.036816492676734924,
-0.015116266906261444,
0.0047827609814703465,
-0.014008915051817894,
-0.06251934170722961,
-0.062461525201797485,
0.005086496938019991,
-0.12765127420425415,
-0.02208099141716957,
0.045821450650691986,
0.02249678410589695,
-0.1260601133108139,
-0.03872985392808914,
0.027657946571707726,
-0.06314889341592789,
0.055593717843294144,
0.03600839897990227,
0.014559016562998295,
0.06648937612771988,
-0.17204146087169647,
-0.021901287138462067,
0.06927093118429184,
-0.006862170062959194,
0.06347285956144333,
-0.0352729894220829,
-0.026355452835559845,
-0.02952849678695202,
0.08770837634801865,
0.012953107245266438,
0.061266738921403885,
-0.1372096836566925,
0.005563123617321253,
-0.03347758576273918,
-0.09372458606958389,
-0.0587952546775341,
0.05385543033480644,
0.061779096722602844,
0.03645414113998413,
0.1621091663837433,
-0.08332296460866928,
0.0443309023976326,
-0.21935757994651794,
-0.0163591168820858,
0.002420956501737237,
-0.10787147283554077,
-0.0816957876086235,
-0.0720716118812561,
0.08325030654668808,
-0.0750500038266182,
0.10929673165082932,
0.03709273412823677,
0.0655255988240242,
0.03087216056883335,
-0.03323964774608612,
-0.0030460995621979237,
0.03406311199069023,
0.21146325767040253,
0.011713345535099506,
-0.03250991553068161,
0.09028588235378265,
0.07935278117656708,
0.09949658811092377,
0.1390925794839859,
0.22692039608955383,
0.1544298529624939,
-0.02621646225452423,
0.08930294215679169,
0.05155103653669357,
-0.06453407555818558,
-0.17219622433185577,
0.03603094071149826,
-0.051922205835580826,
0.09734722226858139,
-0.06216628476977348,
0.20176732540130615,
0.08707885444164276,
-0.18288393318653107,
0.06626500189304352,
-0.04640160873532295,
-0.10203054547309875,
-0.08037317544221878,
-0.03779850900173187,
-0.07001520693302155,
-0.1496809422969818,
0.026186123490333557,
-0.1031782478094101,
0.043618105351924896,
0.15112344920635223,
0.010453086346387863,
-0.012568332254886627,
0.21406567096710205,
0.034283094108104706,
0.03663906082510948,
0.05729563161730766,
0.01460018940269947,
-0.029047535732388496,
-0.09190166741609573,
-0.06035790219902992,
0.018174003809690475,
-0.0309623833745718,
0.018547726795077324,
-0.0689283162355423,
-0.07707813382148743,
0.02670993097126484,
0.005773511715233326,
-0.09377828240394592,
0.023972712457180023,
0.021197224035859108,
0.09090705215930939,
0.026613924652338028,
0.005685432814061642,
0.016624299809336662,
-0.02895846962928772,
0.2477066069841385,
-0.09332553297281265,
-0.08153019845485687,
-0.08104895800352097,
0.21974092721939087,
0.03199583292007446,
0.0011258278973400593,
0.008472003974020481,
-0.081879161298275,
0.009961080737411976,
0.22860732674598694,
0.17143528163433075,
-0.13233722746372223,
-0.009988950565457344,
0.00035298039438202977,
0.001229340792633593,
-0.03000050038099289,
0.1184752881526947,
0.12115052342414856,
0.05244031921029091,
-0.1145184189081192,
-0.052871156483888626,
-0.052820414304733276,
-0.01911143586039543,
-0.026891250163316727,
0.04928678646683693,
0.06720800697803497,
0.021779146045446396,
-0.07012256234884262,
0.0754547193646431,
-0.05898663401603699,
-0.14151348173618317,
0.10546699166297913,
-0.22786404192447662,
-0.1560644805431366,
-0.00745416758581996,
0.12185660749673843,
0.0036275924649089575,
0.06005868315696716,
-0.04212125763297081,
0.0016253362409770489,
0.04826400429010391,
-0.0054392763413488865,
-0.07812444865703583,
-0.1038990467786789,
0.08435551077127457,
-0.1158025935292244,
0.21679194271564484,
-0.05977733060717583,
0.035186007618904114,
0.11364871263504028,
0.06379308551549911,
-0.05035962164402008,
0.056368425488471985,
0.04177653044462204,
-0.12549206614494324,
-0.004849068820476532,
0.12468018382787704,
-0.037965405732393265,
0.05485619232058525,
0.03251924365758896,
-0.13323518633842468,
0.03293527662754059,
-0.055859751999378204,
-0.040195614099502563,
-0.027509232982993126,
-0.04999568313360214,
-0.06395310163497925,
0.11583485454320908,
0.20904980599880219,
-0.008248346857726574,
0.02415088564157486,
-0.08683031797409058,
0.015579107217490673,
0.06556610018014908,
0.04657597094774246,
-0.07849030196666718,
-0.21617646515369415,
0.007010455708950758,
0.0708790272474289,
-0.04262971505522728,
-0.2049013078212738,
-0.11141997575759888,
0.03732346370816231,
-0.054816391319036484,
-0.0715206116437912,
0.09036406129598618,
0.0901840329170227,
0.055542439222335815,
-0.054743655025959015,
-0.10565118491649628,
-0.05920054018497467,
0.17044715583324432,
-0.1473100781440735,
-0.07735525071620941
] |
null | null | pruna-engine | <!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
<img src="https://i.imgur.com/eDAlcgk.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/CP4VSgck)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join the Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.
- ***How does the model quality change?*** The quality of the model output might slightly vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained on an NVIDIA A100-PCIE-40GB with the configuration described in `model/smash_config.json`, after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the models directly in your use-case conditions to know whether the smashed model can benefit you.
- ***What is the model format?*** We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial for running models in Docker in the documentation [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/) if needed.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to CUDA overheads.
## Setup
You can run the smashed model with these steps:
0. Check that you have Linux, Python 3.10, and CUDA 12.1.0 installed. For CUDA, check with `nvcc --version` and install with `conda install nvidia/label/cuda-12.1.0::cuda`.
1. Install the `pruna-engine` package, available [here](https://pypi.org/project/pruna-engine/) on PyPI. It might take up to 15 minutes to install.
```bash
pip install pruna-engine[gpu]==0.6.0 --extra-index-url https://pypi.nvidia.com --extra-index-url https://pypi.ngc.nvidia.com --extra-index-url https://prunaai.pythonanywhere.com/
```
2. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
```bash
mkdir SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed
huggingface-cli download PrunaAI/SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed --local-dir SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed --local-dir-use-symlinks False
```
- Option 2 - Use Python:
```python
import subprocess
repo_name = "SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed"
subprocess.run(["mkdir", repo_name])
subprocess.run(["huggingface-cli", "download", 'PrunaAI/'+ repo_name, "--local-dir", repo_name, "--local-dir-use-symlinks", "False"])
```
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
```python
from pruna_engine.PrunaModel import PrunaModel
model_path = "SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed/model" # Specify the downloaded model path.
smashed_model = PrunaModel.load_model(model_path) # Load the model.
smashed_model(prompt='Beautiful fruits in trees', height=512, width=512)[0][0]  # Run the model with an example prompt and image size.
```
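If the call above returns a standard PIL image (which the `[0][0]` indexing suggests, but which is not documented here), the result can be saved like any other image. The snippet below is a minimal sketch under that assumption; verify the actual return type against the Pruna documentation.
```python
# Minimal sketch, assuming the smashed model returns PIL images (not confirmed by this card).
from pruna_engine.PrunaModel import PrunaModel

model_path = "SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed/model"
smashed_model = PrunaModel.load_model(model_path)

image = smashed_model(prompt='Beautiful fruits in trees', height=512, width=512)[0][0]
image.save("fruits.png")  # Persist the generated image to disk (assumes a PIL image).
```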
## Configurations
The configuration info is in `config.json`.
## Credits & License
We follow the same license as the original model. Please check the license of the original model, SG161222/Realistic_Vision_V1.4, which provided the base model, before using this one.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your own AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). | {"license": "apache-2.0", "library_name": "pruna-engine", "metrics": ["memory_disk", "memory_inference", "inference_latency", "inference_throughput", "inference_CO2_emissions", "inference_energy_consumption"], "thumbnail": "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"} | null | PrunaAI/SG161222-Realistic_Vision_V1.4-turbo-tiny-green-smashed | [
"pruna-engine",
"license:apache-2.0",
"region:us"
] | 2024-02-12T13:24:46+00:00 | [] | [] | TAGS
#pruna-engine #license-apache-2.0 #region-us
|
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="URL target="_blank" rel="noopener noreferrer">
<img src="https://i.URL alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
. We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.
- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.
- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.
- *What are "first" metrics?* Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
## Setup
You can run the smashed model with these steps:
0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.
1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.
3. Download the model files using one of these three options.
- Option 1 - Use command line interface (CLI):
- Option 2 - Use Python:
- Option 3 - Download them manually on the HuggingFace model page.
3. Load & run the model.
## Configurations
The configuration info are in 'URL'.
## Credits & License
We follow the same license as the original model. Please check the license of the original model SG161222/Realistic_Vision_V1.4 before using this model which provided the base model.
## Want to compress other models?
- Contact us and tell us which model to compress next here.
- Request access to easily compress your own AI models here. | [
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/Realistic_Vision_V1.4 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
"TAGS\n#pruna-engine #license-apache-2.0 #region-us \n",
"# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help.",
"## Results\n\n!image info\n\nFrequently Asked Questions\n- *How does the compression work?* The model is compressed by combining xformers, triton, jit, cuda graphs, tiling, and step caching.\n- *How does the model quality change?* The quality of the model output might slightly vary compared to the base model.\n- *How is the model efficiency evaluated?* These results were obtained on NVIDIA A100-PCIE-40GB with configuration described in 'model/smash_config.json' and are obtained after a hardware warmup. The smashed model is directly compared to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend to directly run them in the use-case conditions to know if the smashed model can benefit you.\n- *What is the model format?* We used a custom Pruna model format based on pickle to make models compatible with the compression methods. We provide a tutorial to run models in dockers in the documentation here if needed.\n- *What is the naming convention for Pruna Huggingface models?* We take the original model name and append \"turbo\", \"tiny\", or \"green\" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.\n- *How to compress my own models?* You can request premium access to more compression methods and tech support for your specific use-cases here.\n- *What are \"first\" metrics?* Results mentioning \"first\" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.",
"## Setup\n\nYou can run the smashed model with these steps:\n\n0. Check that you have linux, python 3.10, and cuda 12.1.0 requirements installed. For cuda, check with 'nvcc --version' and install with 'conda install nvidia/label/cuda-12.1.0::cuda'.\n1. Install the 'pruna-engine' available here on Pypi. It might take up to 15 minutes to install.\n \n3. Download the model files using one of these three options. \n - Option 1 - Use command line interface (CLI):\n \n - Option 2 - Use Python:\n \n - Option 3 - Download them manually on the HuggingFace model page.\n3. Load & run the model.",
"## Configurations\n\nThe configuration info are in 'URL'.",
"## Credits & License\n\nWe follow the same license as the original model. Please check the license of the original model SG161222/Realistic_Vision_V1.4 before using this model which provided the base model.",
"## Want to compress other models?\n\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your own AI models here."
] | [
19,
92,
402,
155,
13,
44,
36
] | [
"passage: TAGS\n#pruna-engine #license-apache-2.0 #region-us \n# Simply make AI models cheaper, smaller, faster, and greener!\n\n- Give a thumbs up if you like this model!\n- Contact us and tell us which model to compress next here.\n- Request access to easily compress your *own* AI models here.\n- Read the documentations to know more here\n- Join Pruna AI community on Discord here to share feedback/suggestions or get help."
] | [
-0.03573288768529892,
0.13100412487983704,
-0.001540288794785738,
0.022444402799010277,
0.10067341476678848,
0.055228229612112045,
0.07351797074079514,
0.10892018675804138,
0.048860158771276474,
0.008610251359641552,
0.1138949766755104,
0.11960647255182266,
0.08499637991189957,
0.1467972695827484,
0.04886651411652565,
-0.36119624972343445,
0.12328322976827621,
0.030059607699513435,
0.11286655068397522,
0.06615262478590012,
0.1305178701877594,
-0.08176743984222412,
0.12242286652326584,
0.11668917536735535,
-0.07146744430065155,
-0.058715738356113434,
0.01354521606117487,
-0.04568718746304512,
0.047688763588666916,
0.006920846179127693,
0.11834761500358582,
0.016820384189486504,
0.08045748621225357,
-0.13821633160114288,
0.04106122627854347,
-0.030310893431305885,
-0.009298821911215782,
0.1184764951467514,
0.02953704260289669,
0.10704020410776138,
0.2749005854129791,
0.11839497834444046,
-0.1026306003332138,
0.08103878796100616,
-0.07297040522098541,
-0.10587455332279205,
-0.0623883455991745,
-0.010004812851548195,
0.06654322892427444,
0.04772229865193367,
-0.06958308070898056,
0.18299241364002228,
-0.12177606672048569,
-0.07967323064804077,
0.07131063938140869,
-0.2638101875782013,
-0.05668892711400986,
0.031334877014160156,
0.10106468200683594,
-0.061002105474472046,
0.011471700854599476,
0.0484081394970417,
0.07423757761716843,
0.003831893904134631,
0.02755037508904934,
0.0231326911598444,
0.1263711303472519,
-0.026059621945023537,
-0.08856093138456345,
-0.05992208048701286,
0.12840677797794342,
0.01695944555103779,
-0.04769099876284599,
-0.09044229239225388,
-0.05195551738142967,
-0.14491762220859528,
-0.05469183251261711,
-0.07260868698358536,
0.00944194383919239,
0.12943263351917267,
0.10376454889774323,
-0.12659528851509094,
-0.12641963362693787,
-0.08089456707239151,
-0.0874733030796051,
0.12121638655662537,
0.08226454257965088,
0.06846921145915985,
-0.020507263019680977,
0.05536094680428505,
0.07567624747753143,
-0.051027387380599976,
0.023186611011624336,
-0.1948918253183365,
-0.016140220686793327,
0.004553433042019606,
-0.12328886240720749,
0.074356809258461,
0.05938659980893135,
0.1526218056678772,
0.10516335070133209,
-0.07109570503234863,
0.14941607415676117,
0.03119758330285549,
0.05183535814285278,
0.0016945579554885626,
-0.19732382893562317,
0.044110603630542755,
0.03397487848997116,
0.029030555859208107,
0.07634714245796204,
0.01838402822613716,
-0.09485938400030136,
0.03653564304113388,
-0.10627062618732452,
0.05883750319480896,
-0.056691549718379974,
-0.02411743625998497,
-0.1321919858455658,
-0.06697039306163788,
0.20706887543201447,
-0.01862199977040291,
-0.020022790879011154,
-0.02232302725315094,
-0.02678769826889038,
0.16259340941905975,
-0.08499446511268616,
0.07367748767137527,
-0.06866656243801117,
-0.07434844970703125,
-0.11305806785821915,
0.008160690777003765,
-0.13459642231464386,
0.04250127822160721,
0.007071048486977816,
-0.09537149220705032,
0.050375018268823624,
-0.11538799852132797,
-0.05920444801449776,
0.10739794373512268,
0.11890298873186111,
-0.038714323192834854,
-0.07378857582807541,
0.07845373451709747,
0.021176116541028023,
-0.1269662231206894,
-0.04373960942029953,
-0.06861985474824905,
-0.0158606618642807,
-0.0012650806456804276,
0.022747088223695755,
0.05848544463515282,
-0.17233939468860626,
0.06719639152288437,
-0.11638288199901581,
0.016827668994665146,
-0.04659169912338257,
0.021059416234493256,
-0.030015811324119568,
0.12740249931812286,
-0.07529475539922714,
0.00627522449940443,
-0.0467182919383049,
0.03761160373687744,
0.04215855151414871,
0.05199176073074341,
-0.2508608102798462,
0.030298501253128052,
0.1098698228597641,
-0.12760701775550842,
-0.08380157500505447,
0.17382264137268066,
0.014979318715631962,
-0.03036264143884182,
0.10716407746076584,
0.08396483957767487,
0.023649759590625763,
-0.05164189636707306,
0.03376004472374916,
-0.05715443566441536,
-0.11622927337884903,
-0.13548094034194946,
0.16667284071445465,
0.08274152129888535,
-0.18323549628257751,
0.05830454081296921,
-0.03704249486327171,
0.07311563938856125,
-0.0804717093706131,
-0.11762288212776184,
-0.022608499974012375,
-0.15161584317684174,
-0.008770188316702843,
0.06604208797216415,
0.028975360095500946,
0.006918381433933973,
-0.06216441094875336,
-0.08010344207286835,
0.17335863411426544,
0.04135739430785179,
-0.07565157860517502,
-0.20717956125736237,
0.1366199105978012,
-0.02334769256412983,
0.017625825479626656,
-0.11182565987110138,
0.0436074398458004,
0.04173315688967705,
-0.0868011936545372,
0.0665515661239624,
0.13982601463794708,
0.013523505069315434,
-0.018048446625471115,
0.05104345083236694,
0.10467413812875748,
0.02380356192588806,
0.014978175982832909,
-0.017454128712415695,
0.035858120769262314,
-0.03912922739982605,
-0.020320983603596687,
0.16504476964473724,
-0.03555602207779884,
0.00458597531542182,
-0.10513406246900558,
0.11301354318857193,
-0.021027158945798874,
0.009408654645085335,
0.0349666066467762,
-0.0038192281499505043,
-0.04639878869056702,
0.039253491908311844,
0.09851683676242828,
-0.07275566458702087,
-0.017454536631703377,
0.11367116123437881,
0.06871969252824783,
0.019681844860315323,
0.16850601136684418,
0.03191102296113968,
0.06726911664009094,
0.012733091600239277,
-0.03376281261444092,
0.060752686113119125,
-0.06495172530412674,
0.011898837052285671,
0.030007708817720413,
-0.00917510874569416,
0.02521984465420246,
-0.053047437220811844,
0.0434323213994503,
0.0030628403183072805,
-0.021423103287816048,
-0.02638157084584236,
0.022684084251523018,
0.3416460454463959,
-0.13255257904529572,
0.07944878935813904,
0.15327060222625732,
-0.048273082822561264,
0.03909280523657799,
-0.01437810342758894,
-0.15436100959777832,
-0.0026699609588831663,
-0.042268503457307816,
-0.03032074123620987,
0.15610229969024658,
0.01616038754582405,
0.014675425365567207,
0.07114095240831375,
-0.10695059597492218,
0.03881934657692909,
-0.1104544848203659,
-0.06612041592597961,
-0.02173178642988205,
-0.06350896507501602,
-0.028971035033464432,
0.03697436302900314,
-0.08885105699300766,
0.04942630976438522,
-0.06041031703352928,
-0.0835113525390625,
-0.00279027596116066,
0.05765359848737717,
0.05292991176247597,
-0.007865676656365395,
0.0266135074198246,
-0.12370338290929794,
-0.1333969384431839,
0.04225758835673332,
0.059311866760253906,
0.07703723758459091,
0.03257518634200096,
-0.04629363492131233,
-0.08232147246599197,
-0.023416923359036446,
-0.11862172931432724,
0.08274577558040619,
-0.05005235597491264,
0.02116742543876171,
0.036951590329408646,
0.05057327821850777,
-0.045071668922901154,
-0.028049485757946968,
-0.031239887699484825,
0.03177438676357269,
0.027354221791028976,
-0.06876137107610703,
0.10529255121946335,
0.06632733345031738,
-0.0037226425483822823,
-0.032648492604494095,
0.00643171789124608,
0.17272710800170898,
-0.055237580090761185,
0.023839082568883896,
0.2087412327528,
-0.011401763185858727,
0.011130680330097675,
0.16558146476745605,
0.019315127283334732,
-0.10980436205863953,
0.04490264132618904,
-0.011618091724812984,
-0.024433093145489693,
-0.24778036773204803,
-0.0949326828122139,
-0.04385732486844063,
-0.0007706784526817501,
0.06361816078424454,
0.011042679660022259,
-0.03859875723719597,
0.23577840626239777,
-0.026849139481782913,
0.018829621374607086,
-0.12459840625524521,
0.0004486647667363286,
-0.0048059262335300446,
-0.023355385288596153,
0.0931726023554802,
-0.08299873769283295,
-0.13033398985862732,
0.16407406330108643,
-0.027186540886759758,
0.16622412204742432,
0.1495974212884903,
0.19688820838928223,
0.05116363242268562,
0.15335489809513092,
0.10609954595565796,
0.07970421761274338,
0.03676251694560051,
-0.03067491576075554,
-0.09282566606998444,
0.021405506879091263,
-0.164025217294693,
0.04669925570487976,
0.13864991068840027,
-0.12398896366357803,
0.02497870661318302,
0.05730264261364937,
0.05873735621571541,
0.18854323029518127,
0.07822155207395554,
-0.2538689970970154,
0.01973096653819084,
0.055058181285858154,
-0.07391679286956787,
0.022711774334311485,
0.06807753443717957,
0.0036701757926493883,
-0.0067734792828559875,
-0.010745672509074211,
-0.06353491544723511,
0.0965924933552742,
-0.04308589547872543,
0.04941694438457489,
0.01959800533950329,
0.16887013614177704,
0.07060588151216507,
0.08728066831827164,
-0.09544848650693893,
0.18648295104503632,
-0.03208779916167259,
-0.011359825730323792,
-0.08284597098827362,
0.004318110644817352,
0.13521210849285126,
0.019071491435170174,
0.09255356341600418,
-0.044269099831581116,
-0.121957927942276,
0.006323229055851698,
-0.23562082648277283,
0.08331291377544403,
-0.05058889836072922,
-0.09113127738237381,
-0.011328799650073051,
-0.05844489112496376,
0.009289783425629139,
-0.08998627215623856,
0.09676292538642883,
-0.17567263543605804,
-0.11333465576171875,
0.04538518562912941,
0.12856784462928772,
0.11294994503259659,
-0.0013648143503814936,
-0.030656781047582626,
-0.0699990838766098,
-0.011223812587559223,
0.20029914379119873,
0.00366450403816998,
-0.07581856101751328,
-0.0320870466530323,
0.15712374448776245,
-0.039278823882341385,
0.04287504032254219,
-0.03278914466500282,
0.0889153853058815,
0.014466522261500359,
-0.054190460592508316,
0.049298208206892014,
-0.06592636555433273,
-0.02200072817504406,
-0.010078194551169872,
0.043240293860435486,
-0.006756619084626436,
0.08686710149049759,
0.08034902811050415,
-0.02028435468673706,
-0.08227753639221191,
-0.12333385646343231,
-0.08462619036436081,
0.02255922183394432,
0.014176737517118454,
-0.03690947964787483,
-0.1973867118358612,
-0.16155937314033508,
-0.09742958098649979,
-0.0629657432436943,
0.15201593935489655,
0.14297084510326385,
-0.08191602677106857,
0.00922099594026804,
0.21116870641708374,
0.06292436271905899,
-0.19491003453731537,
-0.24432238936424255,
-0.03983849659562111,
0.01585426554083824,
0.09687988460063934,
-0.1933489441871643,
0.08233629912137985,
0.182514950633049,
-0.06398133188486099,
-0.07713980972766876,
-0.18120619654655457,
-0.028968514874577522,
0.15624676644802094,
0.08880409598350525,
0.034364253282547,
-0.12160810083150864,
-0.019484393298625946,
-0.08997685462236404,
-0.06645027548074722,
0.1836807280778885,
-0.1339590847492218,
0.10656681656837463,
0.047402817755937576,
-0.03403366357088089,
-0.003687099553644657,
0.005388102028518915,
0.18110449612140656,
-0.06074931100010872,
-0.03121829405426979,
-0.07036804407835007,
-0.040840599685907364,
-0.03660854324698448,
0.002168794395402074,
0.18515387177467346,
-0.12171797454357147,
-0.038098592311143875,
-0.10541598498821259,
-0.06389426440000534,
0.03395857289433479,
-0.07763153314590454,
0.060194969177246094,
-0.03642268478870392,
-0.10219133645296097,
0.08158847689628601,
-0.08674930036067963,
0.04596758633852005,
0.08569365739822388,
0.025330739095807076,
-0.13958010077476501,
-0.041966404765844345,
0.13235506415367126,
-0.05054902657866478,
0.16125746071338654,
-0.10117390006780624,
-0.035509366542100906,
0.06462109833955765,
-0.08589782565832138,
-0.020662501454353333,
0.05795184150338173,
-0.13294143974781036,
0.05381989851593971,
-0.034517206251621246,
-0.054303109645843506,
0.06638223677873611,
0.13539311289787292,
-0.08174611628055573,
-0.26379698514938354,
-0.028586648404598236,
0.18116533756256104,
-0.011824109591543674,
0.03490758687257767,
0.007525808177888393,
-0.06581656634807587,
-0.09945452958345413,
0.03421634063124657,
0.015848571434617043,
-0.058819036930799484,
0.011253371834754944,
0.03603195771574974,
-0.025691984221339226,
-0.12638072669506073,
0.05936821177601814,
0.06823792308568954,
-0.0941934734582901,
-0.032972872257232666,
0.000624003354460001,
-0.0942000150680542,
-0.1938478946685791,
-0.17557795345783234,
-0.05545451119542122,
-0.04006402567028999,
-0.05628135800361633,
-0.021884309127926826,
-0.08124807476997375,
0.018083522096276283,
-0.1502816379070282,
0.14475713670253754,
-0.038145869970321655,
0.006252584047615528,
-0.0209975466132164,
-0.031452372670173645,
0.006174853537231684,
0.025056565180420876,
0.009395711123943329,
-0.1390637904405594,
-0.13066346943378448,
0.024542182683944702,
0.014182592742145061,
-0.07936779409646988,
0.028898935765028,
-0.07041247189044952,
-0.013368768617510796,
-0.19853784143924713,
-0.016248196363449097,
-0.22301587462425232,
-0.04428691789507866,
0.07373011857271194,
-0.06579138338565826,
-0.0643509179353714,
-0.008556312881410122,
-0.10626900941133499,
0.011290484108030796,
-0.001610172912478447,
0.03655340522527695,
-0.014065271243453026,
0.1284508854150772,
0.09549132734537125,
0.022866301238536835,
0.0911332368850708,
-0.012985085137188435,
0.03078892081975937,
-0.002364721614867449,
0.005114673636853695,
-0.008743821643292904,
0.026989391073584557,
0.041291479021310806,
0.036686498671770096,
-0.09319939464330673,
0.051485151052474976,
0.07455861568450928,
0.012060899287462234,
-0.016920236870646477,
0.05300012603402138,
-0.06268929690122604,
-0.04640784114599228,
0.1739540696144104,
-0.15719911456108093,
0.021751491352915764,
-0.14736860990524292,
0.1444556564092636,
-0.010513187386095524,
0.2193831503391266,
0.06425803899765015,
0.006590514909476042,
-0.06233430281281471,
0.09131742268800735,
-0.11381170153617859,
-0.04915030673146248,
-0.09661798924207687,
-0.12367360293865204,
-0.06325404345989227,
-0.007560494355857372,
0.21556709706783295,
0.05387962982058525,
-0.019160475581884384,
0.01564701832830906,
0.10275020450353622,
0.018284136429429054,
0.0486762709915638,
0.19745464622974396,
0.15271995961666107,
0.02341005951166153,
-0.0889909639954567,
-0.01627037115395069,
0.01350808423012495,
-0.0853503867983818,
0.0018235126044601202,
0.06810587644577026,
0.009528014808893204,
0.10209719091653824,
0.04158112779259682,
0.07730802148580551,
-0.08426982909440994,
-0.036100927740335464,
-0.054412636905908585,
-0.039002493023872375,
0.025442468002438545,
0.14708547294139862,
0.1561807543039322,
-0.045414891093969345,
0.00960021186619997,
-0.044451650232076645,
-0.014602012000977993,
-0.11235427111387253,
-0.14517642557621002,
-0.05312458053231239,
-0.11602789163589478,
-0.012407226487994194,
-0.012481655925512314,
-0.10946076363325119,
0.19402679800987244,
0.0020146952010691166,
-0.10891479253768921,
0.15836599469184875,
-0.0684225931763649,
-0.05141875520348549,
-0.01276442687958479,
0.01876121386885643,
-0.07850293815135956,
-0.0022818923462182283,
-0.10077759623527527,
-0.10132168978452682,
0.02481774054467678,
0.01985805295407772,
0.004291232209652662,
-0.08486028015613556,
0.017950383946299553,
-0.0426529161632061,
-0.025940028950572014,
-0.03003011643886566,
-0.0709792897105217,
-0.036955878138542175,
0.031351156532764435,
0.0058463444001972675,
0.01388684380799532,
0.07808863371610641,
0.07333865016698837,
0.018339574337005615,
-0.01919405534863472,
-0.2616608738899231,
0.24205926060676575,
-0.004039466846734285,
0.020393166691064835,
-0.010021558031439781,
-0.02907947264611721,
-0.009054461494088173,
0.3215360641479492,
0.23173850774765015,
-0.19331027567386627,
-0.05439596250653267,
0.012476698495447636,
0.0026319981552660465,
-0.07542501389980316,
0.1094593033194542,
0.018875539302825928,
0.00423765042796731,
-0.05596840754151344,
0.04422816261649132,
-0.05862133949995041,
-0.050088413059711456,
-0.1257522851228714,
-0.04494333639740944,
0.05905360355973244,
-0.022934434935450554,
-0.03221585974097252,
0.10358119010925293,
-0.13487578928470612,
0.16533994674682617,
-0.10657414048910141,
0.05017029866576195,
-0.07994366437196732,
0.028935737907886505,
0.0966583639383316,
0.07425305992364883,
0.07716985791921616,
-0.06533738225698471,
0.0058298250660300255,
0.1655988246202469,
-0.0008596695261076093,
-0.18927240371704102,
0.013590690679848194,
0.1733534038066864,
0.06023263931274414,
0.19089101254940033,
0.01796259731054306,
-0.04964626953005791,
0.06430752575397491,
-0.045803219079971313,
-0.17122182250022888,
0.1270221322774887,
0.01922406442463398,
-0.06127423793077469,
0.04282431676983833,
0.008042684756219387,
-0.024938564747571945,
-0.09220714867115021,
0.030806705355644226,
0.015598767437040806,
0.012003421783447266,
-0.06450504809617996,
0.10216627269983292,
0.005314295180141926,
0.15338653326034546,
-0.1475324183702469,
0.07550209760665894,
0.0819275975227356,
-0.06108740344643593,
-0.012732340954244137,
-0.043435707688331604,
0.12170585989952087,
0.027922818437218666,
-0.05581796541810036,
-0.08375987410545349,
-0.0668790340423584,
-0.08820337802171707,
0.061364442110061646,
0.004165459889918566,
-0.106651172041893,
0.01591005176305771,
-0.057940948754549026,
-0.012328415177762508,
-0.09319394081830978,
-0.02908286266028881,
0.21767060458660126,
0.016501590609550476,
0.006692094262689352,
-0.0009450694778934121,
-0.010726286098361015,
-0.06910049915313721,
-0.11001616716384888,
-0.08330272138118744
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dropoff-utcustom-train-SF-RGBD-b5_5
This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (see the metric computation sketch after this list):
- Loss: 0.2636
- Mean Iou: 0.4256
- Mean Accuracy: 0.6832
- Overall Accuracy: 0.9656
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3752
- Accuracy Undropoff: 0.9912
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.3118
- Iou Undropoff: 0.9650
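
The per-class and mean figures above are standard semantic-segmentation metrics. The original evaluation script is not included in this card; as a hedged illustration, numbers of this kind can be computed with the `evaluate` library's `mean_iou` metric, where the label order, the `ignore_index`, and the dummy inputs below are assumptions made for the example.

```python
# Hedged sketch: computing mean IoU / accuracy metrics of the kind reported above.
# Label order (0=unlabeled, 1=dropoff, 2=undropoff) and ignore_index are assumptions.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Lists of HxW integer label maps; real usage would pass model predictions and ground truth.
predictions = [np.random.randint(0, 3, (32, 32))]
references = [np.random.randint(0, 3, (32, 32))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```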
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding setup follows the list):
- learning_rate: 9e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
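
As a hedged sketch (not the original training script), the hyperparameters above map onto the `transformers` `Trainer` API roughly as follows; the output directory, label count, and dataset wiring are illustrative assumptions.

```python
# Hedged sketch of the listed hyperparameters expressed as TrainingArguments.
# Output directory, num_labels, and the dataset objects are assumptions for illustration.
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b5",
    num_labels=3,  # assumption: unlabeled / dropoff / undropoff, per the metrics above
)

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b5_5",  # assumed output directory
    learning_rate=9e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)

# trainer = Trainer(model=model, args=training_args, train_dataset=..., eval_dataset=...)
# trainer.train()
```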
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1487 | 5.0 | 10 | 1.0250 | 0.2562 | 0.6276 | 0.6778 | nan | 0.5730 | 0.6823 | 0.0 | 0.0971 | 0.6714 |
| 1.0128 | 10.0 | 20 | 0.9030 | 0.3142 | 0.6730 | 0.8268 | nan | 0.5053 | 0.8407 | 0.0 | 0.1195 | 0.8231 |
| 0.8561 | 15.0 | 30 | 0.7359 | 0.3520 | 0.6913 | 0.8949 | nan | 0.4692 | 0.9133 | 0.0 | 0.1632 | 0.8928 |
| 0.7551 | 20.0 | 40 | 0.6534 | 0.3634 | 0.6999 | 0.9090 | nan | 0.4719 | 0.9280 | 0.0 | 0.1829 | 0.9072 |
| 0.6236 | 25.0 | 50 | 0.5938 | 0.3710 | 0.7001 | 0.9189 | nan | 0.4614 | 0.9388 | 0.0 | 0.1955 | 0.9173 |
| 0.4977 | 30.0 | 60 | 0.5293 | 0.3850 | 0.6987 | 0.9341 | nan | 0.4420 | 0.9555 | 0.0 | 0.2222 | 0.9329 |
| 0.4188 | 35.0 | 70 | 0.4859 | 0.3935 | 0.6941 | 0.9425 | nan | 0.4231 | 0.9650 | 0.0 | 0.2390 | 0.9415 |
| 0.3532 | 40.0 | 80 | 0.4278 | 0.4019 | 0.6823 | 0.9519 | nan | 0.3881 | 0.9764 | 0.0 | 0.2547 | 0.9511 |
| 0.3187 | 45.0 | 90 | 0.3914 | 0.4098 | 0.6873 | 0.9560 | nan | 0.3942 | 0.9804 | 0.0 | 0.2742 | 0.9553 |
| 0.2631 | 50.0 | 100 | 0.3647 | 0.4134 | 0.6918 | 0.9575 | nan | 0.4020 | 0.9815 | 0.0 | 0.2835 | 0.9567 |
| 0.2565 | 55.0 | 110 | 0.3424 | 0.4141 | 0.6895 | 0.9585 | nan | 0.3962 | 0.9829 | 0.0 | 0.2846 | 0.9578 |
| 0.2259 | 60.0 | 120 | 0.3127 | 0.4178 | 0.6853 | 0.9613 | nan | 0.3843 | 0.9863 | 0.0 | 0.2926 | 0.9607 |
| 0.2263 | 65.0 | 130 | 0.2920 | 0.4202 | 0.6822 | 0.9632 | nan | 0.3757 | 0.9886 | 0.0 | 0.2981 | 0.9626 |
| 0.1961 | 70.0 | 140 | 0.2755 | 0.4218 | 0.6769 | 0.9649 | nan | 0.3627 | 0.9911 | 0.0 | 0.3009 | 0.9644 |
| 0.1897 | 75.0 | 150 | 0.2726 | 0.4232 | 0.6803 | 0.9650 | nan | 0.3698 | 0.9908 | 0.0 | 0.3052 | 0.9645 |
| 0.1863 | 80.0 | 160 | 0.2762 | 0.4241 | 0.6830 | 0.9649 | nan | 0.3756 | 0.9904 | 0.0 | 0.3079 | 0.9643 |
| 0.1656 | 85.0 | 170 | 0.2730 | 0.4241 | 0.6809 | 0.9653 | nan | 0.3708 | 0.9911 | 0.0 | 0.3076 | 0.9648 |
| 0.1745 | 90.0 | 180 | 0.2740 | 0.4241 | 0.6821 | 0.9651 | nan | 0.3736 | 0.9907 | 0.0 | 0.3079 | 0.9645 |
| 0.1726 | 95.0 | 190 | 0.2779 | 0.4242 | 0.6854 | 0.9645 | nan | 0.3809 | 0.9898 | 0.0 | 0.3085 | 0.9639 |
| 0.158 | 100.0 | 200 | 0.2661 | 0.4248 | 0.6808 | 0.9656 | nan | 0.3701 | 0.9915 | 0.0 | 0.3094 | 0.9651 |
| 0.19 | 105.0 | 210 | 0.2667 | 0.4240 | 0.6790 | 0.9656 | nan | 0.3664 | 0.9916 | 0.0 | 0.3070 | 0.9651 |
| 0.1533 | 110.0 | 220 | 0.2696 | 0.4258 | 0.6843 | 0.9655 | nan | 0.3777 | 0.9910 | 0.0 | 0.3126 | 0.9649 |
| 0.1644 | 115.0 | 230 | 0.2690 | 0.4261 | 0.6855 | 0.9654 | nan | 0.3803 | 0.9908 | 0.0 | 0.3136 | 0.9648 |
| 0.1594 | 120.0 | 240 | 0.2636 | 0.4256 | 0.6832 | 0.9656 | nan | 0.3752 | 0.9912 | 0.0 | 0.3118 | 0.9650 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
| {"license": "other", "tags": ["vision", "image-segmentation", "generated_from_trainer"], "model-index": [{"name": "dropoff-utcustom-train-SF-RGBD-b5_5", "results": []}]} | image-segmentation | sam1120/dropoff-utcustom-train-SF-RGBD-b5_5 | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:25:21+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us
| dropoff-utcustom-train-SF-RGBD-b5\_5
====================================
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2636
* Mean Iou: 0.4256
* Mean Accuracy: 0.6832
* Overall Accuracy: 0.9656
* Accuracy Unlabeled: nan
* Accuracy Dropoff: 0.3752
* Accuracy Undropoff: 0.9912
* Iou Unlabeled: 0.0
* Iou Dropoff: 0.3118
* Iou Undropoff: 0.9650
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 9e-06
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.05
* num\_epochs: 120
### Training results
### Framework versions
* Transformers 4.30.2
* Pytorch 2.0.1+cu117
* Datasets 2.13.1
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
48,
117,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #segformer #vision #image-segmentation #generated_from_trainer #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 9e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.05\n* num\\_epochs: 120### Training results### Framework versions\n\n\n* Transformers 4.30.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.13.1\n* Tokenizers 0.13.3"
] | [
-0.10685057938098907,
0.03626430779695511,
-0.001935673295520246,
0.11635380983352661,
0.17187130451202393,
0.027859587222337723,
0.11530084162950516,
0.11533981561660767,
-0.10984227806329727,
0.03149544820189476,
0.10524952411651611,
0.143252432346344,
0.01654832810163498,
0.09448825567960739,
-0.019710686057806015,
-0.3064554035663605,
-0.02497927099466324,
0.032077379524707794,
-0.0844716802239418,
0.12509827315807343,
0.06434638053178787,
-0.16338524222373962,
0.09151484072208405,
-0.0040703509002923965,
-0.2204686552286148,
0.016166582703590393,
-0.005493758711963892,
-0.030991580337285995,
0.1595882922410965,
0.023075681179761887,
0.1162770614027977,
0.00943325087428093,
0.11400919407606125,
-0.2004028707742691,
0.0177561417222023,
0.05652876943349838,
-0.004191652871668339,
0.06761171668767929,
0.06547973304986954,
0.0025243882555514574,
0.15156961977481842,
-0.10634521394968033,
0.0674901083111763,
0.001579924370162189,
-0.14478418231010437,
-0.2124166041612625,
-0.0751550942659378,
0.02231522649526596,
0.07801754772663116,
0.09615685790777206,
-0.005620093550533056,
0.11418497562408447,
-0.09215269982814789,
0.11302897334098816,
0.2677992880344391,
-0.24256062507629395,
-0.08603566139936447,
0.038885246962308884,
0.002372533082962036,
0.06614953279495239,
-0.13294051587581635,
0.007953571155667305,
0.032668136060237885,
0.04610496014356613,
0.11620113253593445,
-0.03284338861703873,
-0.101409912109375,
0.02692420780658722,
-0.13881252706050873,
-0.03254663944244385,
0.05572222173213959,
0.053394194692373276,
-0.020368918776512146,
-0.03148704394698143,
-0.0676790252327919,
-0.1815362423658371,
-0.06611555069684982,
0.01311821211129427,
0.06654393672943115,
-0.06034168601036072,
-0.11456529796123505,
-0.015056446194648743,
-0.11003416031599045,
-0.08572620153427124,
-0.05005105957388878,
0.12854772806167603,
0.03410562872886658,
0.019352080300450325,
-0.03423167020082474,
0.12678693234920502,
-0.027194691821932793,
-0.14028315246105194,
0.017776047810912132,
0.030542144551873207,
-0.042405419051647186,
-0.031596340239048004,
-0.04922625422477722,
-0.06509648263454437,
-0.013548120856285095,
0.10681724548339844,
-0.05883427709341049,
0.06821789592504501,
0.03543945029377937,
0.05088932067155838,
-0.11458641290664673,
0.19146117568016052,
-0.06730221211910248,
-0.007298177573829889,
-0.036600541323423386,
0.05908145010471344,
0.0043907309882342815,
-0.02243092469871044,
-0.10571130365133286,
0.004728015512228012,
0.0698428750038147,
-0.00863322988152504,
-0.08780333399772644,
0.07071921229362488,
-0.03927977755665779,
-0.011372084729373455,
0.0008907864685170352,
-0.07625842839479446,
0.04624326899647713,
-0.0006231636507436633,
-0.08423244208097458,
-0.029023591428995132,
0.05135698243975639,
0.013989760540425777,
0.013417830690741539,
0.16561472415924072,
-0.08720922470092773,
0.0630008727312088,
-0.11391830444335938,
-0.1005091592669487,
0.0006727701984345913,
-0.08795957267284393,
0.0363733172416687,
-0.07745591551065445,
-0.15050405263900757,
-0.009602072648704052,
0.0712776929140091,
-0.04039333015680313,
0.0037425514310598373,
-0.05333979055285454,
-0.09146621823310852,
0.0031692148186266422,
-0.008615581318736076,
0.16386833786964417,
-0.06535213440656662,
0.12305673211812973,
0.037510260939598083,
0.07248316705226898,
-0.06562100350856781,
0.039912428706884384,
-0.08563850075006485,
0.01988103985786438,
-0.2224932163953781,
0.04311473295092583,
-0.051310792565345764,
0.06823128461837769,
-0.05971883237361908,
-0.12216491997241974,
0.007363898679614067,
0.002198630478233099,
0.09234175831079483,
0.1064518466591835,
-0.22506582736968994,
-0.07562999427318573,
0.1481783390045166,
-0.07299895584583282,
-0.0986948311328888,
0.1127622202038765,
-0.06419426947832108,
0.012417944148182869,
0.061098065227270126,
0.19991451501846313,
0.053843773901462555,
-0.1367926448583603,
0.021609637886285782,
-0.015871532261371613,
0.04890686273574829,
-0.02798183262348175,
0.050245851278305054,
0.022397510707378387,
0.08821465075016022,
0.019241036847233772,
-0.06581854820251465,
0.06778544932603836,
-0.1236887201666832,
-0.09655162692070007,
-0.02543337270617485,
-0.08594772219657898,
0.04241475462913513,
0.0907178595662117,
0.06137683242559433,
-0.10548804700374603,
-0.07827889919281006,
0.09143070131540298,
0.07605335861444473,
-0.06874499469995499,
0.03947000950574875,
-0.06556328386068344,
0.044134289026260376,
-0.01738561876118183,
-0.03630686178803444,
-0.17513135075569153,
-0.0254961010068655,
-0.02173396572470665,
0.03430168703198433,
0.030381524935364723,
0.022684089839458466,
0.09147888422012329,
0.08866636455059052,
-0.07124453783035278,
-0.025394883006811142,
-0.06514831632375717,
0.0025242348201572895,
-0.12216109782457352,
-0.22858741879463196,
-0.04355696588754654,
-0.00815774966031313,
0.08776155859231949,
-0.21201197803020477,
0.02407139725983143,
0.02383519895374775,
0.08827987313270569,
0.025346165522933006,
-0.031486958265304565,
-0.05262213200330734,
0.07697917520999908,
-0.01048391591757536,
-0.0657806470990181,
0.0699291080236435,
-0.005575011018663645,
-0.06857788562774658,
-0.055325381457805634,
-0.11398743838071823,
0.16212861239910126,
0.1343807429075241,
-0.14744463562965393,
-0.09236977249383926,
-0.010949798859655857,
-0.06368359923362732,
-0.0333402119576931,
-0.04263182729482651,
0.038925040513277054,
0.1805073618888855,
-0.00012792288907803595,
0.13281702995300293,
-0.0612633116543293,
-0.035053350031375885,
0.029012465849518776,
-0.027209581807255745,
0.027399972081184387,
0.1295827329158783,
0.12509913742542267,
-0.06315279752016068,
0.12457630038261414,
0.12523476779460907,
-0.08054209500551224,
0.14924952387809753,
-0.0337057001888752,
-0.08058784157037735,
-0.018124591559171677,
-0.01498951856046915,
-0.008144272491335869,
0.1774190068244934,
-0.15068712830543518,
-0.017481965944170952,
-0.004701419733464718,
0.014108945615589619,
0.015137489885091782,
-0.25119826197624207,
-0.056064672768116,
0.03871864080429077,
-0.04418681934475899,
-0.010122931562364101,
-0.024731511250138283,
-0.004263777751475573,
0.10469487309455872,
-0.006829763762652874,
-0.07522596418857574,
0.0009397919639013708,
-0.007544955238699913,
-0.04874071478843689,
0.20733216404914856,
-0.05867748707532883,
-0.11888998746871948,
-0.09063097834587097,
-0.07728288322687149,
-0.03643593192100525,
0.0032626588363200426,
0.05800759047269821,
-0.10886978358030319,
-0.018631264567375183,
-0.05953642725944519,
0.018379520624876022,
0.00653329212218523,
0.03581196814775467,
-0.0010796820279210806,
-0.008305853232741356,
0.05599924176931381,
-0.09695360064506531,
-0.009751099161803722,
-0.06640910357236862,
-0.05221163481473923,
0.0540371835231781,
0.0599382221698761,
0.1480494737625122,
0.13528180122375488,
-0.026023129001259804,
0.01946074888110161,
-0.032555028796195984,
0.25728389620780945,
-0.09593669325113297,
-0.027019720524549484,
0.1184641644358635,
-0.012993209064006805,
0.05643979832530022,
0.10674209147691727,
0.08224165439605713,
-0.10914762318134308,
-0.0021797730587422848,
0.06342672556638718,
-0.05210611969232559,
-0.15560463070869446,
-0.01492993999272585,
-0.05806518718600273,
-0.02997620962560177,
0.07656913250684738,
0.02729225903749466,
-0.0039429329335689545,
0.055973973125219345,
0.0485488697886467,
0.0422152541577816,
-0.024820512160658836,
0.05030658096075058,
0.08844783157110214,
0.031934622675180435,
0.10927179455757141,
-0.04496106877923012,
-0.06661403179168701,
0.03137355297803879,
0.003460187464952469,
0.2442261427640915,
-0.01649155281484127,
0.09702381491661072,
0.07324043661355972,
0.1625985950231552,
-0.012576618231832981,
0.04870334640145302,
-0.016265448182821274,
-0.06828615814447403,
-0.019438516348600388,
-0.0442563071846962,
-0.017400460317730904,
0.009829409420490265,
-0.052364785224199295,
0.03955406695604324,
-0.12583647668361664,
0.00892991479486227,
0.06750801205635071,
0.24912838637828827,
0.029018839821219444,
-0.3183880150318146,
-0.06565827131271362,
-0.0056962138041853905,
-0.010914385318756104,
-0.009167463518679142,
0.006561817601323128,
0.15286700427532196,
-0.08080077916383743,
0.05641217902302742,
-0.08472999930381775,
0.08544473350048065,
-0.036699049174785614,
0.05069311335682869,
0.07709122449159622,
0.07387557625770569,
-0.004402882419526577,
0.05627644807100296,
-0.2844226062297821,
0.30184653401374817,
0.0019412569236010313,
0.08472838252782822,
-0.06408196687698364,
-0.03190705552697182,
0.03334270417690277,
0.08087810128927231,
0.08644286543130875,
-0.015275918878614902,
-0.02278541401028633,
-0.21447885036468506,
-0.021816356107592583,
0.03074919618666172,
0.12930479645729065,
-0.017003260552883148,
0.10420249402523041,
-0.009626083076000214,
-0.0053878407925367355,
0.07411561906337738,
-0.0013217201922088861,
-0.032375771552324295,
-0.09013956040143967,
-0.02635461464524269,
-0.025063641369342804,
-0.04987993463873863,
-0.0583735927939415,
-0.10665848106145859,
-0.1148701012134552,
0.11142092943191528,
0.01925581507384777,
-0.013891641981899738,
-0.1200677752494812,
0.09841684997081757,
0.07931535691022873,
-0.07563216239213943,
0.04060979560017586,
0.031621385365724564,
0.0567694790661335,
0.03328625485301018,
-0.057950254529714584,
0.11829067021608353,
-0.059832677245140076,
-0.15997421741485596,
-0.0567440502345562,
0.09150678664445877,
0.05084343999624252,
0.05709720030426979,
-0.024659184738993645,
0.016479352489113808,
-0.017775993794202805,
-0.09203257411718369,
0.05514802038669586,
-0.044719912111759186,
0.06380371004343033,
0.01107920054346323,
-0.020117932930588722,
0.05097772553563118,
-0.056350190192461014,
-0.012211905792355537,
0.14652156829833984,
0.28519129753112793,
-0.08895742893218994,
0.013015838339924812,
0.018084168434143066,
-0.06592965126037598,
-0.19147610664367676,
0.08019894361495972,
0.05769340693950653,
0.000294802593998611,
0.08597011864185333,
-0.16697928309440613,
0.09797414392232895,
0.10399558395147324,
0.0006753257475793362,
0.11591461300849915,
-0.3673976957798004,
-0.12822552025318146,
0.08067037910223007,
0.1909283697605133,
0.07638444751501083,
-0.1555989533662796,
0.0011150550562888384,
-0.0020477990619838238,
-0.14684085547924042,
0.0915546864271164,
-0.0774684026837349,
0.1356402039527893,
-0.019881173968315125,
0.08665145933628082,
0.016510190442204475,
-0.06163441762328148,
0.12189887464046478,
-0.003514062613248825,
0.14098508656024933,
-0.06979281455278397,
-0.03949557989835739,
0.05547715723514557,
-0.03780345246195793,
-0.012532943859696388,
-0.046548739075660706,
0.027314508333802223,
-0.061505403369665146,
-0.011533861048519611,
-0.10497517883777618,
0.01234265137463808,
-0.03862180560827255,
-0.0665336400270462,
-0.046126287430524826,
0.043440841138362885,
0.04481000825762749,
-0.004333494696766138,
0.15160907804965973,
-0.009838461875915527,
0.11446475982666016,
0.048616085201501846,
0.059357717633247375,
-0.0625092014670372,
-0.10644620656967163,
-0.017419226467609406,
0.009137879125773907,
0.04812520742416382,
-0.13459797203540802,
0.014520320110023022,
0.15316778421401978,
0.05033844709396362,
0.12206333130598068,
0.08668199926614761,
-0.032152093946933746,
0.032478272914886475,
0.06915321201086044,
-0.15726125240325928,
-0.11348368972539902,
0.002647005720064044,
-0.0666738972067833,
-0.07283763587474823,
0.05314626917243004,
0.07690896093845367,
-0.07524632662534714,
0.01250340323895216,
-0.006433835253119469,
0.00623489823192358,
-0.06747746467590332,
0.20530791580677032,
0.05628814920783043,
0.04134016111493111,
-0.10373891890048981,
0.07344047725200653,
0.018822547048330307,
-0.08832450956106186,
-0.001273069647140801,
0.091876320540905,
-0.06927075237035751,
-0.02488582767546177,
0.08068958669900894,
0.19198188185691833,
-0.0760442391037941,
-0.022711295634508133,
-0.15018387138843536,
-0.10663120448589325,
0.0696946233510971,
0.1859212964773178,
0.10009218007326126,
-0.0066929589956998825,
-0.05261373147368431,
0.04718083515763283,
-0.11769203096628189,
0.07785926759243011,
0.02375919185578823,
0.08150064945220947,
-0.149414524435997,
0.18238325417041779,
0.011515012942254543,
0.05549228936433792,
-0.0262591689825058,
0.03299567103385925,
-0.11901942640542984,
0.04042082652449608,
-0.11323829740285873,
-0.03656476363539696,
-0.015819627791643143,
0.004982688929885626,
-0.013605719432234764,
-0.06250959634780884,
-0.06261712312698364,
0.004969872068613768,
-0.12757059931755066,
-0.022049032151699066,
0.04597632214426994,
0.0226356890052557,
-0.12621454894542694,
-0.039179492741823196,
0.027957504615187645,
-0.0634961947798729,
0.05570286884903908,
0.0362657755613327,
0.014641729183495045,
0.06589099019765854,
-0.1722085177898407,
-0.021717514842748642,
0.06956680864095688,
-0.00652063125744462,
0.06333942711353302,
-0.03564884886145592,
-0.026136288419365883,
-0.02971090003848076,
0.08751439303159714,
0.01264885812997818,
0.06254559755325317,
-0.13715514540672302,
0.005597017705440521,
-0.03272739425301552,
-0.09302836656570435,
-0.05889085307717323,
0.053976111114025116,
0.06266439706087112,
0.03673491254448891,
0.1626386046409607,
-0.08305823802947998,
0.04478827118873596,
-0.21874047815799713,
-0.016379784792661667,
0.0019039716571569443,
-0.10824413597583771,
-0.08191505819559097,
-0.07234217971563339,
0.08313220739364624,
-0.07516274601221085,
0.1098691001534462,
0.03700150176882744,
0.06477745622396469,
0.031236430630087852,
-0.03266031667590141,
-0.0035936387721449137,
0.03474467247724533,
0.21087779104709625,
0.010864556767046452,
-0.033186934888362885,
0.08911871165037155,
0.07923591136932373,
0.09979009628295898,
0.13633733987808228,
0.22690275311470032,
0.15538907051086426,
-0.025031035766005516,
0.0895581841468811,
0.05219664424657822,
-0.06445913761854172,
-0.17318131029605865,
0.037061262875795364,
-0.052631620317697525,
0.0983063206076622,
-0.06133917346596718,
0.20385372638702393,
0.08528558909893036,
-0.18248873949050903,
0.06686433404684067,
-0.04619051143527031,
-0.1014547124505043,
-0.07912042737007141,
-0.036780521273612976,
-0.06989338994026184,
-0.14807648956775665,
0.025808745995163918,
-0.10289522260427475,
0.043146952986717224,
0.15061259269714355,
0.010142462328076363,
-0.013186412863433361,
0.21496398746967316,
0.03361218795180321,
0.03597399592399597,
0.05797773599624634,
0.01461877766996622,
-0.02994365803897381,
-0.09197406470775604,
-0.060405321419239044,
0.01852886751294136,
-0.02963108941912651,
0.018435228615999222,
-0.06895328313112259,
-0.07708732038736343,
0.026520729064941406,
0.005244007334113121,
-0.09371335804462433,
0.023878421634435654,
0.020549390465021133,
0.09170868992805481,
0.026401499286293983,
0.006316365208476782,
0.016879843547940254,
-0.02882014960050583,
0.24586759507656097,
-0.09307842701673508,
-0.08082079142332077,
-0.08126507699489594,
0.21629635989665985,
0.031283751130104065,
0.00038532153121195734,
0.008487934246659279,
-0.08194472640752792,
0.009446374140679836,
0.22900176048278809,
0.17176134884357452,
-0.13262340426445007,
-0.010672206990420818,
0.00022125232499092817,
0.0014990816125646234,
-0.030007947236299515,
0.11874573677778244,
0.12176579236984253,
0.05226140096783638,
-0.1143321618437767,
-0.053072117269039154,
-0.05319010466337204,
-0.01884022355079651,
-0.026799771934747696,
0.04893719404935837,
0.06688852608203888,
0.021754715591669083,
-0.07005330920219421,
0.07569805532693863,
-0.058277588337659836,
-0.14415089786052704,
0.1064227893948555,
-0.22888155281543732,
-0.1564146727323532,
-0.007347749080508947,
0.12239295989274979,
0.0041984254494309425,
0.06011074781417847,
-0.04187969118356705,
0.0014577142428606749,
0.04891619831323624,
-0.0053495257161557674,
-0.07872701436281204,
-0.1040332019329071,
0.08433252573013306,
-0.1156696304678917,
0.21760204434394836,
-0.05983545631170273,
0.03409330174326897,
0.11338185518980026,
0.06366975605487823,
-0.05050501599907875,
0.05644633620977402,
0.04183590039610863,
-0.12415521591901779,
-0.004520765971392393,
0.12383025139570236,
-0.03830607607960701,
0.05426766723394394,
0.032783906906843185,
-0.13324357569217682,
0.03243369236588478,
-0.056396275758743286,
-0.04118708148598671,
-0.027750611305236816,
-0.050637196749448776,
-0.06415977329015732,
0.1156311109662056,
0.20928682386875153,
-0.008331581018865108,
0.023522816598415375,
-0.08692961931228638,
0.015440301969647408,
0.06543692201375961,
0.04765712842345238,
-0.07815010845661163,
-0.2155001014471054,
0.006871576886624098,
0.07048681378364563,
-0.04139501601457596,
-0.20594075322151184,
-0.11099885404109955,
0.03691459447145462,
-0.054492104798555374,
-0.07167309522628784,
0.09040647745132446,
0.0892757773399353,
0.05621092766523361,
-0.055385008454322815,
-0.10468907654285431,
-0.05855516716837883,
0.17021453380584717,
-0.14701642096042633,
-0.0777096375823021
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
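
No usage snippet is provided above. The following is a hedged, generic sketch for a Qwen2 chat checkpoint served through `transformers`; the repository id is taken from this card, while the system prompt, generation settings, and chat-template behaviour are assumptions to verify against the upstream Qwen documentation.

```python
# Hedged sketch: generic transformers usage for a Qwen2 chat checkpoint.
# The prompt content and generation settings are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ISdept/qwen-7b-1_5-hi200-faq-ym-intents-lang"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Which intents can you classify?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```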
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
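The fields above map directly onto the calculator's inputs. As a rough, illustrative sketch (every number below is a placeholder assumption, not a measurement for this model), the estimate reduces to power draw × hours × grid carbon intensity:
```python
# Back-of-the-envelope estimate in the spirit of Lacoste et al. (2019).
# All values are placeholders -- substitute the real hardware, hours, and region data above.
gpu_power_kw = 0.3                # assumed average draw per GPU, in kW
num_gpus = 1                      # assumed hardware count
hours_used = 10.0                 # "Hours used" field above
grid_intensity_kg_per_kwh = 0.4   # assumed carbon intensity for the compute region

energy_kwh = gpu_power_kw * num_gpus * hours_used
carbon_kg = energy_kwh * grid_intensity_kg_per_kwh
print(f"Estimated emissions: {carbon_kg:.2f} kg CO2eq")
```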
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | ISdept/qwen-7b-1_5-hi200-faq-ym-intents-lang | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:25:24+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #qwen2 #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #qwen2 #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
52,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #qwen2 #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06744060665369034,
0.1237388551235199,
-0.004114609677344561,
0.02991606667637825,
0.11460870504379272,
0.005568372085690498,
0.06294357031583786,
0.10971193760633469,
-0.026014693081378937,
0.11581014841794968,
0.018924949690699577,
0.10499268025159836,
0.10659246146678925,
0.1691424399614334,
-0.006015846040099859,
-0.21231532096862793,
0.044865865260362625,
-0.13380737602710724,
-0.025073938071727753,
0.11961860954761505,
0.13043774664402008,
-0.12202122807502747,
0.06986955553293228,
-0.03994565084576607,
-0.009295043535530567,
-0.0361013263463974,
-0.05820033326745033,
-0.04808541759848595,
0.06927672773599625,
0.0690578892827034,
0.06336662918329239,
0.01922842301428318,
0.10299910604953766,
-0.2810887396335602,
0.0236574187874794,
0.08111110329627991,
0.002226806478574872,
0.07000467926263809,
0.06337219476699829,
-0.07296913117170334,
0.06984713673591614,
-0.06522127240896225,
0.14495620131492615,
0.08224987238645554,
-0.0922221839427948,
-0.19323916733264923,
-0.08794740587472916,
0.09357348084449768,
0.19385994970798492,
0.05913294479250908,
-0.03049401193857193,
0.12686537206172943,
-0.07434657961130142,
0.01852177456021309,
0.06567037850618362,
-0.08194528520107269,
-0.053086262196302414,
0.06812959164381027,
0.07113085687160492,
0.10160701721906662,
-0.13397133350372314,
-0.0072817932814359665,
0.03036416508257389,
0.013016993179917336,
0.10258961468935013,
0.017448842525482178,
0.11838137358427048,
0.04335033521056175,
-0.14493173360824585,
-0.038016412407159805,
0.0884561613202095,
0.04341543838381767,
-0.05371417850255966,
-0.24333322048187256,
-0.021258622407913208,
-0.033045537769794464,
-0.03133222460746765,
-0.048901937901973724,
0.046065423637628555,
-0.018345197662711143,
0.0746571272611618,
-0.00905180536210537,
-0.077952079474926,
-0.047369781881570816,
0.07820919156074524,
0.06576532125473022,
0.026357414200901985,
-0.0243342574685812,
0.00772935152053833,
0.11627262830734253,
0.09934048354625702,
-0.11843404918909073,
-0.049750957638025284,
-0.06367483735084534,
-0.08425901085138321,
-0.04867105185985565,
0.029223250225186348,
0.03197961300611496,
0.05072800815105438,
0.2138856053352356,
-0.0016585314879193902,
0.04777570813894272,
0.0300018060952425,
0.01629858836531639,
0.0634123831987381,
0.09685925394296646,
-0.058943528681993484,
-0.12131623923778534,
-0.022760409861803055,
0.10975006967782974,
0.002361652674153447,
-0.03354809433221817,
-0.04929806664586067,
0.0689367800951004,
0.017635801807045937,
0.12228328734636307,
0.07093650102615356,
0.01461301650851965,
-0.07341200113296509,
-0.0643758624792099,
0.17208924889564514,
-0.1599913388490677,
0.033031485974788666,
0.027699848636984825,
-0.049781136214733124,
-0.016962584108114243,
0.0206128042191267,
0.030544809997081757,
-0.009477566927671432,
0.08983151614665985,
-0.051631052047014236,
-0.03264494985342026,
-0.11271350830793381,
-0.05229318514466286,
0.022805018350481987,
0.02329850196838379,
-0.029599839821457863,
-0.04297630116343498,
-0.10461901128292084,
-0.0702618658542633,
0.08274642378091812,
-0.06679617613554001,
-0.04588131234049797,
-0.034392643719911575,
-0.08036767691373825,
0.012772615067660809,
0.006944936700165272,
0.11524419486522675,
-0.024861354380846024,
0.04965236783027649,
-0.05080482363700867,
0.07076980918645859,
0.12968726456165314,
0.0256124809384346,
-0.052786268293857574,
0.05227842554450035,
-0.23543758690357208,
0.10626004636287689,
-0.07104437053203583,
0.04600486531853676,
-0.16222067177295685,
-0.019692296162247658,
0.04013443738222122,
0.022423196583986282,
-0.0052419379353523254,
0.13304713368415833,
-0.20579689741134644,
-0.03484721481800079,
0.1778334081172943,
-0.10716996341943741,
-0.08844240009784698,
0.05829978361725807,
-0.05727203190326691,
0.12106184661388397,
0.046658918261528015,
-0.015959804877638817,
0.030861597508192062,
-0.14105893671512604,
-0.012573265470564365,
-0.05725134164094925,
-0.027953004464507103,
0.1594742387533188,
0.06174226105213165,
-0.04975385218858719,
0.06329082697629929,
0.017857130616903305,
-0.014720242470502853,
-0.047373462468385696,
-0.03508519008755684,
-0.10099945217370987,
0.009225212968885899,
-0.0735674798488617,
0.025139320641756058,
-0.03237168863415718,
-0.09091918170452118,
-0.030487151816487312,
-0.15721407532691956,
0.006027343682944775,
0.09086263924837112,
-0.0028123122174292803,
-0.02166888304054737,
-0.10495693236589432,
-0.015849687159061432,
0.023717699572443962,
0.0010735627729445696,
-0.14732947945594788,
-0.052729055285453796,
0.01963592879474163,
-0.16102278232574463,
0.03527507185935974,
-0.032337408512830734,
0.046559423208236694,
0.04404491186141968,
-0.044810350984334946,
-0.03644292429089546,
0.01527401339262724,
0.01702694222331047,
-0.01812152937054634,
-0.2757890224456787,
-0.016599029302597046,
-0.037502363324165344,
0.16484688222408295,
-0.2536672055721283,
0.044451385736465454,
0.052858345210552216,
0.12650004029273987,
0.011718528345227242,
-0.026840604841709137,
0.02031077817082405,
-0.06778053194284439,
-0.03378141298890114,
-0.060537584125995636,
-0.0102090397849679,
-0.036261335015296936,
-0.05234677344560623,
0.03442572429776192,
-0.16672758758068085,
-0.04233158379793167,
0.11038065701723099,
0.03841483220458031,
-0.1514066904783249,
-0.046796903014183044,
-0.04655757546424866,
-0.05544671788811684,
-0.06981822848320007,
-0.05111313611268997,
0.10990618914365768,
0.0552663654088974,
0.054820816963911057,
-0.06279280036687851,
-0.06714518368244171,
0.008098754100501537,
-0.023038236424326897,
-0.01628015749156475,
0.08303935825824738,
0.07147926092147827,
-0.12255207449197769,
0.09013188630342484,
0.0958702489733696,
0.08535332977771759,
0.10111390799283981,
0.0031223141122609377,
-0.08790350705385208,
-0.02990630455315113,
0.029989181086421013,
0.01356097124516964,
0.150030717253685,
-0.026905570179224014,
0.049839962273836136,
0.03979787230491638,
-0.007262712344527245,
0.005843297578394413,
-0.0978906974196434,
0.029100263491272926,
0.024840185418725014,
-0.011728756129741669,
0.036994971334934235,
-0.05755846947431564,
0.016809193417429924,
0.10532841086387634,
0.040135741233825684,
0.051635969430208206,
0.008006487041711807,
-0.05116545408964157,
-0.11712050437927246,
0.1763288974761963,
-0.11831972748041153,
-0.23028700053691864,
-0.12128487974405289,
-0.012982514686882496,
0.03150848671793938,
-0.012953351251780987,
0.025938911363482475,
-0.07433073222637177,
-0.11664986610412598,
-0.0922725722193718,
0.04694730415940285,
0.059740062803030014,
-0.08346977084875107,
-0.062362488359212875,
0.06679393351078033,
0.0457296296954155,
-0.1380528211593628,
0.026153815910220146,
0.035679563879966736,
-0.09117627143859863,
0.005887721199542284,
0.08140957355499268,
0.06103856489062309,
0.1818755865097046,
0.012728521600365639,
-0.023938871920108795,
0.019584620371460915,
0.20903365314006805,
-0.136505126953125,
0.10589402914047241,
0.13493265211582184,
-0.0703483521938324,
0.08147261291742325,
0.2107224464416504,
0.0418342649936676,
-0.10617547482252121,
0.04455582797527313,
0.034235551953315735,
-0.0238803718239069,
-0.25054290890693665,
-0.07808786630630493,
0.007576430216431618,
-0.06175751984119415,
0.06809944659471512,
0.08130444586277008,
0.09570267051458359,
0.01984638161957264,
-0.10488120466470718,
-0.06586658954620361,
0.05113326013088226,
0.11108365654945374,
-0.007418854162096977,
-0.012006757780909538,
0.0969165563583374,
-0.020286425948143005,
0.028002621605992317,
0.09235991537570953,
0.0084880031645298,
0.18746548891067505,
0.05100390687584877,
0.14692288637161255,
0.09142749756574631,
0.06584213674068451,
0.015684716403484344,
0.006666323635727167,
0.015644695609807968,
0.02073444239795208,
-0.014378254301846027,
-0.0880797803401947,
-0.0017288135131821036,
0.12815876305103302,
0.020411469042301178,
0.050393857061862946,
0.005088018253445625,
-0.032580070197582245,
0.08683152496814728,
0.17358696460723877,
0.010363306850194931,
-0.1908130794763565,
-0.07101033627986908,
0.06939493864774704,
-0.08181700855493546,
-0.10146915167570114,
-0.02635601907968521,
0.04305123910307884,
-0.17831183969974518,
0.014033086597919464,
-0.022382382303476334,
0.10410568863153458,
-0.11462701857089996,
-0.012489398010075092,
0.04906824603676796,
0.07298072427511215,
-0.016658522188663483,
0.06773389875888824,
-0.18002092838287354,
0.1395270675420761,
0.01758507452905178,
0.07150158286094666,
-0.08825206011533737,
0.08410486578941345,
0.003178939688950777,
0.0013509939890354872,
0.14415407180786133,
0.0013785995543003082,
-0.0523817352950573,
-0.10979107022285461,
-0.08634650707244873,
-0.009079654701054096,
0.13044366240501404,
-0.12778301537036896,
0.10016698390245438,
-0.01834736578166485,
-0.045373477041721344,
0.005183245521038771,
-0.11240560561418533,
-0.14056962728500366,
-0.1725207269191742,
0.04330243170261383,
-0.13124029338359833,
0.04465160518884659,
-0.10545487701892853,
-0.048093315213918686,
-0.05306214094161987,
0.19742146134376526,
-0.22286871075630188,
-0.07013117522001266,
-0.1519971340894699,
-0.05761480703949928,
0.119932159781456,
-0.04775578901171684,
0.08312731981277466,
0.012994625605642796,
0.18674440681934357,
0.014313536696135998,
-0.013770169578492641,
0.11090241372585297,
-0.10466983169317245,
-0.21406547725200653,
-0.10291838645935059,
0.14246919751167297,
0.13924811780452728,
0.041273895651102066,
0.0022257522214204073,
0.02827414683997631,
-0.014804026111960411,
-0.11688549816608429,
0.020713498815894127,
0.1711113303899765,
0.11356078088283539,
0.031762681901454926,
-0.045852549374103546,
-0.12838490307331085,
-0.08528922498226166,
-0.04527286812663078,
0.01937401480972767,
0.1929924041032791,
-0.07334718853235245,
0.17354312539100647,
0.15734395384788513,
-0.05666225776076317,
-0.1967383325099945,
0.02808118239045143,
0.04254651814699173,
0.0018926940392702818,
0.058352239429950714,
-0.19716250896453857,
0.0960150957107544,
0.0021078127902001143,
-0.054582200944423676,
0.11626559495925903,
-0.18086016178131104,
-0.1472223997116089,
0.055250246077775955,
0.06544214487075806,
-0.1867036670446396,
-0.12468403577804565,
-0.09152166545391083,
-0.040479280054569244,
-0.12750375270843506,
0.08364081382751465,
-0.015219016931951046,
0.011511581018567085,
0.03329310938715935,
0.02034589648246765,
0.010542148724198341,
-0.043612707406282425,
0.18297483026981354,
-0.0074994368478655815,
0.04291056841611862,
-0.07745802402496338,
-0.06123793497681618,
0.04548247158527374,
-0.06682101637125015,
0.0688505694270134,
-0.012457388453185558,
0.01576600968837738,
-0.10679414868354797,
-0.05470338836312294,
-0.03223368898034096,
0.019370099529623985,
-0.08504306524991989,
-0.10194364190101624,
-0.036353081464767456,
0.09871356934309006,
0.09517461061477661,
-0.037792425602674484,
-0.056679584085941315,
-0.08485732227563858,
0.04062115028500557,
0.20317383110523224,
0.18020522594451904,
0.053560756146907806,
-0.06437430530786514,
-0.006059312727302313,
-0.013237647712230682,
0.049002740532159805,
-0.22129850089550018,
0.05923459306359291,
0.041168149560689926,
0.03180031478404999,
0.11860810965299606,
-0.023935925215482712,
-0.1587793081998825,
-0.0502057746052742,
0.05410148575901985,
-0.07425004243850708,
-0.1685684472322464,
0.010434879921376705,
0.08286356180906296,
-0.1552492380142212,
-0.022906674072146416,
0.04575012996792793,
-0.020043641328811646,
-0.03438226878643036,
0.00707294000312686,
0.07919111847877502,
0.009836919605731964,
0.08478374034166336,
0.057017721235752106,
0.0959276556968689,
-0.10216023027896881,
0.06617968529462814,
0.08096546679735184,
-0.09338610619306564,
0.03410530090332031,
0.07545924931764603,
-0.07126593589782715,
-0.037233464419841766,
0.04482624679803848,
0.0918767899274826,
0.031775590032339096,
-0.050642579793930054,
0.012327476404607296,
-0.10012588649988174,
0.05418751388788223,
0.11697539687156677,
0.03980601206421852,
0.0020653458777815104,
0.0349934883415699,
0.04598642885684967,
-0.09361135214567184,
0.12619003653526306,
0.03253564611077309,
0.024358928203582764,
-0.044029660522937775,
-0.027948984876275063,
0.033686719834804535,
-0.020634718239307404,
-0.014900618232786655,
-0.04131974279880524,
-0.06906769424676895,
-0.011919837445020676,
-0.17663416266441345,
-0.0006877299747429788,
-0.03835081309080124,
0.008035878650844097,
0.01438689511269331,
-0.03798643499612808,
0.008271864615380764,
0.015990857034921646,
-0.07275852560997009,
-0.05440134555101395,
-0.01070401445031166,
0.10120883584022522,
-0.16839949786663055,
0.013798215426504612,
0.0738481730222702,
-0.11845122277736664,
0.08829576522111893,
0.01660950295627117,
0.004566526506096125,
0.03947852551937103,
-0.12990154325962067,
0.0469437912106514,
-0.015183643437922001,
0.017251212149858475,
0.051821283996105194,
-0.20713716745376587,
-0.005219681188464165,
-0.053738780319690704,
-0.054747533053159714,
-0.008454185910522938,
-0.028378764167428017,
-0.11614704132080078,
0.10657370090484619,
0.006339828949421644,
-0.07519937306642532,
-0.027563083916902542,
0.034499529749155045,
0.07487460225820541,
-0.031029552221298218,
0.1542745679616928,
-0.014918236993253231,
0.06987065821886063,
-0.1874280571937561,
-0.023337583988904953,
-0.014252493157982826,
0.024976249784231186,
-0.03739270567893982,
-0.01777520589530468,
0.05066380277276039,
-0.025644395500421524,
0.1947220265865326,
-0.02277233451604843,
0.05517526715993881,
0.06517178565263748,
-0.015353423543274403,
-0.025753356516361237,
0.10341554135084152,
0.055761225521564484,
0.015996338799595833,
0.03251899033784866,
0.007716674357652664,
-0.03165765851736069,
-0.005552713759243488,
-0.167100191116333,
0.07967466861009598,
0.16496649384498596,
0.08635497838258743,
-0.014588052406907082,
0.06132662668824196,
-0.11290588229894638,
-0.11605644226074219,
0.09777160733938217,
-0.056159622967243195,
-0.01740921474993229,
-0.062441661953926086,
0.13894620537757874,
0.1522199958562851,
-0.19082458317279816,
0.06211152300238609,
-0.06795507669448853,
-0.0487544871866703,
-0.10746019333600998,
-0.16687791049480438,
-0.05764069780707359,
-0.05954143404960632,
-0.020104030147194862,
-0.05745544657111168,
0.06959457695484161,
0.07283110171556473,
0.017621422186493874,
0.012575851753354073,
0.07775423675775528,
-0.017673097550868988,
0.00843984168022871,
0.026977673172950745,
0.06567810475826263,
0.013495570048689842,
-0.04381807893514633,
0.016235843300819397,
-0.00015613723371643573,
0.034048307687044144,
0.047009509056806564,
0.039173372089862823,
-0.03012777306139469,
0.005396591499447823,
-0.03004968911409378,
-0.1132737398147583,
0.04056783393025398,
-0.0245139729231596,
-0.06442589312791824,
0.13803128898143768,
0.026449358090758324,
-0.006702050566673279,
-0.025474393740296364,
0.2641041576862335,
-0.07600386440753937,
-0.09474562108516693,
-0.13578693568706512,
0.13365262746810913,
-0.0308542363345623,
0.06413768976926804,
0.033664409071207047,
-0.11381697654724121,
0.027896301820874214,
0.145524263381958,
0.14766931533813477,
-0.059594202786684036,
0.018058648332953453,
0.023248950019478798,
0.0036677704192698,
-0.038663145154714584,
0.05093686655163765,
0.07642526924610138,
0.13084270060062408,
-0.057510439306497574,
0.07993458956480026,
-0.00528855761513114,
-0.09648048877716064,
-0.03070426546037197,
0.12046385556459427,
-0.005974611733108759,
0.018961863592267036,
-0.06711561232805252,
0.12644343078136444,
-0.043718259781599045,
-0.261628121137619,
0.05282887443900108,
-0.06905496120452881,
-0.14716462790966034,
-0.02855629473924637,
0.05909299477934837,
-0.00726199010387063,
0.02540661208331585,
0.06713409721851349,
-0.06904488801956177,
0.19428247213363647,
0.03470597416162491,
-0.044902503490448,
-0.06258992105722427,
0.07463990896940231,
-0.10928831994533539,
0.28889188170433044,
0.010627356357872486,
0.05702703818678856,
0.1010323017835617,
-0.02710605598986149,
-0.13230937719345093,
0.030603965744376183,
0.08569987118244171,
-0.08157077431678772,
0.049359869211912155,
0.2173999398946762,
-0.00799210648983717,
0.11221332848072052,
0.0741662085056305,
-0.09916665405035019,
0.052276816219091415,
-0.10220054537057877,
-0.09391136467456818,
-0.08265925943851471,
0.09803684055805206,
-0.05557653307914734,
0.14824360609054565,
0.12248145043849945,
-0.04785078391432762,
0.022196060046553612,
-0.022353654727339745,
0.04894673451781273,
0.006722010672092438,
0.12958186864852905,
0.013888917863368988,
-0.19708466529846191,
0.027539461851119995,
-0.004416270647197962,
0.09896787256002426,
-0.2124645709991455,
-0.10066045075654984,
0.05214649438858032,
0.00458158552646637,
-0.06152847036719322,
0.12505200505256653,
0.06458623707294464,
0.040626320987939835,
-0.045448239892721176,
-0.0330616720020771,
-0.008380461484193802,
0.1610291600227356,
-0.10901795327663422,
-0.004472559317946434
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
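Until the card is filled in, the following is a minimal sketch for loading a BERT sequence-classification checkpoint through the 🤗 Transformers pipeline API. The repository id comes from this card's metadata; the example sentence and the label names in the sample output are assumptions.
```python
# Minimal sketch (assumes the checkpoint is public and includes a classification head).
from transformers import pipeline

model_id = "tobiasmj97/test_bert"  # repo id from this card's metadata

classifier = pipeline("text-classification", model=model_id)

# Placeholder input; replace with text from the model's intended domain.
print(classifier("This is a sample sentence to classify."))
# Example output shape (labels are illustrative): [{'label': 'LABEL_0', 'score': 0.93}]
```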
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
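One way to fill in the fields above is to measure emissions during training rather than estimate them afterwards. The sketch below uses the third-party `codecarbon` package, which is an assumption on my part (it is not referenced by this card), and `train_model()` is a hypothetical stand-in for the actual fine-tuning loop.
```python
# Sketch of measuring emissions with codecarbon (pip install codecarbon).
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder for this model's actual fine-tuning loop.
    pass

tracker = EmissionsTracker(project_name="test_bert")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2eq for the tracked block

print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```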
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-classification | tobiasmj97/test_bert | [
"transformers",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-12T13:25:24+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
46,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06817419826984406,
0.1699906885623932,
-0.003845146857202053,
0.018365124240517616,
0.11478200554847717,
0.00763329304754734,
0.07986336201429367,
0.10738246887922287,
-0.0269484706223011,
0.1267213374376297,
0.03862300142645836,
0.1017010435461998,
0.11044707149267197,
0.18616852164268494,
0.002953584771603346,
-0.2117370218038559,
0.062315817922353745,
-0.11355884373188019,
0.01421935111284256,
0.12174045294523239,
0.14285145699977875,
-0.10472407191991806,
0.07340893894433975,
-0.03533155843615532,
-0.019184017553925514,
-0.029508300125598907,
-0.06138347089290619,
-0.062117863446474075,
0.06899366527795792,
0.06911981105804443,
0.06776530295610428,
0.02535320073366165,
0.07980640977621078,
-0.2927248775959015,
0.019224179908633232,
0.07704847306013107,
0.004596637096256018,
0.06310366839170456,
0.07900875061750412,
-0.06604467332363129,
0.12630145251750946,
-0.0469624362885952,
0.15577000379562378,
0.07483451068401337,
-0.09700790792703629,
-0.1833430528640747,
-0.07868417352437973,
0.08138132095336914,
0.1542958915233612,
0.0575118213891983,
-0.03566069155931473,
0.14360417425632477,
-0.0863327905535698,
0.015191552229225636,
0.06608161330223083,
-0.07603584229946136,
-0.05265629291534424,
0.04255614057183266,
0.07708034664392471,
0.09375373274087906,
-0.1291297972202301,
-0.010211804881691933,
0.04229271039366722,
0.01873886212706566,
0.10347303748130798,
0.02310175821185112,
0.11163661628961563,
0.026270611211657524,
-0.13941870629787445,
-0.06378244608640671,
0.1267201453447342,
0.02999917045235634,
-0.05697820335626602,
-0.23340454697608948,
-0.007031846325844526,
-0.028088124468922615,
-0.024382783100008965,
-0.03983099386096001,
0.03844287618994713,
-0.0294374767690897,
0.07875318825244904,
0.011917876079678535,
-0.07096433639526367,
-0.04893866181373596,
0.08819517493247986,
0.06123629957437515,
0.022971229627728462,
-0.02526908740401268,
0.02413375861942768,
0.11652170121669769,
0.09283795207738876,
-0.11929406225681305,
-0.06425759196281433,
-0.06432286649942398,
-0.08888134360313416,
-0.04847237840294838,
0.03574979677796364,
0.0754702165722847,
0.04938753694295883,
0.19765597581863403,
0.006366121117025614,
0.05646394565701485,
0.0260426327586174,
0.015338202007114887,
0.06355882436037064,
0.07606974244117737,
-0.0483609177172184,
-0.13532373309135437,
-0.041331104934215546,
0.11784996092319489,
0.007102925330400467,
-0.032494835555553436,
-0.03608081117272377,
0.06173410639166832,
0.05820438638329506,
0.1192656010389328,
0.06626396626234055,
0.019241811707615852,
-0.06749388575553894,
-0.03806937485933304,
0.1874811202287674,
-0.1540532261133194,
0.020778683945536613,
0.01720726117491722,
-0.05474008247256279,
-0.043989501893520355,
0.0171356238424778,
0.008756347931921482,
-0.02707439661026001,
0.10765543580055237,
-0.0681026354432106,
-0.03794260695576668,
-0.10775765031576157,
-0.057500679045915604,
0.032596319913864136,
-0.011795170605182648,
-0.030085675418376923,
-0.0443500280380249,
-0.1081358790397644,
-0.07622874528169632,
0.06656987965106964,
-0.06241556629538536,
-0.07165607810020447,
-0.03565853461623192,
-0.05456356331706047,
0.012712954543530941,
0.002376573858782649,
0.12743701040744781,
-0.02916865609586239,
0.04608776792883873,
-0.04567936435341835,
0.06814887374639511,
0.13260088860988617,
0.03273140639066696,
-0.07753180712461472,
0.0658058449625969,
-0.21566881239414215,
0.10687019675970078,
-0.09710393846035004,
0.030530039221048355,
-0.1602926403284073,
-0.027380328625440598,
0.025517668575048447,
0.035233598202466965,
-0.01142354216426611,
0.1405038684606552,
-0.18839864432811737,
-0.036833859980106354,
0.17594264447689056,
-0.13455410301685333,
-0.09238629788160324,
0.06278568506240845,
-0.057844966650009155,
0.12792403995990753,
0.05209182947874069,
-0.027332304045557976,
0.059202857315540314,
-0.13285812735557556,
-0.024411480873823166,
-0.0557100772857666,
-0.0024997375439852476,
0.1512058526277542,
0.06197551265358925,
-0.05537422001361847,
0.02062765136361122,
0.020016051828861237,
-0.024297641590237617,
-0.045233841985464096,
-0.034582652151584625,
-0.0977277010679245,
0.006374812684953213,
-0.07783913612365723,
0.015467152930796146,
-0.014978265389800072,
-0.08572793006896973,
-0.037934768944978714,
-0.15898989140987396,
-0.0011305080261081457,
0.09650373458862305,
0.007345336955040693,
-0.029424650594592094,
-0.09241348505020142,
0.005526319146156311,
0.014208783395588398,
-0.01407501008361578,
-0.15675009787082672,
-0.05031281337141991,
0.03119790367782116,
-0.16866113245487213,
0.033627450466156006,
-0.04903757572174072,
0.03549545630812645,
0.04459671676158905,
-0.04535774141550064,
-0.02160848118364811,
0.0152364457026124,
0.017460787668824196,
-0.02394135482609272,
-0.24046528339385986,
-0.016492176800966263,
-0.049182213842868805,
0.17930001020431519,
-0.24510087072849274,
0.04199686273932457,
0.062341514974832535,
0.12092601507902145,
0.005246761720627546,
-0.047405339777469635,
0.03611646965146065,
-0.04782456159591675,
-0.04614211246371269,
-0.06458985060453415,
-0.004041698761284351,
-0.03005247749388218,
-0.04619463160634041,
0.04105473682284355,
-0.19605930149555206,
-0.029964644461870193,
0.11028317362070084,
0.07146124541759491,
-0.1701718270778656,
-0.07740049809217453,
-0.03032514825463295,
-0.06061795726418495,
-0.09144899994134903,
-0.04754206910729408,
0.10501570999622345,
0.0424359068274498,
0.054926108568906784,
-0.07243066281080246,
-0.047703035175800323,
0.012159520760178566,
-0.008316845633089542,
-0.035265736281871796,
0.0910128578543663,
0.09147894382476807,
-0.1183665320277214,
0.1003284826874733,
0.06719938665628433,
0.061502620577812195,
0.10171586275100708,
0.005867301486432552,
-0.09559345990419388,
-0.012123096734285355,
0.023821083828806877,
0.014739413745701313,
0.13627171516418457,
-0.08041682839393616,
0.03041158802807331,
0.043761420994997025,
-0.03445654734969139,
0.011279189959168434,
-0.10341424494981766,
0.02347799763083458,
0.03186830133199692,
-0.007050554268062115,
0.025736309587955475,
-0.054652560502290726,
0.013161799870431423,
0.1042112186551094,
0.03211836516857147,
0.0227707140147686,
0.015011876821517944,
-0.03876445069909096,
-0.12403564900159836,
0.17888623476028442,
-0.09523385018110275,
-0.25718894600868225,
-0.12982366979122162,
0.0025806569028645754,
0.04723223298788071,
-0.01322246715426445,
0.01721704937517643,
-0.057064954191446304,
-0.10620168596506119,
-0.10562704503536224,
0.017637979239225388,
0.05363597348332405,
-0.08985256403684616,
-0.06360358744859695,
0.05353172495961189,
0.038684699684381485,
-0.12286891043186188,
0.023170825093984604,
0.04556644707918167,
-0.0685787945985794,
0.004107215907424688,
0.05788148567080498,
0.08483386784791946,
0.18220773339271545,
0.013182112947106361,
-0.017085859552025795,
0.012520790100097656,
0.22458304464817047,
-0.14599265158176422,
0.09336943179368973,
0.13670575618743896,
-0.0603153258562088,
0.08385994285345078,
0.20927630364894867,
0.031639765948057175,
-0.09247095137834549,
0.04077373072504997,
0.032938770949840546,
-0.040111273527145386,
-0.23512989282608032,
-0.07784179598093033,
0.0005755177116952837,
-0.07578593492507935,
0.10564399510622025,
0.09113350510597229,
0.11394096910953522,
0.05373004451394081,
-0.10628228634595871,
-0.06785868853330612,
0.04576247185468674,
0.11892180144786835,
-0.020387137308716774,
0.0034232554025948048,
0.09533460438251495,
-0.032669007778167725,
0.016892950981855392,
0.0903218612074852,
0.010076770558953285,
0.18146716058254242,
0.040793538093566895,
0.12895575165748596,
0.08216089755296707,
0.06404399126768112,
0.023877892643213272,
0.01690720207989216,
0.028041476383805275,
0.02853785827755928,
-0.021422842517495155,
-0.08959300816059113,
-0.01811058260500431,
0.14208537340164185,
0.03174193948507309,
0.030387144535779953,
0.009561240673065186,
-0.0344390794634819,
0.0656830444931984,
0.16341377794742584,
0.01373966783285141,
-0.23032663762569427,
-0.06265294551849365,
0.07538370788097382,
-0.07251506298780441,
-0.11472991853952408,
-0.007447437848895788,
0.029569825157523155,
-0.17949488759040833,
0.045079123228788376,
-0.02245110087096691,
0.1028464064002037,
-0.11004801839590073,
-0.024476202204823494,
0.04228143393993378,
0.06811302900314331,
-0.03619502857327461,
0.07936927676200867,
-0.21071307361125946,
0.14414268732070923,
0.0071875168941915035,
0.0627245381474495,
-0.10963346809148788,
0.08230046182870865,
0.02151823230087757,
0.009466269053518772,
0.16101586818695068,
-0.0074920570477843285,
-0.09318114817142487,
-0.07651645690202713,
-0.07556641101837158,
-0.011319656856358051,
0.09559466689825058,
-0.10184428840875626,
0.08486217260360718,
-0.008358954451978207,
-0.03313955292105675,
-0.00388424564152956,
-0.1140027567744255,
-0.13622364401817322,
-0.18601436913013458,
0.05523287504911423,
-0.11181046068668365,
0.03691478446125984,
-0.11166879534721375,
-0.06252610683441162,
-0.02911795862019062,
0.19807842373847961,
-0.1904531568288803,
-0.08140338957309723,
-0.14539870619773865,
-0.07204011082649231,
0.12212951481342316,
-0.04274967685341835,
0.07663191109895706,
0.00015701932716183364,
0.2071707546710968,
-0.004644640255719423,
0.0014644638868048787,
0.0856679305434227,
-0.09557735919952393,
-0.206184521317482,
-0.09439684450626373,
0.13821037113666534,
0.12497473508119583,
0.04596934840083122,
-0.0036321566440165043,
0.024304913356900215,
-0.0027867835015058517,
-0.10976199060678482,
0.02332260087132454,
0.1432444006204605,
0.08416087180376053,
0.03885705769062042,
-0.02675866149365902,
-0.14533737301826477,
-0.1054752767086029,
-0.05289754271507263,
0.019448768347501755,
0.17674845457077026,
-0.07222644239664078,
0.1607094258069992,
0.15837931632995605,
-0.06414622813463211,
-0.20734171569347382,
0.032782182097435,
0.03679283335804939,
-0.011663361452519894,
0.03244366869330406,
-0.20815548300743103,
0.07330463081598282,
0.016213007271289825,
-0.06075131520628929,
0.1363404095172882,
-0.1705039143562317,
-0.14891991019248962,
0.0919104814529419,
0.07189090549945831,
-0.2193969339132309,
-0.13394345343112946,
-0.09907522052526474,
-0.055755600333213806,
-0.10410746932029724,
0.08695419132709503,
0.014253350906074047,
0.004559517838060856,
0.040003977715969086,
0.024713784456253052,
0.021094202995300293,
-0.05303549766540527,
0.19554594159126282,
-0.004308625590056181,
0.041122131049633026,
-0.08143328875303268,
-0.08729361742734909,
0.030160382390022278,
-0.06146852299571037,
0.07429458200931549,
-0.02577015943825245,
0.004456855356693268,
-0.1102396696805954,
-0.06384536623954773,
-0.05289682373404503,
0.03639809414744377,
-0.08915901929140091,
-0.0958789587020874,
-0.05767008289694786,
0.10389325767755508,
0.08919540792703629,
-0.03324571251869202,
-0.058615610003471375,
-0.10058292001485825,
0.0726626068353653,
0.22699709236621857,
0.18807223439216614,
0.07284927368164062,
-0.07015843689441681,
0.0006279588560573757,
-0.022037893533706665,
0.05516184866428375,
-0.20622296631336212,
0.04608523100614548,
0.042553652077913284,
0.028887338936328888,
0.13527612388134003,
-0.02506665140390396,
-0.1602775603532791,
-0.04527048021554947,
0.06014934554696083,
-0.06545355916023254,
-0.1614707112312317,
-0.0005388054414652288,
0.09576781094074249,
-0.16179001331329346,
-0.06273222714662552,
0.024773813784122467,
-0.036137934774160385,
-0.025756290182471275,
0.0013679420808330178,
0.08270203322172165,
0.027825508266687393,
0.11478793621063232,
0.06896458566188812,
0.11150709539651871,
-0.10231363028287888,
0.08406093716621399,
0.09299708157777786,
-0.10971303284168243,
0.03247435390949249,
0.07298728823661804,
-0.0610542818903923,
-0.03390142321586609,
0.023122351616621017,
0.08364028483629227,
0.026266440749168396,
-0.0744837298989296,
-0.0008558011031709611,
-0.1099681630730629,
0.06663114577531815,
0.13796411454677582,
0.032853204756975174,
0.0030810926109552383,
0.04435998201370239,
0.025823330506682396,
-0.09881676733493805,
0.11186433583498001,
0.03916766867041588,
0.03720828518271446,
-0.04767070338129997,
0.004865953233093023,
0.041960928589105606,
-0.01269921287894249,
-0.016253290697932243,
-0.039693526923656464,
-0.06471271812915802,
-0.010708925314247608,
-0.15688052773475647,
0.031037067994475365,
-0.07176970690488815,
0.009115522727370262,
0.018755896016955376,
-0.033779606223106384,
0.0002807097043842077,
0.0073861307464540005,
-0.07919271290302277,
-0.03761441633105278,
-0.006646361667662859,
0.10705258697271347,
-0.15747743844985962,
0.008323745802044868,
0.08949586004018784,
-0.12556882202625275,
0.07766758650541306,
-0.007498627994209528,
-0.010838181711733341,
0.01879316382110119,
-0.14380721747875214,
0.06054820865392685,
-0.008177737705409527,
0.006405212916433811,
0.023949483409523964,
-0.20071232318878174,
0.005702852737158537,
-0.04664513096213341,
-0.053938448429107666,
-0.00976315326988697,
-0.04211960732936859,
-0.11404810100793839,
0.10492629557847977,
0.0196357611566782,
-0.0860515683889389,
-0.018402770161628723,
0.05309472978115082,
0.10592338442802429,
-0.057369641959667206,
0.1371336728334427,
-0.02283608354628086,
0.05825338885188103,
-0.17831756174564362,
-0.016339747235178947,
-0.017454219982028008,
0.012596609070897102,
-0.03102201037108898,
-0.008158523589372635,
0.05483707785606384,
-0.015072896145284176,
0.22714339196681976,
-0.021177595481276512,
0.030790245160460472,
0.06548503786325455,
0.0070373364724218845,
-0.013032838702201843,
0.08790382742881775,
0.04639120027422905,
0.021969040855765343,
0.017426103353500366,
0.016819516196846962,
-0.047575462609529495,
-0.019116412848234177,
-0.12834098935127258,
0.08396804332733154,
0.16439755260944366,
0.08264775574207306,
-0.005125291179865599,
0.053218428045511246,
-0.11920209228992462,
-0.08098750561475754,
0.10049403458833694,
-0.033211447298526764,
-0.001258186181075871,
-0.057700008153915405,
0.14298145473003387,
0.15607422590255737,
-0.1750815361738205,
0.06616412103176117,
-0.07047461718320847,
-0.05687202885746956,
-0.11070677638053894,
-0.17143365740776062,
-0.06694129854440689,
-0.03149404004216194,
-0.005430171266198158,
-0.06143372505903244,
0.06926561146974564,
0.10244123637676239,
0.008475886657834053,
0.002354414900764823,
0.08415096998214722,
-0.033749498426914215,
-0.0007962242234498262,
0.04344722256064415,
0.05283457785844803,
0.021373692899942398,
-0.06691429764032364,
0.0076249162666499615,
0.004598149098455906,
0.038937900215387344,
0.05476561188697815,
0.0317605659365654,
-0.014559607952833176,
0.011871086433529854,
-0.013089693151414394,
-0.09815122187137604,
0.03718226030468941,
-0.029980625957250595,
-0.0468674972653389,
0.14802806079387665,
0.01827765442430973,
0.0034919960889965296,
-0.021031659096479416,
0.23128560185432434,
-0.06903756409883499,
-0.0798255056142807,
-0.14009471237659454,
0.15071772038936615,
-0.04670744761824608,
0.05065378174185753,
0.04940982535481453,
-0.10087474435567856,
0.03407741338014603,
0.14691931009292603,
0.14527682960033417,
-0.02467990294098854,
0.007901503704488277,
0.011187983676791191,
0.0055741616524755955,
-0.025625228881835938,
0.05354921892285347,
0.04412171617150307,
0.12145667523145676,
-0.06669453531503677,
0.09297986328601837,
-0.007810541894286871,
-0.0844663754105568,
-0.02094031497836113,
0.1328510195016861,
0.0014671299140900373,
0.02338746376335621,
-0.0805477648973465,
0.11851188540458679,
-0.06559251248836517,
-0.25864502787590027,
0.061333827674388885,
-0.06666524708271027,
-0.15384668111801147,
-0.018917718902230263,
0.02399173192679882,
0.00401253392919898,
0.024401430040597916,
0.06268756836652756,
-0.06360985338687897,
0.14903949201107025,
0.03688151761889458,
-0.07834678888320923,
-0.07808853685855865,
0.07696148753166199,
-0.08397532254457474,
0.3018210828304291,
0.008228152059018612,
0.04951678216457367,
0.09650786966085434,
-0.03327273949980736,
-0.13361208140850067,
0.04569283500313759,
0.09728528559207916,
-0.06408768892288208,
0.06690182536840439,
0.19748380780220032,
-0.008177485316991806,
0.12026696652173996,
0.07469146698713303,
-0.08128973841667175,
0.057554539293050766,
-0.07613562047481537,
-0.09007242321968079,
-0.09192728251218796,
0.08888110518455505,
-0.060599785298109055,
0.15479759871959686,
0.13393908739089966,
-0.04440179467201233,
-0.001819826546125114,
-0.03071022778749466,
0.05197824910283089,
-0.002023093169555068,
0.1104598417878151,
0.022785736247897148,
-0.19388216733932495,
0.031831543892621994,
-0.014316190034151077,
0.0986877828836441,
-0.2479904145002365,
-0.07837841659784317,
0.0403057225048542,
-0.013808837160468102,
-0.05274871736764908,
0.12204353511333466,
0.052187733352184296,
0.04937480762600899,
-0.05449601635336876,
-0.057812657207250595,
-0.00025569170247763395,
0.16358551383018494,
-0.1094348207116127,
-0.00204258831217885
] |