This dataset is a processed version of an English Wikipedia dump (https://dumps.wikimedia.org/). Articles are ordered from most referenced to least referenced. Text between "{{" and "}}" template markers has been removed. The format of the CSV is:

| Article Name | Metadata | Mentions | Braceless Text |
|----------------------|---------------------------------|----------|------------------------------------------------------|
| United States | Timestamp: 2023-03-01T01:56:51Z | 404035.0 | `The '''United States of America''' ('''U.S.A.'...` |
| The New York Times | Timestamp: 2023-02-28T10:41:15Z | 267551.0 | `'''''The New York Times''''' ('''''the Times''...` |
| France | Timestamp: 2023-02-23T19:17:40Z | 238391.0 | `'''France''' ()<!-- Do not add English pronunc...` |
| Germany | Timestamp: 2023-02-25T19:46:11Z | 230283.0 | `'''Germany''', officially the '''Federal Repub...` |
| Billboard (magazine) | Timestamp: 2023-02-24T02:33:29Z | 201064.0 | `'''''Billboard''''' (stylized as '''billboard'...` |

---
license: cc-by-sa-3.0
https://creativecommons.org/licenses/by-sa/3.0/
---
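As a rough illustration of the processing described above (not the exact pipeline used to build this dataset), the sketch below loads the CSV with pandas and reproduces the brace stripping by repeatedly removing the innermost `{{...}}` pair. The filename `wikipedia_mentions.csv` is a placeholder; only the column names come from the table above, and a dedicated wikitext parser such as mwparserfromhell would handle malformed markup more robustly.

```python
# Sketch only: load the CSV described above and strip {{...}} template markup.
import re

import pandas as pd

TEMPLATE = re.compile(r"\{\{[^{}]*\}\}")  # matches an innermost {{...}} pair

def strip_templates(wikitext: str) -> str:
    """Remove {{...}} spans, peeling one nesting level per pass until none remain."""
    previous = None
    while previous != wikitext:
        previous = wikitext
        wikitext = TEMPLATE.sub("", wikitext)
    return wikitext

# "wikipedia_mentions.csv" is a placeholder name; the card only specifies the columns.
df = pd.read_csv("wikipedia_mentions.csv")
df = df.sort_values("Mentions", ascending=False)  # most-referenced articles first
print(df[["Article Name", "Mentions"]].head())
print(strip_templates("{{Infobox country|name={{nowrap|USA}}}}The '''United States'''"))
```

Here is a single sample for the 0th item, the most-referenced article:

```
The '''United States of America''' ('''U.S.A.''' or '''USA'''), commonly known as the '''United States''' ('''U.S.''' or '''US''') or '''America''', is a country [[Continental United States|primarily located]] in [[North America]]. It consists of 50 [[U.S. state|states]], a [[Washington, D.C.|federal district]], five major [[unincorporated territories]], nine [[United States Minor Outlying Islands|Minor Outlying Islands]], and 326 [[Indian reservation]]s. The United States is also in [[Compact of Free Association|free association]] with three [[Oceania|Pacific Island]] [[Sovereign state|sovereign states]]: the [[Federated States of Micronesia]], the [[Marshall Islands]], and the [[Palau|Republic of Palau]]. It is the world's [[List of countries and dependencies by area|third-largest country]] by both land and total area. It shares land borders [[Canada–United States border|with Canada]] to its north and [[Mexico–United States border|with Mexico]] to its south and has maritime borders with the [[Bahamas]], [[Cuba]], [[Russia]], and other nations. With a population of over 333 million, it is the [[List of countries in the Americas by population|most populous]] country in the [[Americas]] and the [[List of countries and dependencies by population|third most populous]] in the world. The national capital of the United States is [[Washington, D.C.]] and its [[List of United States cities by population|most populous]] city and principal [[Financial centre|financial center]] is [[New York City]]. <!-- History --> [[Paleo-Indians|Paleo-Americans]] [[Settlement of the Americas|migrated from Siberia]] to the North American mainland at least 12,000 years ago, and are the ancestors of modern [[Native Americans in the United States|Native Americans]]. [[Kingdom of Great Britain|Great Britain]]'s [[Thirteen Colonies]] quarreled with the British Crown over taxation and [[No taxation without representation|political representation]], leading to the [[American Revolution]] (1765–1791). After the Revolution, the United States gained independence, the first [[Nation state|nation-state]] founded on [[American Enlightenment|Enlightenment]] principles of [[liberal democracy]]. In the late 18th century, the U.S.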
began expanding across North America, gradually [[Territorial evolution of the United States|obtaining new territories]], sometimes through war, frequently [[Indian removal|displacing Native Americans]], and admitting new states. By 1848, the United States spanned the [[continent]] from east to west. The controversy surrounding the practice of [[Slavery in the United States|slavery]] culminated in the secession of the [[Confederate States of America]], which fought the remaining states of the [[Union (American Civil War)|Union]] during the [[American Civil War]] (1861–1865). With the Union's victory and preservation, slavery was abolished by the [[Thirteenth Amendment to the United States Constitution|Thirteenth Amendment]]. By 1890, the United States had grown to become the world's largest economy, and the [[Spanish–American War]] and [[World War I]] established the country as a [[Great power|world power]]. After Japan's [[Attack on Pearl Harbor|surprise attack on Pearl Harbor]] in 1941, the U.S. entered [[World War II]] on the [[Allies of World War II|Allied side]]. The aftermath of the war left the United States and the [[Soviet Union]] as the world's two [[superpower]]s and led to the [[Cold War]], which commenced in 1945 and ended in 1991 with the [[Dissolution of the Soviet Union|Soviet Union's dissolution]]. During the Cold War, both countries engaged in a struggle for ideological dominance but avoided direct military conflict. They also competed in the [[Space Race]], which culminated in the [[Apollo 11|1969 American spaceflight]] in which the U.S. was the first nation to land humans on the [[Moon]]. Simultaneously, the [[civil rights movement]] (1954–1968) led to legislation abolishing state and local [[Jim Crow laws]] and other codified racial discrimination against [[African Americans]]. With the Soviet Union's dissolution in 1991 and the end of the Cold War, the United States emerged as the world's sole superpower. In 2001, following the [[September 11 attacks]], the United States became a lead member of the [[War on terror|Global War on Terrorism]], which included the [[War in Afghanistan (2001–2021)|War in Afghanistan]] (2001–2021) and the [[Iraq War]] (2003–2011). <!-- Government and citizens --> The United States is a [[Federal government of the United States|federal republic]] with [[Separation of powers under the United States Constitution|three separate branches of government]], including a [[Bicameralism|bicameral legislature]]. It is a liberal democracy and has a [[market economy]]. It [[International rankings of the United States|ranks very high]] in international measures of [[Human Development Index|quality of life]], [[Income in the United States|income]] and [[Affluence in the United States|wealth]], [[Global Competitiveness Report#2022 rankings|economic competitiveness]], [[Human rights in the United States|human rights]], [[Global Innovation Index|innovation]], and [[Education in the United States|education]]; it has low levels of [[Corruption Perceptions Index|perceived corruption]]. The United States has the highest [[Median income|median income per person]] of any [[polity]] in the world, although it has high levels of [[United States incarceration rate|incarceration]] and [[Inequality in the United States|inequality]] and lacks [[universal health care]]. As a [[melting pot]] of [[Culture of the United States|cultures]] and [[Race and ethnicity in the United States|ethnicities]], the U.S. 
has been shaped by [[History of immigration to the United States|centuries of immigration]]. <!-- Economy and global perspective --> The United States is a highly [[developed country]], and [[Economy of the United States|its economy]] accounts for approximately a quarter of global [[gross domestic product|GDP]] and is the world's [[List of countries by GDP (nominal)|largest]] by GDP at market exchange rates. By value, the United States is the world's [[List of countries by imports|largest]] importer and [[List of countries by exports|second-largest]] exporter. Although it accounts for just over 4.2% of the world's total population, the U.S. holds [[List of countries by total wealth|over 30%]] of the total wealth in the world, the largest share held by any country. The United States is a founding member of the [[United Nations]], [[World Bank]], [[International Monetary Fund]], [[Organization of American States]], and [[NATO]], and is a [[Permanent members of the United Nations Security Council|permanent member]] of the [[United Nations Security Council]]. The country is responsible for more than a third of [[List of countries by military expenditures|global military spending]] and is the [[United States Armed Forces|foremost military power]] in the world and a leading [[Politics of the United States|political]], cultural, and [[Science and technology in the United States|scientific]] force. ==Etymology== <!-- linked --> The first known use of the name "[[Americas|America]]" dates to 1507, when it appeared on [[Waldseemüller map|a world map]] produced by the German cartographer [[Martin Waldseemüller]] in [[Saint-Dié-des-Vosges|Saint Dié]], [[Duchy of Lorraine|Lorraine]] (now northeastern France). On his map, the name is shown in large letters on what would now be considered [[South America]], honoring [[Amerigo Vespucci]]. The Italian explorer was the first to postulate that the [[West Indies]] did not represent Asia's eastern limit but were part of a previously unknown landmass.<ref></ref> In 1538, the Flemish cartographer [[Gerardus Mercator]] used the name "America" to refer to the entire [[Western Hemisphere]].<ref name="Cohen"></ref> The first documentary evidence of the phrase "United States of America" dates back to a letter from January 2, 1776, written by [[Stephen Moylan]] to [[Joseph Reed (politician)|Joseph Reed]], George Washington's [[aide-de-camp]]. Moylan expressed his wish to go "with full and ample powers from the United States of America to Spain" to seek assistance in the revolutionary war effort.<ref>DeLear, Byron (July 4, 2013) [https://www.csmonitor.com/USA/Politics/2013/0704/Who-coined-United-States-of-America-Mystery-might-have-intriguing-answer Who coined 'United States of America'? Mystery might have intriguing answer.] "Historians have long tried to pinpoint exactly when the name 'United States of America' was first used and by whom&nbsp;... This latest find comes in a letter that Stephen Moylan, Esq., wrote to Col. Joseph Reed from the Continental Army Headquarters in Cambridge, Mass., during the siege of Boston. The two men lived with Washington in Cambridge, with Reed serving as Washington's favorite military secretary and Moylan fulfilling the role during Reed's absence." ''Christian Science Monitor'' (Boston, MA).</ref><ref>Touba, Mariam (November 5, 2014) [https://blog.nyhistory.org/coined-phrase-united-states-america-may-never-guess/ Who Coined the Phrase 'United States of America'? 
You May Never Guess] "Here, on January 2, 1776, seven months before the Declaration of Independence and a week before the publication of Paine's ''Common Sense'', Stephen Moylan, an acting secretary to General George Washington, spells it out, 'I should like vastly to go with full and ample powers from the United States of America to Spain' to seek foreign assistance for the cause." ''New-York Historical Society Museum & Library''</ref><ref>Fay, John (July 15, 2016) [https://www.irishcentral.com/roots/history/The-forgotten-Irishman-who-named-the-United-States-of-America.html The forgotten Irishman who named the 'United States of America'] "According to the NY Historical Society, Stephen Moylan was the man responsible for the earliest documented use of the phrase 'United States of America'. But who was Stephen Moylan?" ''IrishCentral.com''</ref> The first known publication of the phrase "United States of America" was in an anonymous essay in ''[[The Virginia Gazette]]'' newspaper in [[Williamsburg, Virginia|Williamsburg]], on April&nbsp;6, 1776.<ref></ref> The second draft of the [[Articles of Confederation]] and [[Perpetual Union]], prepared by [[John Dickinson]] and completed no later than June&nbsp;17, 1776, declared "The name of this Confederation shall be the 'United States of America'." The final version of the Articles, sent to the states for ratification in late 1777, stated that "The Stile of this Confederacy shall be 'The United States of America'." In June 1776, [[Thomas Jefferson]] wrote the phrase "UNITED STATES OF AMERICA" in the headline of his "original Rough draught" of the [[United States Declaration of Independence|Declaration of Independence]]. This draft of the document did not surface until June&nbsp;21, 1776, and it is unclear whether it was written before or after Dickinson used the term in his June 17 draft of the Articles of Confederation. The phrase "United States" was originally plural in American usage. It described a collection of states—e.g., "the United States are..." The singular form became popular after the end of the Civil War and is now standard usage. A [[Citizenship of the United States|citizen of the United States]] is called an "[[Americans|American]]". "United States", "American", and "U.S." refer to the country adjectivally ("American values", "U.S.&nbsp;forces"). In English, the word "American" rarely refers to topics or subjects not directly connected with the United States.<ref></ref> ==History== ===Early history=== [[File:Extreme_Makeover,_Mesa_Verde_Edition_-_panoramio.jpg|thumb|right|[[Cliff Palace]], located in present-day [[Montezuma County, Colorado|Colorado]], was built by the [[Ancestral Puebloans]] between AD 1190 and 1260.|alt=Aerial view of the Cliff Palace]] It is generally accepted that the [[Paleo-indians|first inhabitants of North America]] migrated from [[Siberia]] by way of the [[Bering land bridge]] and arrived at least 12,000 years ago; however, some evidence suggests an even earlier date of arrival. The [[Clovis culture]], which appeared around 11,000 BC, is believed to represent the first wave of human settlement of the Americas. This was likely the first of three major waves of migration into North America; later waves brought the ancestors of present-day [[Alaskan Athabaskans|Athabaskans]], [[Aleut]]s, and [[Eskimo]]s. 
Over time, indigenous cultures in North America grew increasingly sophisticated, and some, such as the pre-Columbian [[Mississippian culture]] in the southeast, developed advanced [[agriculture]], [[architecture]], and complex societies. The city-state of [[Cahokia]] is the largest, most complex pre-Columbian [[archaeological site]] in the modern-day United States. In the [[Four Corners]] region, [[Ancestral Puebloan]] culture developed from centuries of agricultural experimentation. The [[Algonquian peoples|Algonquian]] are one of the most populous and widespread [[North America]]n [[Indigenous peoples of the Americas|native]] language groups. This grouping consists of the peoples who speak [[Algonquian languages]].<ref></ref> Historically, these peoples were prominent along the Atlantic Coast and into the interior along the [[Saint Lawrence River]] and around the [[Great Lakes]]. Before Europeans came into contact, most Algonquian settlements lived by hunting and fishing, although many supplemented their diet by cultivating [[maize|corn]], [[bean]]s and [[Cucurbita|squash]] (the "[[Three Sisters (agriculture)|Three Sisters"]]). The [[Ojibwe]] cultivated [[wild rice]].<ref></ref> The [[Haudenosaunee]] confederation of the [[Iroquoian peoples|Iroquois]], located in the southern Great Lakes region, was established at some point between the twelfth and fifteenth centuries.<ref name="Dean Snow"></ref> [[Population history of Indigenous peoples of the Americas|Estimating the native population of North America]] during European contact is difficult. [[Douglas H. Ubelaker]] of the [[Smithsonian Institution]] estimated a population of 93,000 in the [[South Atlantic states]] and a population of 473,000 in the Gulf states, but most academics regard this figure as too low. Anthropologist [[Henry F. Dobyns]] believed the populations were much higher, suggesting around 1.1 million along the shores of the [[Gulf of Mexico]], 2.2 million people living between [[Florida]] and [[Massachusetts]], 5.2 million in the [[Mississippi Valley]] and tributaries, and around 700,000 people in the [[Florida peninsula]]. ===Colonial America=== [[File:The_Mayflower_Compact_1620_cph.3g07155.jpg|thumb|left|The [[Mayflower Compact]] signed on the ''[[Mayflower]]'' in 1620 set an early precedent for [[self-government]] and [[constitutionalism]].]] Claims of very early colonization of [[New England#Geography|coastal New England]] by the [[Norse colonization of North America|Norse]] are disputed and controversial.<ref></ref><ref></ref> [[Christopher Columbus]] had landed in [[Puerto Rico]] on his [[Columbus's second voyage|1493 voyage]], and [[San Juan, Puerto Rico|San Juan]] was settled by the Spanish a decade later.<ref name="Operé2008"></ref> The first documented arrival of Europeans in the continental United States is that of Spanish [[conquistador]]s such as [[Juan Ponce de León]], who made his first expedition to [[Spanish Florida|Florida]] in 1513.<ref></ref> The Italian explorer [[Giovanni da Verrazzano]], sent by France to the New World in 1525, encountered native inhabitants of what is now [[New York Bay]].<ref></ref> The Spanish set up the first settlements in Florida and New Mexico, such as [[St. Augustine, Florida|Saint Augustine]], often considered the nation's oldest city,<ref></ref> and [[Santa Fe, New Mexico|Santa Fe]]. 
The French [[New France|established]] their own settlements along the [[Mississippi River]] and [[Gulf of Mexico]], notably [[New Orleans]] and [[Mobile, Alabama|Mobile]].<ref name="Petto2007"></ref> Successful [[British colonization of the Americas|English settlement]] of the eastern coast of North America began with the [[Colony of Virginia|Virginia Colony]] in 1607 at [[Jamestown, Virginia|Jamestown]] and with the [[Pilgrims (Plymouth Colony)|Pilgrims]]' [[Plymouth Colony|colony at Plymouth]] in 1620.<ref name="Jr.Selby2018"></ref><ref name="BellahSullivan2006"></ref> The continent's first elected legislative assembly, Virginia's [[House of Burgesses]], was founded in 1619. [[Harvard College]] was established in the [[Massachusetts Bay Colony]] in 1636 as the first institution of higher education. The [[Mayflower Compact]] and the [[Fundamental Orders of Connecticut]] established precedents for representative self-government and constitutionalism that would develop throughout the American colonies.<ref name="Remini2–3"></ref><ref name="Johnson26–30"></ref> Many English settlers were [[English Dissenters|dissenting Christians]] who came seeking [[Freedom of religion|religious freedom]]. The [[Population history of indigenous peoples of the Americas|native population of America declined]] after European arrival for various reasons,<ref></ref><ref></ref><ref>[[#Stannard|Stannard, 1993]] p. [[iarchive:americanholocaus00stan|xii]]</ref> primarily from diseases such as [[smallpox]] and [[measles]].<ref>"''[https://books.google.com/books?id=qubTdDk1H3IC&pg=PA205 The Cambridge encyclopedia of human paleopathology] ''". Arthur C. Aufderheide, Conrado Rodríguez-Martín, Odin Langsjoen (1998). [[Cambridge University Press]]. p. 205. </ref><ref>[[#Bianchine|Bianchine, Russo, 1992]] pp. 225–232</ref>[[File:Map of territorial growth 1775.svg|upright|thumb|The original [[Thirteen Colonies]] (shown in red) in 1775|alt=Map of the U.S. showing the original Thirteen Colonies along the eastern seaboard]] In the early days of colonization, many European settlers experienced food shortages, disease, and conflicts with [[Indigenous peoples of the Americas|Native Americans]], such as in [[King Philip's War]]. Native Americans were also often fighting neighboring tribes and European settlers. In many cases, however, the natives and settlers came to depend on each other. Settlers [[Columbian exchange|traded]] for food and animal pelts; natives for guns, tools and other European goods.<ref>[[#Ripper2008|Ripper, 2008]] p. 6</ref> Natives taught many settlers to cultivate corn, beans, and other foodstuffs. European missionaries and others felt it was important to "civilize" the Native Americans and urged them to adopt European agricultural practices and lifestyles.<ref>[[#Ripper2008|Ripper, 2008]] p. 5</ref><ref>[[#Calloway1998|Calloway, 1998]], p. 55</ref> However, with the increased European [[settler colonialism|colonization]] of North America, [[Native Americans in the United States|Native Americans]] were displaced and often killed during conflicts. European settlers also began [[human trafficking|trafficking]] [[Slavery in the colonial United States|African slaves]] into Colonial America via the [[Atlantic slave trade|transatlantic slave trade]].<ref></ref> By the turn of the 18th century, slavery had supplanted [[indentured servitude]] as the main source of agricultural labor for the [[cash crop]]s in the American South.<ref name="Quirk2011">[[#Quirk|Quirk, 2011]], p. 
195</ref> Colonial society was divided over the religious and moral implications of slavery, and several colonies passed acts for or against the practice.<ref name="Lien522">[[#Lien|Lien, 1913]], p. 522</ref><ref name="Davis7">[[#Davis96|Davis, 1996]], p. 7</ref> The [[Thirteen Colonies]] that would become the United States of America were administered by the British as overseas dependencies.<ref name="BilhartzElliott2007"></ref> [[Colonial government in the Thirteen Colonies|All nonetheless had local governments]] with elections open to most free men.<ref name="Wood1998"></ref> With very high birth rates, low death rates, and steady settlement, the colonial population grew rapidly, eclipsing Native American populations.<ref>[[#Walton|Walton, 2009]], pp. 38–39</ref> The [[Christian revival]]ist movement of the 1730s and 1740s known as the [[First Great Awakening|Great Awakening]] fueled interest both in religion and in religious liberty.<ref></ref> During the [[Seven Years' War]] (1756–1763), known in the U.S. as the [[French and Indian War]], British forces captured Canada from the French. With the creation of the [[Province of Quebec (1763–1791)|Province of Quebec]], Canada's [[French language|francophone]] population would remain isolated from the English-speaking colonial dependencies of [[Nova Scotia]], [[Newfoundland Colony|Newfoundland]] and the Thirteen Colonies. Excluding the Native Americans who lived there, the Thirteen Colonies had a population of over 2.1&nbsp;million in 1770, about a third that of Britain. Despite continuing new arrivals, the rate of natural increase was such that by the 1770s only a small minority of Americans had been born overseas.<ref>[[#Walton|Walton, 2009]], p. 35</ref> The colonies' distance from Britain had allowed the development of self-government, but their unprecedented success motivated British monarchs to periodically seek to reassert royal authority.<ref></ref> ===American Revolution and the early federal republic=== [[File:Declaration independence.jpg|thumb|left|''[[Declaration of Independence (Trumbull)|Declaration of Independence]]'', a painting by [[John Trumbull]], depicts the [[Committee of Five]] presenting the draft of the [[United States Declaration of Independence|Declaration]] to the [[Second Continental Congress|Continental Congress]], June 28, 1776, in [[Philadelphia]].|alt=See caption]] The [[American Revolution]] separated the Thirteen Colonies from the [[British Empire]], and was the first successful [[war of independence]] by a non-European entity against a European power in [[modern history]]. By the 18th century the [[American Enlightenment]] and [[Liberalism in the United States|the political philosophies of liberalism]] were pervasive among leaders. Americans began to develop an ideology of "[[Republicanism in the United States|republicanism]]", asserting that government rested on the [[consent of the governed]]. They demanded their "[[Rights of Englishmen|rights as Englishmen]]" and "[[no taxation without representation]]".<ref></ref><ref>[https://books.google.com/books?id=OBvNHl6UYKsC&pg=PA126&dq=found+little+support+Recreating+the+American+Republic++By+Charles+A.+Kromkowski&hl=en&sa=X&ei=o-ZtUc7wBMSA0AWRzoDgAw&ved=0CDAQ6AEwAA#v=onepage&q=found%20little%20support%20Recreating%20the%20American%20Republic%20%20By%20Charles%20A.%20Kromkowski&f=false Recreating the American Republic – Charles A. Kromkowski]. 
Retrieved on 2013-07-15.</ref> The British insisted on administering the colonies through a [[Parliament of Great Britain|Parliament]] that did not have a single representative responsible for any American constituency, and the conflict escalated into war.<ref name="Humphrey2003"></ref> In 1774, the [[First Continental Congress]] passed the [[Continental Association]], which mandated a [[Thirteen Colonies|colonies-wide]] boycott of British goods. The [[American Revolutionary War]] began the following year, catalyzed by events like the [[Stamp Act 1765|Stamp Act]] and the [[Boston Tea Party]] that were rooted in colonial disagreement with British governance.<ref></ref><ref></ref> The [[Second Continental Congress]], an assembly representing the [[United Colonies]], unanimously adopted the [[United States Declaration of Independence|Declaration of Independence]] on July&nbsp;4, 1776 (annually celebrated as [[Independence Day (United States)|Independence Day]]).<ref name="YoungNash2011" /> In 1781, the [[Articles of Confederation]] and [[Perpetual Union]] established a decentralized government that operated until 1789.<ref name="YoungNash2011"></ref> A celebrated early turn in the war for the Americans was [[George Washington]] leading the Americans to [[George Washington's crossing of the Delaware River|cross the frozen Delaware River]] in a surprise attack the night of December 25–26, 1776. Another victory, in 1777, at the [[Battles of Saratoga|Battle of Saratoga]] resulted in the capture of a British army, and led to [[France in the American Revolutionary War|France]] and [[Spain in the American Revolutionary War|Spain]] joining in the war against them. After the surrender of a second British army at the [[Siege of Yorktown (1781)|siege of Yorktown]] in 1781, Britain signed a [[Treaty of Paris (1783)|peace treaty]]. American sovereignty became internationally recognized, and the new nation took possession of substantial territory east of the [[Mississippi River]], from what is today [[Canada]] in the north and [[Florida]] in the south.<ref></ref> As it became increasingly apparent that the Confederation was insufficient to govern the new country, [[Nationalism|nationalists]] advocated for and led the [[Constitutional Convention (United States)|Philadelphia Convention]] of 1787 in writing the [[United States Constitution]] to replace it, [[Ratification of the United States Constitution|ratified]] in state conventions in 1788. Going into force in 1789, this constitution reorganized the government into a [[federation]] administered by [[Separation of powers|three equal branches]] (executive, judicial and legislative), on the principle of creating salutary [[checks and balances]]. George Washington, who had led the [[Continental Army]] to victory and then willingly relinquished power, was the first [[President of the United States|president]] elected under the new constitution. The [[United States Bill of Rights|Bill of Rights]], forbidding federal restriction of [[Natural and legal rights|personal freedoms]] and guaranteeing a range of legal protections, was adopted in 1791.<ref name="BoyerJr.2007">[[#Boyer|Boyer, 2007]], pp. 
192–193</ref> [[Origins of the War of 1812|Tensions with Britain remained]], however, leading to the [[War of 1812]], which was fought to a draw.<ref name="Wait1999"></ref> Although the federal government [[Act Prohibiting Importation of Slaves|outlawed]] American participation in the [[Atlantic slave trade]] in 1807, after 1820, cultivation of the highly profitable cotton crop exploded in the [[Deep South]], and along with it, the use of [[Slavery in the United States|slave labor]].<ref name="Cogliano2008"></ref><ref>[[#Walton|Walton, 2009]], p. 43</ref><ref>[[#Gordon|Gordon, 2004]], pp. 27,29</ref> The [[Second Great Awakening]], especially in the period 1800–1840, converted millions to [[Evangelicalism in the United States|evangelical]] Protestantism. In the North, it energized multiple social reform movements, including [[Abolitionism in the United States|abolitionism]];<ref name="Clark2012iu"></ref> in the South, [[History of Methodism in the United States|Methodists]] and [[Baptists in the United States|Baptists]] proselytized among slave populations.<ref>Heinemann, Ronald L., et al., Old Dominion, New Commonwealth: a history of Virginia 1607–2007, 2007 , p. 197</ref> [[File:U.S. Territorial Acquisitions.png|alt=Map of the U.S. depicting its westward expansion|thumb|[[Territorial evolution of the United States|Territorial acquisitions of the United States]] between 1783 and 1917]] In the late 18th century, American settlers began to [[Territorial evolution of the United States|expand further westward]], some of them with a sense of [[manifest destiny]].<ref name="MD2007" /><ref name="Morrison1999" /> The 1803 [[Louisiana Purchase]] almost doubled the nation's area,<ref></ref> [[Adams–Onís Treaty|Spain ceded Florida]] and other Gulf Coast territory in 1819,<ref name="KloseJones1994"></ref> the [[Republic of Texas]] was [[Texas annexation|annexed]] in 1845 during a period of expansionism,<ref name="Morrison1999"></ref> and the 1846 [[Oregon Treaty]] with Britain led to U.S. control of the present-day [[Northwestern United States|American Northwest]].<ref name="Kemp2010"></ref> Additionally, the [[Trail of Tears]] in the 1830s exemplified the [[Indian Removal Act|Indian removal policy]] that forcibly resettled Indians. This further expanded acreage under mechanical cultivation, increasing surpluses for international markets. This prompted a long series of [[American Indian Wars]] west of the [[Mississippi River]] from 1810 to at least 1890.<ref></ref> and eventually, conflict with Mexico.<ref name="BillingtonRidge2001j"></ref> Most of these conflicts ended with the cession of Native American territory and their confinement to [[Indian reservation]]s. Victory in the [[Mexican–American War]] resulted in the 1848 [[Mexican Cession]] of [[California]] and much of the present-day [[Southwestern United States|American Southwest]], and the U.S. 
spanned the continent.<ref name="MD2007"></ref><ref name="McIlwraithMuller2001"></ref> The [[California Gold Rush]] of 1848–1849 spurred migration to the Pacific coast, which led to the [[California Genocide]]<ref></ref> and the creation of additional western states.<ref name="Rawls1999"></ref> Economic development was spurred by giving vast quantities of land, nearly 10% of the total area of the United States, to white European settlers as part of the [[Homestead Acts]], as well as making [[land grants]] to private railroad companies and [[Land-grant university|colleges]].<ref>Paul Frymer, "Building an American Empire: The Era of Territorial and Political Expansion," (Princeton: Princeton University Press, 2017)</ref> Prior to the Civil War, [[Slave states and free states|the prohibition or expansion of slavery into these territories]] exacerbated tensions over [[Origins of the American Civil War|the debate around abolitionism]]. ===The Civil War and Reconstruction=== [[File:US Secession map 1861.svg|thumb| |alt=Map of U.S. showing two kinds of Union states, two phases of secession and territories]] Irreconcilable sectional conflict regarding [[Slavery in the United States|the enslavement]] of Africans and [[African Americans]] ultimately [[Origins of the American Civil War|led to the American Civil War]].<ref><br /></ref> With the [[1860 United States presidential election|1860 election]] of Republican [[Abraham Lincoln]], conventions in eleven slave states declared [[secession]] and formed the [[Confederate States of America]], while the federal government (the "[[Union (American Civil War)|Union]]") maintained that [[Perpetual Union|secession was unconstitutional and illegitimate]].<ref></ref> On April 12, 1861, the Confederacy initiated military conflict by [[Battle of Fort Sumter|bombarding Fort Sumter]], a federal garrison in [[Charleston, South Carolina|Charleston harbor]], South Carolina. This would be the spark of the Civil War, which lasted for four years (1861–1865) and became the deadliest military conflict in American history. The war would result in the deaths of approximately 620,000 soldiers from both sides and upwards of 50,000 civilians, almost all of them in the South.<ref></ref> [[Reconstruction (United States)|Reconstruction]] began in earnest following the war. While President Lincoln attempted to foster friendship and forgiveness between the Union and the former Confederacy, [[Assassination of Abraham Lincoln|his assassination]] on April&nbsp;14, 1865 drove a wedge between North and South again. Republicans in the federal government made it their goal to oversee the rebuilding of the South and to ensure the rights of African Americans. They persisted until the [[Compromise of 1877]], when the Republicans agreed to cease protecting the rights of African Americans in the South in order for Democrats to concede the [[1876 United States presidential election|presidential election of 1876]]. Influential Southern whites, calling themselves "[[Redeemers]]", took control of the South after the end of Reconstruction, beginning the [[nadir of American race relations]]. From 1890 to 1910, the Redeemers established so-called [[Jim Crow laws]], [[Disenfranchisement after the Reconstruction Era|disenfranchising]] almost all blacks and some impoverished whites throughout the region. 
Blacks would face [[Racial segregation in the United States|racial segregation]] nationwide, especially in the South.<ref></ref> They also lived under constant threat of vigilante violence, including [[Lynching in the United States|lynching]].<ref></ref> ===Industrial Age and the Progressive Era=== [[File:Emigrants (i.e. immigrants) landing at Ellis Island -.webm|thumb|left|Film by [[Edison Studios]] showing immigrants at [[Ellis Island]] in [[New York Harbor]], that was a major entry point for European [[immigration to the United States|immigration into the U.S.]]<ref name="PriceBenton-Short2008"></ref>]] In the North, [[urbanization]] and an unprecedented [[History of immigration to the United States|influx of immigrants]] from [[Southern Europe|Southern]] and [[Eastern Europe]] supplied a surplus of labor for the country's industrialization and transformed its culture.<ref name="Powell2009qwet"></ref> National infrastructure, including [[First Transcontinental Telegraph|telegraph]] and [[First transcontinental railroad|transcontinental railroads]], spurred economic growth and greater settlement and development of the [[American frontier|American Old West]]. After the [[American Civil War]], new transcontinental [[Rail transportation in the United States#History|railways]] made relocation easier for settlers, expanded internal trade, and increased conflicts with Native Americans.<ref name="Black2011kj"></ref> The later inventions of [[Incandescent light bulb|electric light]] and the [[telephone]] would also affect communication and urban life. Mainland expansion also included the [[Alaska Purchase|purchase of Alaska]] from [[Russian Empire|Russia]] in 1867.<ref></ref> In 1893, pro-American elements in Hawaii [[Overthrow of the Kingdom of Hawaii|overthrew]] the [[Kingdom of Hawaii|Hawaiian monarchy]] and formed the [[Republic of Hawaii]], which the U.S. [[Newlands Resolution|annexed]] in 1898. Puerto Rico, [[Guam]], and the [[Philippines]] were ceded by Spain in the same year, following the [[Spanish–American War]].<ref></ref> [[American Samoa]] was acquired by the United States in 1900 after the end of the [[Second Samoan Civil War]].<ref>Ryden, George Herbert. ''The Foreign Policy of the United States in Relation to Samoa''. New York: Octagon Books, 1975.</ref> The [[United States Virgin Islands|U.S. Virgin Islands]] were purchased from [[Denmark]] in 1917.<ref></ref> [[Gilded Age|Rapid economic development]] during the late 19th and early 20th centuries fostered the rise of many prominent industrialists. [[Business magnate|Tycoons]] like [[Cornelius Vanderbilt]], [[John D. Rockefeller]], and [[Andrew Carnegie]] led the nation's progress in the [[Railways|railroad]], [[Petroleum industry|petroleum]], and [[History of the steel industry (1850–1970)|steel]] industries. Banking became a major part of the economy, with [[J. P. Morgan]] playing a notable role. The American economy boomed, becoming the world's largest.<ref></ref> These dramatic changes were accompanied by huge increases in [[Effects of immigration to the United States|immigration]], [[Economic inequality|growing inequality]] and [[List of incidents of civil unrest in the United States|social unrest]], which prompted the rise of [[Labor history of the United States|organized labor]] along with [[Populism in the United States|populist]], [[History of the socialist movement in the United States|socialist]], and [[Anarchism in the United States|anarchist]] movements.<ref>[[#Zinn|Zinn, 2005]], pp. 
321–357</ref> This period eventually ended with the advent of the [[Progressive Era]], which saw significant reforms including [[Consumer protection|health and safety regulation]] of consumer goods, the rise of [[Labor unions in the United States|labor unions]], and greater [[United States antitrust law|antitrust measures]] to ensure competition among businesses and attention to worker conditions. ===The rise to world power, the New Deal, and World War II=== [[File:Old timer structural worker.jpg|thumb|Worker during construction of the [[Empire State Building]] in [[New York City]] in 1930]] [[File:Trinity_Detonation_T&B_(cropped).jpg|thumb|[[Mushroom cloud]] formed by the [[Trinity (nuclear test)|Trinity Experiment]] in [[New Mexico]], part of the [[Manhattan Project]], the first detonation of a [[nuclear weapon]] in history, July 1945]] The United States remained neutral from the outbreak of [[World War I]] in 1914 until 1917 when it joined the war as an "associated power" alongside the [[Allies of World War I]], helping to turn the tide against the [[Central Powers]]. In 1919, President [[Woodrow Wilson]] took a leading diplomatic role at the [[Paris Peace Conference, 1919|Paris Peace Conference]] and advocated strongly for the U.S. to join the [[League of Nations]]. However, the Senate refused to approve this and did not ratify the [[Treaty of Versailles]] that established the League of Nations.<ref name="autogenerated418">McDuffie, Jerome; Piggrem, Gary Wayne; Woodworth, Steven E. (2005). ''U.S. History Super Review''. Piscataway, NJ: Research & Education Association. p. 418. .</ref> Around this time, millions of rural African Americans began [[Great Migration (African American)|a mass migration from the South to northern urban centers]]; it would continue until about 1970.<ref></ref> The last vestiges of the Progressive Era resulted in [[women's suffrage]] and [[Prohibition in the United States|alcohol prohibition]].<ref>Paige Meltzer, "The Pulse and Conscience of America" The General Federation and Women's Citizenship, 1945–1960,"&nbsp;''Frontiers: A Journal of Women Studies''&nbsp;(2009), Vol. 30 Issue 3, pp. 52–76.</ref><ref>James Timberlake,&nbsp;''Prohibition and the Progressive Movement, 1900–1920''&nbsp;(Harvard UP, 1963)</ref><ref>George B. Tindall, "Business Progressivism: Southern Politics in the Twenties,"&nbsp;''South Atlantic Quarterly''&nbsp;62 (Winter 1963): 92–106.</ref> In 1920, the women's rights movement won passage of a [[Nineteenth Amendment to the United States Constitution|constitutional amendment]] granting [[Women's suffrage in the United States|women's suffrage]].<ref name="voris"></ref> The 1920s and 1930s saw the rise of [[radio]] for [[mass communication]] and the invention of early [[television]]. The prosperity of the [[Roaring Twenties]] ended with the [[Wall Street Crash of 1929]] and the onset of the [[Great Depression in the United States|Great Depression]]. After his election as president in 1932, [[Franklin D. Roosevelt]] responded with the [[New Deal]].<ref></ref> The [[Dust Bowl]] of the mid-1930s impoverished many farming communities and spurred a new wave of western migration.<ref><br /><br /><br /></ref> At first [[United States non-interventionism before entering World War II|neutral during World War II]], the United States in March 1941 [[Lend-Lease|began supplying materiel]] to the [[Allies of World War II|Allies]]. 
On December&nbsp;7, 1941, the [[Empire of Japan]] launched a surprise [[attack on Pearl Harbor]], prompting the United States to join the Allies against the [[Axis powers]], and in the following year, to [[Internment of Japanese Americans|intern]] about 120,000 Japanese and Japanese Americans.<ref>The official WRA record from 1946 state it was 120,000 people. See . This number does not include people held in other camps such as those run by the DoJ or U.S. Army. Other sources may give numbers slightly more or less than 120,000.</ref><ref name="Pearl Harbor"></ref> The U.S. pursued a "[[Europe first]]" defense policy,<ref></ref> leaving the [[Philippines]], an [[History of the Philippines (1898–1946)|American colony]], isolated and alone to fight Japan's [[Japanese occupation of the Philippines|invasion and occupation]] until the U.S.-led [[Philippines campaign (1944–1945)]]. During the war, the United States was one of the "[[Four Policemen|Four Powers]]"<ref></ref> who met to plan the postwar world, along with Britain, the Soviet Union, and China. The United States emerged [[World War II casualties#Human losses by country|relatively unscathed]] from the war, and with even greater economic and military influence.<ref>Kennedy, Paul (1989). ''The Rise and Fall of the Great Powers''. New York: Vintage. p. 358. </ref> The United States played a leading role in the [[Bretton Woods Conference|Bretton Woods]] and [[Yalta Conference|Yalta]] conferences, which signed agreements on new international financial institutions and Europe's postwar reorganization. As an [[Victory in Europe Day|Allied victory was won in Europe]], a 1945 [[United Nations Conference on International Organization|international conference]] held in [[San Francisco]] produced the [[United Nations Charter]], which became active after the war.<ref></ref> The United States developed the [[Manhattan Project|first nuclear weapons]] and used them on Japan [[Atomic bombings of Hiroshima and Nagasaki|in the cities of Hiroshima and Nagasaki]] in August 1945; the Japanese [[Surrender of Japan|surrendered]] on September 2, ending [[World War II]].<ref></ref><ref>Pacific War Research Society (2006). ''Japan's Longest Day''. New York: Oxford University Press. .</ref> ===Cold War and late 20th century=== [[File:LevittownPA.jpg|thumb| [[Post–World War II economic expansion]] in the U.S. led to [[Suburbanization|suburban development]] and [[urban sprawl]], as shown in this aerial photograph of [[Levittown, Pennsylvania]], circa 1959.]] After World War II, the United States financed and implemented the [[Marshall Plan]] to help rebuild western Europe; disbursements paid between 1948 and 1952 would total $13 billion ($115 billion in 2021).<ref>See </ref> Also at this time, [[Geopolitics|geopolitical]] tensions between the United States and [[Soviet Union|Russia]] led to the [[Cold War]], driven by an ideological divide between [[capitalism]] and [[communism]].<ref name="WaggAndrews2012"></ref> They dominated the military affairs of Europe, with the U.S. and its [[NATO]] allies on one side and the Soviet Union and its [[Warsaw Pact]] allies on the other.<ref name=":3"></ref> The U.S. often opposed [[Third World]] movements that it viewed as Soviet-sponsored, sometimes pursuing direct action for [[United States involvement in regime change|regime change]] against [[Left-wing politics|left-wing]] governments.<ref>[[#Blakeley|Blakeley, 2009]], [https://books.google.com/books?id=rft8AgAAQBAJ&pg=PA92 p. 
92]</ref> American troops fought the communist forces in the [[Korean War]] of 1950–1953,<ref name="Proxy"></ref> and the U.S. became increasingly involved in the [[Vietnam War]] (1955–1975), introducing combat forces in 1965.<ref></ref> Their competition to achieve superior [[spaceflight]] capability led to the [[Space Race]], which culminated in the U.S. becoming the first nation to [[Apollo 11|land people on the Moon]] in 1969.<ref name="Proxy" /> While both countries engaged in [[proxy war]]s and developed powerful [[nuclear weapon]]s, they avoided direct military conflict.<ref name=":3" /> At home, the United States experienced [[Post–World War II economic expansion|sustained economic expansion]], [[Urbanization in the United States|urbanization]], and a [[Post–World War II baby boom|rapid growth of its population]] and [[American middle class|middle class]] following World War II. Construction of an [[Interstate Highway System]] transformed the nation's transportation infrastructure in decades to come.<ref name="IntHighways"></ref> In 1959, the United States admitted [[Alaska]] and [[Hawaii]] to become the 49th and 50th states, formally expanding beyond the [[contiguous United States]].<ref name="Lightner2004"></ref> [[File:Martin_Luther_King_-_March_on_Washington_(cropped).jpg|thumb|[[Martin Luther King Jr.]] gives his famous "[[I Have a Dream]]" speech at the [[Lincoln Memorial]] during the [[March on Washington for Jobs and Freedom|March on Washington]], 1963.|alt=See caption]] The growing [[civil rights movement]] used [[nonviolence]] to confront [[Racism in the United States|racism]], with [[Martin Luther King Jr.]] becoming a prominent leader and figurehead.<ref></ref> President [[Lyndon B. Johnson]] initiated legislation that led to a series of policies addressing poverty and racial inequalities, in what he termed the "[[Great Society]]". The launch of a "[[War on Poverty]]" expanded [[Social programs in the United States|entitlements and welfare]] spending, leading to the creation of the [[Food Stamp Program]], [[Aid to Families with Dependent Children]], along with national [[health insurance]] programs [[Medicare (United States)|Medicare]] and [[Medicaid]].<ref></ref> A combination of court decisions and legislation, culminating in the [[Civil Rights Act of 1968]], made significant improvements.<ref></ref><ref></ref><ref></ref> Meanwhile, a [[counterculture of the 1960s|counterculture movement]] grew, which was fueled by [[Opposition to United States involvement in the Vietnam War|opposition to the Vietnam War]], the [[Black Power movement]], and the [[sexual revolution]].<ref></ref> The [[Women's Movement in the United States (1963-1982)|women's movement]] in the U.S. broadened the debate on women's rights and made [[gender equality]] a major social goal. The [[Sexual revolution in 1960s United States|1960s Sexual Revolution]] liberalized American attitudes to sexuality;<ref></ref> the 1969 [[Stonewall riots]] in New York City marked the beginning of the fledgling [[gay liberation|gay rights]] movement.<ref name=StonewallNYC1></ref><ref>; ; </ref> The United States supported [[Israel]] during the [[Yom Kippur War]]; in response, the country faced an oil [[embargo]] from [[OPEC]] nations, sparking the [[1973 oil crisis]]. After a surge in female labor participation around the 1970s, by 1985, the majority of women aged 16 and over were employed.<ref></ref> The 1970s and early 1980s also saw the onset of [[stagflation]]. 
The presidency of [[Richard Nixon]] saw the American withdrawal from Vietnam but also the [[Watergate scandal]], which led to [[Nixon resignation|his resignation in disgrace]] and a decline in public trust of government that expanded for decades.<ref name="watergate_committee_final_report">[[Sam Ervin|Ervin, Sam]], et al., ''Final Report of the Watergate Committee]''.</ref> [[File:President Ronald Reagan and Soviet General Secretary Mikhail Gorbachev at the first Summit in Geneva, Switzerland.jpg|thumb|U.S. president [[Ronald Reagan]] (left) and Soviet general secretary [[Mikhail Gorbachev]] at the [[Geneva Summit (1985)|Geneva Summit]] in 1985]] After his election in 1980 President [[Ronald Reagan]] responded to economic stagnation with [[Reaganomics|neoliberal reforms]] and initiated the more aggressive [[rollback|rollback strategy]] towards the Soviet Union.<ref>[[#Soss|Soss, 2010]], p. 277</ref><ref>[[#Fraser|Fraser, 1989]]</ref> During Reagan's presidency, the federal debt held by the public nearly tripled in nominal terms, from $738 billion to $2.1 trillion.<ref></ref> This led to the United States moving from the world's largest international creditor to the world's largest debtor nation.<ref name="U.S. Debt"></ref> The [[dissolution of the Soviet Union]] in 1991 ended the Cold War,<ref></ref><ref><br /><br /></ref><ref>[[#Hayes|Hayes, 2009]]</ref> ensuring a global [[unipolarity]]<ref>[[Charles Krauthammer]], "The Unipolar Moment", ''Foreign Affairs'', 70/1, (Winter 1990/1), 23–33.</ref> in which the U.S. was unchallenged as the world's dominant [[superpower]].<ref><br /><br /><br /><br /><br /><br />[[#Cohen|Cohen, 2004: History and the Hyperpower]]</ref> Fearing the spread of [[Middle East|regional]] international instability from the [[Iraqi invasion of Kuwait]], in August 1991, President [[George H. W. Bush]] launched and led the [[Gulf War]] against Iraq, expelling Iraqi forces and restoring the [[Emir of Kuwait|Kuwaiti monarchy]].<ref></ref> During the administration of President [[Bill Clinton]] in 1994, the U.S. signed the [[North American Free Trade Agreement]] (NAFTA), causing trade among the U.S., Canada, and Mexico to soar.<ref><br /><br /></ref> Due to the [[dot-com bubble|dot-com boom]], stable monetary policy, and [[Personal Responsibility and Work Opportunity Act|reduced social welfare spending]], the 1990s saw the [[1990s United States boom|longest economic expansion]] in modern U.S. history.<ref><br /></ref> ===21st century=== [[File:WTC_21-632.TIFF|upright|thumb|left|The [[World Trade Center (1973–2001)|World Trade Center]] in [[Lower Manhattan]] during the [[September 11 attacks]] by the [[Islamic terrorism|Islamic terrorist]] group [[Al-Qaeda]] in 2001|alt=Dark smoke billows from the Twin Towers over Manhattan]] On [[September 11 attacks|September 11, 2001]], [[al-Qaeda]] terrorist hijackers flew passenger planes into the [[World Trade Center (1973–2001)|World Trade Center]] in New York City and [[the Pentagon]] near Washington, D.C., killing nearly 3,000 people.<ref><br /><br /></ref> In response, President [[George W. 
Bush]] launched the [[War on Terror]], which included a nearly 20-year [[War in Afghanistan (2001–present)|war in Afghanistan]] from 2001 to 2021 and the 2003–2011 [[Iraq War]].<ref><br /></ref><ref><br /><br /></ref> Government policy designed to promote affordable housing,<ref></ref> widespread failures in corporate and regulatory governance,<ref></ref> and historically low interest rates set by the Federal Reserve<ref></ref> led to a [[United States housing bubble|housing bubble]] in 2006. This culminated in the [[financial crisis of 2007–2008]] and the [[Great Recession]], the nation's largest economic contraction since the Great Depression.<ref></ref> [[Barack Obama]], the first [[Multiracial American|multiracial]]<ref></ref> president with [[African-American]] ancestry, [[2008 United States presidential election|was elected in 2008]] amid the financial crisis.<ref name=":1"></ref> By the end of his second term, the stock market, median household income and net worth, and the number of persons with jobs were all at record levels, while the unemployment rate was well below the historical average.<ref name=RSKrugman1></ref><ref name="Obama_Inequality"></ref><ref name=Politico_Awesome></ref><ref name="Factcheck_Inherits"></ref><ref name="Factcheck_Obama#"></ref> His signature legislative accomplishment was the [[Affordable Care Act]] (ACA), popularly known as "Obamacare". It represented the [[U.S. healthcare system]]'s most significant regulatory overhaul and expansion of coverage since Medicare in 1965. As a result, the uninsured share of the population was cut in half, while the number of newly insured Americans was estimated to be between 20 and 24 million.<ref name="HHS_ASPE16"></ref> After Obama served two terms, Republican [[Donald Trump]] was elected as the [[List of Presidents of the United States|45th president]] in 2016. [[2016 United States presidential election|His election]] is viewed as one of the biggest political upsets in American history.<ref></ref> Trump held office through [[Timeline of the COVID-19 pandemic in the United States|the first waves]] of the [[COVID-19 pandemic]] and the resulting [[COVID-19 recession]] starting in 2020 that exceeded even the Great Recession earlier in the century.<ref></ref> The early 2020s saw the country become more divided, with various social issues sparking debate and protest. The [[murder of George Floyd]] in 2020 led to [[George Floyd protests|widespread civil unrest in urban centers]] and a national debate about [[Police brutality in the United States|police brutality]] and lingering [[institutional racism]].<ref></ref> The nationwide increase in the frequency of instances and number of deaths related to [[Mass shootings in the United States|mass shootings]] added to the societal tensions.<ref></ref> On January 6, 2021, supporters of the outgoing president, Trump, [[2021 United States Capitol attack|stormed the U.S. Capitol]] in an unsuccessful effort to disrupt the [[U.S. Electoral College|Electoral College]] vote count that would confirm Democrat [[Joe Biden]] as the 46th president.<ref></ref> In 2022, the Supreme Court [[Dobbs v. 
Jackson Women's Health Organization|ruled that there is no constitutional right to an abortion]], causing [[2022 abortion protests|another wave of protests]] across the country and stoking international reactions as well.<ref></ref> Despite these divisions, the country has remained unified against [[Union State|Russia]] after [[Vladimir Putin]]'s [[2022 Russian invasion of Ukraine|2022 invasion of Ukraine]], with politicians and individuals across the political spectrum supporting arms shipments to Ukraine and many large American corporations pulling out of Russia and Belarus altogether.<ref></ref> ==Geography== [[File:Uspaintedrelief.png|thumb|[[Topographic map]] of the United States]] [[File:Wonder Lake and Denali.jpg|thumb|[[Denali]], or Mount McKinley, in [[Alaska]], the highest [[mountain]] peak in [[North America]]]] The [[List of states and territories of the United States|48 contiguous states and the District of Columbia]] occupy a combined area of . Of this area, is contiguous land, composing 83.65% of total U.S. land area.<ref></ref><ref name="urlState Area Measurements and Internal Point Coordinates—Geography—U.S. Census Bureau"></ref> About 15% is occupied by [[Alaska]], a state in northwestern North America, with the remainder in [[Hawaii]], a state and [[archipelago]] in the central [[Pacific Ocean|Pacific]], and the five populated but [[Unincorporated area|unincorporated]] insular territories of [[Puerto Rico]], [[American Samoa]], [[Guam]], the [[Northern Mariana Islands]], and the [[United States Virgin Islands|U.S. Virgin Islands]].<ref></ref> Measured by only land area, the United States is third in size behind Russia and China, and just ahead of Canada.<ref name="CIA Factbook Area"></ref> The United States is the world's [[List of countries and dependencies by area|third- or fourth-largest]] nation by total area (land and water), ranking behind Russia and Canada and nearly equal to China. The ranking varies depending on how two territories disputed by China and India are counted, and how the total size of the United States is measured.<ref name="WF"></ref> The [[Atlantic coastal plain|coastal plain]] of the [[Atlantic Ocean|Atlantic]] seaboard gives way further inland to [[deciduous]] forests and the rolling hills of the [[Piedmont (United States)|Piedmont]].<ref></ref> The [[Appalachian Mountains]] and the [[Adirondack Mountains|Adirondack]] [[massif]] divide the eastern seaboard from the [[Great Lakes]] and the grasslands of the [[Midwestern United States|Midwest]].<ref name="NAU"></ref> The [[Mississippi River|Mississippi]]–[[Missouri River]], the world's [[List of rivers by length|fourth longest river system]], runs mainly north–south through the heart of the country. The flat, fertile [[prairie]] of the [[Great Plains]] stretches to the west, interrupted by [[U.S. Interior Highlands|a highland region]] in the southeast.<ref name="NAU" /> The [[Rocky Mountains]], west of the Great Plains, extend north to south across the country, peaking at over in [[Colorado]].<ref></ref> Farther west are the rocky [[Great Basin]] and deserts such as the [[Chihuahuan Desert|Chihuahua]], [[Sonoran Desert|Sonoran]], and [[Mojave Desert|Mojave]].<ref></ref> The [[Sierra Nevada (U.S.)|Sierra Nevada]] and [[Cascade Range|Cascade]] mountain ranges run close to the [[West Coast of the United States|Pacific coast]], both ranges also reaching altitudes higher than . 
The [[Extreme points of the United States|lowest and highest points]] in the contiguous United States are in the state of California,<ref></ref> and only about apart.<ref></ref> At an elevation of , Alaska's [[Denali]] is the highest peak in the country and in North America.<ref></ref> Active [[volcano]]es are common throughout Alaska's [[Alexander Archipelago|Alexander]] and [[Aleutian Islands]], and Hawaii consists of volcanic islands. The [[supervolcano]] underlying [[Yellowstone National Park]] in the [[Rockies]] is the continent's largest volcanic feature.<ref></ref> ===Climate=== [[File:Köppen Climate Types US 50.png|thumb|[[Köppen climate classification|Köppen climate types]] of the U.S.]] The United States, with its large size and geographic variety, includes most climate types. To the east of the [[100th meridian west|100th meridian]], the climate ranges from [[humid continental climate|humid continental]] in the north to [[humid subtropical climate|humid subtropical]] in the south.<ref></ref> The Great Plains west of the 100th meridian are [[Semi-arid climate|semi-arid]]. Many mountainous areas of the American West have an [[alpine climate]]. The climate is [[Desert climate|arid]] in the Great Basin, desert in the Southwest, [[Mediterranean climate|Mediterranean]] in [[coastal California]], and [[oceanic climate|oceanic]] in coastal [[Oregon]] and [[Washington (state)|Washington]] and southern Alaska. Most of Alaska is [[Subarctic climate|subarctic]] or [[Polar climate|polar]]. Hawaii and the southern tip of [[Florida]] are [[Tropical climate|tropical]], as well as its territories in the [[Caribbean]] and the Pacific.<ref></ref> States bordering the [[Gulf of Mexico]] are prone to [[Tropical cyclone|hurricanes]], and most of the world's [[tornado]]es occur in the country, mainly in [[Tornado Alley]] areas in the Midwest and South.<ref></ref> Overall, the United States receives more high-impact extreme weather incidents than any other country in the world.<ref></ref> Extreme weather has become more frequent in the U.S., with three times the number of reported [[heat waves]] as in the 1960s. Of the ten warmest years ever recorded in the 48 contiguous states, eight have occurred since 1998. In the [[Southwestern United States|American Southwest]], droughts have become more persistent and more severe.<ref></ref> ===Biodiversity and conservation=== [[File:Bald Eagle (Haliaeetus leucocephalus) in Kachemak Bay, Alaska.jpg|alt=A bald eagle|thumb|left|The [[bald eagle]] has been the [[National bird of the United States|national bird]] of the United States since 1782.<ref name="McDougall2004"></ref>]] The U.S. 
is one of 17 [[megadiverse countries]] containing large numbers of [[List of endangered species in North America|endemic species]]: about 17,000 species of [[vascular plant]]s occur in the contiguous United States and Alaska, and more than 1,800 species of [[flowering plant]]s are found in Hawaii, few of which occur on the mainland.<ref></ref> The United States is home to 428 [[mammal]] species, 784 [[bird]]s, 311 [[reptile]]s, and 295 [[amphibian]]s,<ref name="Current Results # of native species in the US"></ref> and 91,000 [[insect]] species.<ref></ref> There are 63 [[List of areas in the United States National Park System|national parks]] and hundreds of other federally managed parks, forests, and [[wilderness]] areas, which are managed by the [[National Park Service]].<ref></ref> Altogether, the government owns about 28% of the country's land area,<ref name="NYTimes Federal Land"></ref> mostly in the [[Western United States|western states]].<ref name="AKLeg CRS Federal Land"></ref> Most of this land is [[protected area|protected]], though some is leased for oil and gas drilling, mining, logging, or cattle ranching, and about .86% is used for military purposes.<ref name="Federal Land Ownership"></ref><ref name="Fed Land Uses"></ref> [[Environmental issues in the United States|Environmental issues]] include debates on oil and [[nuclear binding energy|nuclear energy]], dealing with air and water pollution, the economic costs of protecting [[wildlife]], logging and [[deforestation]],<ref></ref><ref></ref> and [[Climate change in the United States|climate change]].<ref>[[#Daynes|Daynes & Sussman, 2010]], pp. 3, 72, 74–76, 78</ref><ref>Hays, Samuel P. (2000). ''A History of Environmental Politics since 1945''.</ref> The most prominent environmental agency is the [[United States Environmental Protection Agency|Environmental Protection Agency]] (EPA), created by presidential order in 1970.<ref name="Collin2006"></ref> The idea of wilderness has shaped the management of public lands since 1964, with the [[Wilderness Act]].<ref>Turner, James Morton (2012). ''The Promise of Wilderness''</ref> The [[Endangered Species Act]] of 1973 is intended to protect threatened and endangered species and their habitats, which are monitored by the [[United States Fish and Wildlife Service]].<ref name="Office"></ref> As of 2020, the U.S. ranked 24th among nations in the [[Environmental Performance Index]].<ref></ref> The country joined the [[Paris Agreement]] on climate change in 2016, and has many other environmental commitments.<ref></ref> It [[United States withdrawal from the Paris Agreement|withdrew]] from the Paris Agreement in 2020<ref></ref> but rejoined it in 2021.<ref></ref> ==Government and politics== [[Image:US Capitol west side.JPG|thumb|The [[United States Capitol]], where [[United States Congress|Congress]] meets: the [[United States Senate|Senate]], left; the [[United States House of Representatives|House]], right]] [[File:White House lawn (long tightly cropped).jpg|thumb|The [[White House]], residence and workplace of the [[President of the United States|U.S. President]]]] [[Image:Panorama of United States Supreme Court Building at Dusk.jpg|thumb|The [[United States Supreme Court Building|Supreme Court Building]], where the [[Supreme Court of the United States|nation's highest court]] sits]] The United States is a [[federal republic]] of 50 [[U.S. 
state|states]], a [[District of Columbia|federal district]], [[Territories of the United States|five territories]] and several uninhabited [[United States Minor Outlying Islands|island possessions]].<ref></ref> It is the world's oldest surviving [[federation]], and, according to the [[World Economic Forum]], the oldest [[democracy]] as well.<ref>Desjardins, Jeff (August 8, 2019) [https://www.weforum.org/agenda/2019/08/countries-are-the-worlds-oldest-democracies/ "Mapped: The world’s oldest democracies"] [[World Economic Forum]]</ref> It is a [[representative democracy]] "in which [[majority rule]] is tempered by [[minority rights]] protected by [[Law of the United States|law]]."<ref name="Scheb">Scheb, John M.; Scheb, John M. II (2002). ''An Introduction to the American Legal System''. Florence, KY: Delmar, p. 6. .</ref> In the American [[federalism|federal]] system, sovereignty is shared between [[Political divisions of the United States|two levels of government]]: federal and state. Citizens of the states are also governed by local governments, which are administrative divisions of the states. The territories are administrative divisions of the federal government. The [[Constitution of the United States|U.S. Constitution]] serves as the country's supreme legal document. The Constitution establishes the structure and responsibilities of the federal government and its relationship with the individual states. The Constitution has been amended 27 times;<ref>[[#Feldstein|Feldstein, Fabozzi, 2011]], p. 9</ref> the first ten amendments ([[United States Bill of Rights|Bill of Rights]]) and the [[Fourteenth Amendment to the United States Constitution|Fourteenth Amendment]] form the central basis of Americans' individual rights. All laws and governmental procedures are subject to [[judicial review]], and any law can be voided if the courts determine that it violates the Constitution. The principle of judicial review, not explicitly mentioned in the Constitution, was established by the Supreme Court in ''[[Marbury v. Madison]]'' (1803).<ref>[[#Schultz|Schultz, 2009]], pp. 164, 453, 503</ref> The United States has operated under a [[two-party system]] for most of its history,<ref name="twsNovGe"></ref> although what the two parties are has changed over time: the country is currently in either the [[Fifth Party System|Fifth]] or [[Sixth Party System]]. In current American [[political culture]], the [[Center-right politics|center-right]] [[Republican Party (United States)|Republican Party]] is considered "[[Conservatism in the United States|conservative]]" and the [[Centre-left politics|center-left]] [[Democratic Party (United States)|Democratic Party]] is considered "[[Modern liberalism in the United States|liberal]]".<ref></ref><ref></ref> On [[Transparency International]]'s 2019 [[Corruption Perceptions Index]], its [[public sector]] position deteriorated from a score of 76 in 2015 to 69 in 2019.<ref></ref> In 2021, the U.S. ranked 26th on the [[Democracy Index]], and is described as a "flawed democracy".<ref></ref> ===Federal government=== The federal government comprises three branches, which are headquartered in Washington, D.C. 
and regulated by a system of [[separation of powers|checks and balances]] defined by the Constitution.<ref></ref> * [[Legislature|Legislative]]: The [[United States Congress|bicameral Congress]], made up of the [[United States Senate|Senate]] and the [[United States House of Representatives|House of Representatives]], makes [[federal law]], [[declaration of war|declares war]], approves treaties, has the [[power of the purse]],<ref></ref> and has the power of [[impeachment]], by which it can remove sitting members of the federal government.<ref></ref> * [[Executive (government)|Executive]]: [[President of the United States|The president]] is the [[commander-in-chief]] of the military, can veto [[bill (law)|legislative bills]] before they become law (subject to congressional override), and appoints the [[Cabinet of the United States|members of the Cabinet]] (subject to Senate approval) and other officers, who administer and enforce federal laws and policies.<ref></ref> * [[Judiciary|Judicial]]: The [[Supreme Court of the United States|Supreme Court]] and lower [[Federal judiciary of the United States|federal courts]], whose judges are appointed by the president with Senate approval, interpret laws and overturn those they find [[constitutionality|unconstitutional]].<ref><br /><br /><br /><br /></ref> The [[lower house]], the [[United States House of Representatives|House of Representatives]], has 435 voting members, each representing a [[congressional district]] for a two-year term. House seats are [[United States congressional apportionment|apportioned]] among the states by population. Each state then draws single-member districts to conform with the census apportionment. The District of Columbia and the five major U.S. territories each have [[Non-voting members of the United States House of Representatives|one member of Congress]]—these members are not allowed to vote.<ref name="Territories1" /> The [[upper house]], the [[United States Senate|Senate]], has 100 members with each state having two senators, elected [[at-large|at large]] to six-year terms; one-third of Senate seats are up for election every two years. The District of Columbia and the five major U.S. territories do not have senators.<ref name="Territories1" /> The Senate is unique among upper houses in being the most prestigious and powerful portion of the country's [[Bicameralism|bicameral system]]; political scientists have frequently labeled it the "most powerful upper house" of any government.<ref></ref> The president serves a four-year term and may be elected to the office [[Term limits in the United States|no more than twice]]. The president is [[United States presidential election|not elected by direct vote]], but by an indirect [[Electoral College (United States)|electoral college]] system in which the determining votes are apportioned to the states and the District of Columbia.<ref name="Avaliktos2004"></ref> The Supreme Court, led by the [[Chief Justice of the United States|chief justice of the United States]], has nine members, who serve for life.<ref></ref> ===Political subdivisions=== Each of the 50 states holds jurisdiction over a geographic territory, where it shares [[sovereignty]] with the federal government. They are subdivided into [[List of United States counties and county equivalents|counties or county equivalents]], and further divided into [[Municipality|municipalities]]. 
The District of Columbia is a [[federal district]] that contains the capital of the United States, the [[Washington, D.C.|city of Washington]].<ref>(a)(36) and (a)(38) U.S. Federal Code, Immigration and Nationality Act. </ref> Each state has a number of [[presidential electors]] equal to the number of its representatives and senators in Congress, and the District of Columbia has three electors.<ref></ref> Territories of the United States do not have presidential electors; therefore, people there cannot vote for the president.<ref name="Territories1"></ref> [[Citizenship of the United States|Citizenship is granted at birth in all states]], the District of Columbia, and all major U.S. territories except American Samoa.<ref></ref><ref name="AS_citizenship"></ref> The United States observes limited [[Tribal sovereignty in the United States|tribal sovereignty]] of the American Indian nations, similar to that of the states. American Indians are U.S. citizens and tribal lands are subject to the jurisdiction of the U.S. Congress and the federal courts. Like the states, tribes have autonomy but also face some restrictions. They are prohibited from making war, engaging in their own foreign relations, and printing or issuing independent currency.<ref></ref> [[Indian reservation]]s are usually contained within one state, but there are 12 reservations that cross state boundaries.<ref></ref> ===Foreign relations=== [[File:67º_Período_de_Sesiones_de_la_Asamblea_General_de_Naciones_Unidas_(8020913157).jpg|thumb|The [[Headquarters of the United Nations|United Nations headquarters]] has been situated along the [[East River]] in [[Midtown Manhattan]] since 1952. The United States is a founding member of the UN.|alt=see caption|left]] The United States has an established structure of foreign relations, and it had the world's second-largest diplomatic corps in 2019.<ref></ref> It is a [[Permanent members of the United Nations Security Council|permanent member]] of the [[United Nations Security Council]],<ref></ref> and home to the [[Headquarters of the United Nations|United Nations headquarters]].<ref></ref> The United States is also a member of the [[G7]],<ref></ref> [[G-20 major economies|G20]],<ref></ref> and [[OECD]] intergovernmental organizations.<ref></ref> Almost all countries have [[List of diplomatic missions in the United States|embassies]] and many have [[consul (representative)|consulates]] (official representatives) in the country. Likewise, nearly all nations host formal [[diplomatic mission]]s with the United States, except [[Iran–United States relations|Iran]],<ref></ref> [[North Korea–United States relations|North Korea]],<ref></ref> and [[Foreign relations of Bhutan#Other countries|Bhutan]].<ref></ref> Though [[Taiwan–United States relations|Taiwan]] does not have formal diplomatic relations with the U.S., it maintains close, if unofficial, relations.
The United States also regularly supplies Taiwan with [[Six Assurances|military equipment]].<ref></ref> The United States has a "[[Special Relationship]]" with the [[United Kingdom–United States relations|United Kingdom]]<ref></ref> and strong ties with [[Canada–United States relations|Canada]],<ref></ref> [[Australia–United States relations|Australia]],<ref></ref> [[New Zealand–United States relations|New Zealand]],<ref></ref> the [[Philippines]],<ref></ref> [[Japan–United States relations|Japan]],<ref></ref> [[South Korea–United States relations|South Korea]],<ref></ref> [[Israel–United States relations|Israel]],<ref></ref> and several [[European Union]] countries ([[France–United States relations|France]], [[Italy–United States relations|Italy]], [[Germany–United States relations|Germany]], [[Spain–United States relations|Spain]], and [[Poland–United States relations|Poland]]).<ref></ref> The U.S. works closely with its [[NATO]] allies on military and [[national security]] issues, and with nations in the Americas through the [[Organization of American States]] and the [[United States–Mexico–Canada Agreement|United States–Mexico–Canada Free Trade Agreement]]. In [[South America]], [[Colombia]] is traditionally considered to be the closest ally of the United States.<ref></ref><ref></ref> The U.S. exercises full international defense authority and responsibility for [[Federated States of Micronesia|Micronesia]], the [[Marshall Islands]] and [[Palau]] through the [[Compact of Free Association]].<ref><br /></ref> The U.S. has become a key ally of [[Ukraine]] since [[Russia]] [[Annexation of Crimea by the Russian Federation|annexed Crimea in 2014]] and began an [[2022 Russian invasion of Ukraine|invasion of Ukraine in 2022]], significantly deteriorating relations with Russia in the process.<ref></ref> The U.S. has also experienced a deterioration of relations with [[China]] and grown closer to [[Taiwan]].<ref></ref><ref></ref><ref></ref> ===Military=== [[File:B-2 Spirit.jpg|thumb|[[Northrop Grumman B-2 Spirit|B-2 Spirit]], the [[Stealth technology|stealth]] [[Heavy bomber|heavy]] [[strategic bomber]] of the [[United States Air Force|USAF]]]] [[File:Aerial view of the Pentagon, Arlington, VA (38285035892).jpg|thumb|[[The Pentagon]], near Washington, D.C., is home to the [[U.S. Department of Defense]].]] The president is the [[Commander-in-Chief of the United States|commander-in-chief]] of the United States Armed Forces and appoints its leaders, the [[United States Secretary of Defense|secretary of defense]] and the [[Joint Chiefs of Staff]]. The [[United States Department of Defense|Department of Defense]], which is headquartered at [[the Pentagon]] near Washington, D.C., administers five of the six service branches, which are made up of the [[United States Army|Army]], [[United States Marine Corps|Marine Corps]], [[United States Navy|Navy]], [[United States Air Force|Air Force]], and [[United States Space Force|Space Force]]. The [[United States Coast Guard|Coast Guard]] is administered by the [[United States Department of Homeland Security|Department of Homeland Security]] in peacetime and can be transferred to the [[United States Department of the Navy|Department of the Navy]] in wartime.<ref></ref> The United States spent $649 billion on its military in 2019, 36% of global military spending.
At 4.7% of GDP, the percentage was the second-highest among all countries, after [[Saudi Arabia]].<ref name="StockSpending"></ref> It also has [[Nuclear weapons of the United States|more than 40% of the world's nuclear weapons]], the second-largest after Russia.<ref></ref> In 2019, all six branches of the U.S. Armed Forces reported 1.4&nbsp;million personnel on active duty.<ref name="IISS"></ref> The [[Reserve components of the United States Armed Forces|Reserves]] and [[National Guard of the United States|National Guard]] brought the total number of troops to 2.3&nbsp;million.<ref name="IISS" /> The Department of Defense also employed about 700,000 civilians, not including [[Military-industrial complex|contractors]].<ref></ref> Military service in the United States is voluntary, although [[Conscription in the United States|conscription]] may occur in wartime through the [[Selective Service System]].<ref></ref> The United States has the third-largest combined armed forces in the world, behind the [[People's Liberation Army|Chinese People's Liberation Army]] and [[Indian Armed Forces]].<ref>[[#IISS2020|IISS 2020]], pp. 46</ref> Today, American forces can be rapidly deployed by the Air Force's large fleet of [[transport aircraft]], the Navy's 11 active [[aircraft carrier]]s, and [[Marine expeditionary unit]]s at sea with the Navy, and Army's [[XVIII Airborne Corps]] and [[75th Ranger Regiment]] deployed by Air Force transport aircraft. The Air Force can strike targets across the globe through its fleet of [[strategic bomber]]s, maintains the [[air defense]] across the United States, and provides [[close air support]] to Army and Marine Corps ground forces.<ref></ref><ref></ref> The Space Force operates the [[Global Positioning System]], operates the [[Eastern Range|Eastern]] and [[Western Range (USSF)|Western Range]]s for all space launches, and operates the United States's [[United States Space Surveillance Network|Space Surveillance]] and [[United States national missile defense|Missile Warning]] networks.<ref></ref><ref></ref><ref></ref> The military operates about 800 bases and facilities abroad,<ref></ref> and maintains [[United States military deployments|deployments greater than 100 active duty personnel]] in 25 foreign countries.<ref></ref> ===Law enforcement and crime=== [[File:US incarceration timeline-clean.svg|thumb|left|Total [[incarceration]] in the United States by year (1920–2014)|alt=Chart depicting a steep increase in the number of incarcerated Americans from the 1980s to the 2000s]] There are about 18,000 U.S. police agencies from local to federal level in the United States.<ref></ref> Law in the United States is mainly [[Law enforcement in the United States|enforced]] by local police departments and [[sheriff]]'s offices. The [[state police]] provides broader services, and [[Federal law enforcement in the United States|federal agencies]] such as the [[Federal Bureau of Investigation]] (FBI) and the [[United States Marshals Service|U.S. Marshals Service]] have specialized duties, such as protecting [[civil rights]], [[National Security of the United States|national security]] and enforcing [[U.S. 
federal courts]]' rulings and federal laws.<ref></ref> [[State court (United States)|State court]]s conduct most civil and criminal trials,<ref></ref> and federal courts handle designated crimes and appeals from the state criminal courts.<ref></ref> , the United States has an [[List of countries by intentional homicide rate|intentional homicide rate]] of 7 per 100,000 people.<ref></ref> A cross-sectional analysis of the [[World Health Organization]] Mortality Database from 2010 showed that United States homicide rates "were 7.0 times higher than in other high-income countries, driven by a gun homicide rate that was 25.2 times higher."<ref></ref> , the United States has the [[United States incarceration rate|sixth highest documented incarceration rate]] and [[Incarceration in the United States|second largest prison population]] in the world.<ref></ref> In 2019, the total prison population for those sentenced to more than a year was 1,430,800, corresponding to a ratio of 419 per 100,000 residents, the lowest since 1995.<ref></ref> Some estimates place that number higher, such as the [[Prison Policy Initiative]]'s 2.3 million.<ref name="WholePie2020"></ref> Various states have attempted to [[Decarceration in the United States|reduce their prison populations]] via government policies and grassroots initiatives.<ref></ref> Although most nations have abolished [[capital punishment]],<ref name="Arthur2020"></ref> it is sanctioned in the United States for certain federal and military crimes, and in 27 states out of 50 and in one territory.<ref></ref> Several of these states have [[Moratorium (law)|moratoriums]] on carrying out the penalty, each imposed by the state's governor.<ref></ref><ref></ref><ref></ref> Since 1977, there have been more than 1,500 executions,<ref></ref> giving the U.S. the sixth-highest number of executions in the world, following [[Capital punishment in China|China]], [[Capital punishment in Iran|Iran]], [[Capital punishment in Saudi Arabia|Saudi Arabia]], [[Capital punishment in Iraq|Iraq]], and [[Capital punishment in Egypt|Egypt]].<ref></ref> However, the number has trended down nationally, with [[Capital punishment in the United States#States without capital punishment|several states]] recently abolishing the penalty.<ref>[https://deathpenaltyinfo.org/news/dpic-adds-eleven-cases-to-innocence-list-bringing-national-death-row-exoneration-total-to-185 DPIC adds Eleven cases to the Innocence List bringing national death-row exonerations to 185], ''[[Death Penalty Information Center]]'', Robert Durham, February 18, 2021. Retrieved November 9, 2021.</ref> ==Economy== [[File:US one dollar bill, obverse, series 2009.jpg|thumb|alt=see caption|The [[United States dollar|U.S. dollar]] (featuring [[George Washington]]) is the currency most used in [[international trade|international transactions]] and is the world's foremost [[reserve currency]].<ref name="federalreserve.gov"/>]] [[File:Gaming-Wall-Street_BTS_Prodigium-266.jpg|thumb|The [[New York Stock Exchange]] on [[Wall Street]], the world's largest stock exchange by [[market capitalization]] of its listed companies<ref name=NYSEhighestcap></ref>]] According to the [[International Monetary Fund]], the U.S. [[gross domestic product]] (GDP) of $22.7&nbsp;trillion constitutes 24% of the [[gross world product]] at market exchange rates and over 16% of the gross world product at [[purchasing power parity]] (PPP).<ref></ref><ref name="IMF.WEO.US" /> From 1983 to 2008, U.S.
real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7.<ref name="Hagopian"></ref> The country ranks fifth in the world in [[List of countries by GDP (nominal) per capita|nominal GDP per capita]]<ref></ref> and seventh in [[List of countries by GDP (PPP) per capita|GDP per capita at PPP]].<ref name="IMF.WEO.US" /> The country has been the world's largest economy since at least 1900.<ref></ref> The United States is at or near the forefront of [[Science and technology in the United States|technological advancement]] and [[innovation]]<ref></ref> in many economic sectors, especially in [[artificial intelligence]], [[computers]], [[pharmaceuticals]], and [[medical]], [[aerospace]], and [[military equipment]].<ref></ref> The nation's economy is fueled by abundant [[natural resource]]s, a well-developed infrastructure, and high productivity.<ref name="Wright, Gavin 2007 p. 185">Wright, Gavin, and Jesse Czelusta, "Resource-Based Growth Past and Present", in ''Natural Resources: Neither Curse Nor Destiny'', ed. Daniel Lederman and William Maloney (World Bank, 2007), p. 185. .</ref> It has the second-highest total estimated value of natural resources, valued at [[United States dollar|US$]]44.98 trillion in 2019, although sources differ on their estimates. Americans have the highest average [[Household income|household]] and [[List of countries by average wage|employee]] income among [[OECD]] member states.<ref></ref> In 2013, they had the sixth-highest [[median household income]], down from fourth-highest in 2010.<ref name="Household Income"></ref><ref name=autogenerated4></ref> The [[United States dollar|U.S. dollar]] is the currency most used in [[international trade|international transactions]] and is the world's foremost [[reserve currency]], backed by the country's economy, [[United States Armed Forces|its military]], the [[petrodollar|petrodollar system]] and its linked [[eurodollar]] and large [[U.S. Treasury|U.S. treasuries market]].<ref name="federalreserve.gov"></ref> Several countries [[International use of the US dollar|use it as their official currency]] and in others it is the [[de facto currency|''de facto'' currency]].<ref name="Benjamin J. Cohen 2006, p. 17">Benjamin J. Cohen, ''The Future of Money'', Princeton University Press, 2006, ; ''cf.'' "the dollar is the de facto currency in Cambodia", Charles Agar, ''[[Frommer's]] Vietnam'', 2006, , p. 17</ref><ref></ref> The [[New York Stock Exchange]] and [[Nasdaq]] are the world's [[List of stock exchanges|largest stock exchanges]] by [[market capitalization]] and [[trade volume]].<ref></ref><ref name="sfc.hk">[http://www.sfc.hk/web/doc/EN/research/stat/a01.pdf Table A – Market Capitalization of the World's Top Stock Exchanges (As at end of June 2012)]. Securities and Exchange Commission (China).</ref> The [[List of the largest trading partners of the United States|largest U.S. trading partners]] are [[China]], the [[European Union]], [[Canada]], [[Mexico]], [[India]], [[Japan]], [[South Korea]], the [[United Kingdom]], and [[Taiwan]].<ref name="auto"></ref> The U.S. is the world's [[List of countries by imports|largest]] importer and the [[List of countries by exports|second-largest]] exporter.<ref></ref> It has [[free trade agreements]] with [[United States free-trade agreements|several countries]], including the [[United States–Mexico–Canada Agreement|USMCA]].<ref></ref> The U.S.
ranked second in the [[Global Competitiveness Report]] in 2019, after [[Singapore]].<ref name="World Economic Forum"></ref> Of the world's [[Fortune Global 500|500 largest companies]], 124 are headquartered in the U.S.<ref> Number of companies data taken from the "Country" filter.</ref> While its economy has reached a [[post-industrial society|post-industrial]] level of development, the United States remains an industrial power.<ref name="Econ"></ref> It has a smaller [[welfare state]] and redistributes less income through government action than most other [[World Bank high-income economy|high-income]] countries.<ref></ref> The United States ranked the 41st highest in [[income inequality]] among 156 countries in 2017,<ref></ref> and the highest compared to the rest of the [[developed world]].<ref></ref><ref name="OHCHR"/> As of January 1, 2023, the United States had [[National debt of the United States|a national debt]] of $31.4 trillion.<ref></ref> ===Income and poverty=== [[File:US Wealth Inequality - v2.png|thumb|left|[[Congressional Budget Office|CBO]] chart featuring U.S. family wealth between 1989 and 2013. The top 10% of families held 76% of the wealth in 2013 while the bottom 50% of families held 1%. Inequality increased from 1989 to 2013.<ref name="CBO_2013"></ref>]] At $46,625 USD in 2021, American citizens have the highest [[median income]] in the world.<ref></ref> Despite the fact that they only account for 4.24% of the [[World population|global population]], they collectively [[List of countries by total wealth|possess 30.2%]] of the world's total wealth as of 2021, the largest percentage of any country.<ref></ref> The U.S. also ranks first in the number of dollar [[billionaire]]s and [[millionaire]]s in the world, with 724 billionaires (as of 2021)<ref></ref> and nearly 22 million millionaires (2021).<ref></ref> [[Wealth in the United States]] is [[Wealth inequality in the United States|highly concentrated]]; the richest 10% of the adult population own 72% of the country's household wealth, while the bottom 50% own just 2%.<ref> </ref> [[Income inequality in the United States|Income inequality]] in the U.S. remains at record highs,<ref></ref> with the top fifth of earners taking home more than half of all income<ref></ref> and giving the U.S. one of the widest income distributions among OECD members.<ref name="Sme"></ref> The United States is the only [[advanced economy]] that does not [[List of statutory minimum employment leave by country|guarantee its workers paid vacation]]<ref></ref> and is one of a few countries in the world without [[paid family leave]] as a legal right.<ref></ref> The United States also has a higher percentage of low-income workers than almost any other developed nation, largely because of a weak [[collective bargaining]] system and lack of government support for at-risk workers.<ref></ref> There were about 567,715 sheltered and unsheltered [[Homelessness in the United States|homeless persons in the U.S.]] in January 2019, with almost two-thirds staying in an emergency shelter or transitional housing program.<ref name="Culp2013"></ref> Attempts to combat homelessness include the [[Section 8 (housing)|Section 8]] housing voucher program and implementation of the [[Housing First]] strategy across all levels of government.<ref></ref> In 2011, [[Hunger in the United States#Children|16.7&nbsp;million children lived in food-insecure households]], about 35% more than 2007 levels, though only 845,000 U.S. 
children (1.1%) saw reduced food intake or disrupted eating patterns at some point during the year, and most cases were not chronic.<ref></ref> 40&nbsp;million people, roughly 12.7% of the U.S. population, were living in poverty, including 13.3&nbsp;million children. Of those impoverished, 18.5&nbsp;million live in "deep poverty", family income below one-half of the federal government's poverty threshold.<ref name="OHCHR"></ref> ===Science, technology, and energy=== [[File:Buzz_salutes_the_U.S._Flag-crop.jpg|thumb|U.S. astronaut [[Buzz Aldrin]] saluting the [[United States flag|flag]] on the [[lunar surface|Moon]] during the [[Apollo 11]], 1969. The United States is the only country that has sent [[Moon landing|manned missions to the lunar surface]].]] The United States has been a leader in technological [[innovation]] since the late 19th century and scientific research since the mid-20th century. Methods for producing [[interchangeable parts]] and the establishment of a [[machine tool]] industry enabled the [[American system of manufacturing|U.S. to have large-scale manufacturing]] of sewing machines, bicycles, and other items in the late 19th century. In the early 20th century, factory [[electrification]], the introduction of the [[assembly line]], and other labor-saving techniques created the system of [[mass production]].<ref></ref> In the 21st century, approximately two-thirds of research and development funding comes from the private sector.<ref></ref> In 2020, the United States was the country with the [[List of countries by number of scientific and technical journal articles|second-highest]] number of published scientific papers<ref></ref> and second most patents granted,<ref></ref> both after China. In 2021, the United States launched a total of 51 [[spaceflights]]. (China reported 55.)<ref></ref> The U.S. had 2,944 active [[satellites]] in space in December 2021, the highest number of any country.<ref></ref> In 1876, [[Alexander Graham Bell]] was awarded the first U.S. [[Invention of the telephone|patent for the telephone]]. [[Thomas Edison]]'s [[Research institute|research laboratory]] developed the [[phonograph]], the first [[Incandescent light bulb|long-lasting light bulb]], and the first viable [[Kinetoscope|movie camera]].<ref name="Edison"></ref> The [[Wright brothers]] in 1903 made the [[Wright Flyer|first sustained and controlled heavier-than-air powered flight]], and the automobile companies of [[Ransom E. Olds]] and [[Henry Ford]] popularized the assembly line in the early 20th century.<ref></ref> The rise of [[fascism]] and [[Nazism]] in the 1920s and 30s led many European scientists, such as [[Albert Einstein]], [[Enrico Fermi]], and [[John von Neumann]], to immigrate to the United States.<ref name="fraser"></ref> During World War II, the Manhattan Project developed nuclear weapons, ushering in the [[Atomic Age]]. During the Cold War, competition for superior missile capability ushered in the [[Space Race]] between the U.S. 
and Soviet Union.<ref></ref><ref></ref> The invention of the [[transistor]] in the 1950s, a key component in almost all modern [[electronics]], led to the development of [[microprocessor]]s, [[software]], [[personal computer]]s and the [[Internet]].<ref name="Sawyer2012"></ref> In 2022, the United States ranked 2nd in the [[Global Innovation Index]].<ref></ref> , the United States receives approximately 80% of its energy from fossil fuels.<ref name="visu"></ref> In 2019, the largest source of the country's energy came from [[petroleum]] (36.6%), followed by [[natural gas]] (32%), [[coal]] (11.4%), renewable sources (11.4%) and [[nuclear power]] (8.4%).<ref name="visu" /><!--(numbers do not add up to 100 due to rounding errors) --> Americans constitute less than 5% of the [[world population|world's population]], but consume 17% of the [[Energy use in the United States|world's energy]].<ref></ref> They account for about 25% of the world's [[Oil consumption|petroleum consumption]], while producing only 6% of the world's annual petroleum supply.<ref name="EIA"></ref> The U.S. ranks as the second-highest emitter of greenhouse gases, exceeded only by China.<ref></ref> ===Transportation=== [[File:Bright_Atlanta.jpg|thumb|The [[Downtown Connector]] in [[Atlanta, Georgia]], part of the [[Interstate Highway System]]]] The United States's [[Rail transport in the United States|rail network]], nearly all [[Standard-gauge railway|standard gauge]], is the [[List of countries by rail transport network size|longest in the world]], and exceeds .<ref></ref> It handles mostly freight, with intercity passenger service provided by [[Amtrak]] to all but four states.<ref></ref> The country's [[Inland waterways of the United States|inland waterway]]s are the world's [[List of countries by waterways length|fifth-longest]], and total .<ref></ref> Personal transportation is dominated by automobiles, which operate on a network of public roads.<ref></ref> The United States has the world's second-largest automobile market,<ref></ref> and has the highest vehicle ownership per capita in the world, with 816.4 vehicles per 1,000 Americans (2014).<ref></ref> In 2017, there were 255 million non-two-wheel motor vehicles, or about 910 vehicles per 1,000 people.<ref name="USBTS"></ref> The [[List of airlines of the United States|civil airline industry]] is entirely privately owned and has been largely [[Airline Deregulation Act|deregulated since 1978]], while [[List of airports in the United States|most major airports]] are publicly owned.<ref></ref> The three largest airlines in the world by passengers carried are U.S.-based; [[American Airlines]] is number one after its 2013 acquisition by [[US Airways]].<ref></ref> Of the [[List of the world's busiest airports by passenger traffic|world's 50 busiest passenger airports]], 16 are in the United States, including the busiest, [[Hartsfield–Jackson Atlanta International Airport]].<ref></ref> Of the [[List of busiest container ports|fifty busiest container ports]], four are located in the United States, of which the busiest is the [[Port of Los Angeles]].<ref></ref> ==Demographics== ===Population=== <!--As prose text is preferred, overly detailed statistical charts and diagrams such as economic trends, weather boxes, historical population charts, past elections results, etc. should be reserved for main sub articles on the topic as per WP:DETAIL as outlined at WP:NOTSTATS.--> The [[United States Census Bureau|U.S.
Census Bureau]] reported 331,449,281 residents as of April 1, 2020,<ref name=2020CENSUS></ref> making the United States the [[List of countries and dependencies by population|third most populous]] nation in the world, after China and India.<ref name=":2"></ref> According to the Bureau's [[U.S. and World Population Clock|U.S. Population Clock]], on January&nbsp;28, 2021, the U.S. population had a net gain of one person every 100 seconds, or about 864 people per day.<ref></ref> In 2018, 52% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 32% had never been married.<ref></ref> In 2020, the U.S. [[total fertility rate]] stood at 1.64 children per woman<ref></ref> and the country had the world's highest rate (23%) of children living in [[Single parents in the United States|single-parent]] households.<ref></ref> The United States of America has a diverse population; 37 [[American ancestries|ancestry groups]] have more than one million members.<ref name="An2000"></ref> [[Non-Hispanic whites|White Americans]] with ancestry from Europe, the Middle East, or North Africa form the largest [[race (human classification)|racial]] and [[ethnic group]] at 57.8% of the United States population.<ref></ref><ref></ref> [[Hispanic and Latino Americans]] form the second-largest group and are 18.7% of the United States population. [[African Americans]] constitute the nation's third-largest ancestry group and are 12.1% of the total United States population.<ref name="An2000" /> [[Asian Americans]] are the country's fourth-largest group, composing 5.9% of the United States population, while the country's 3.7 million [[Native Americans in the United States|Native Americans]] account for about 1%.<ref name="An2000" /> In 2020, the [[median age]] of the United States population was 38.5 years.<ref name=":2" /> In 2018, there were almost 90 million immigrants and [[Second-generation immigrants in the United States|U.S.-born children of immigrants]] in the United States, accounting for 28% of the overall U.S. population.<ref></ref> In 2017, out of the U.S. foreign-born population, some 45% (20.7&nbsp;million) were naturalized citizens, 27% (12.3&nbsp;million) were lawful permanent residents, 6% (2.2&nbsp;million) were temporary lawful residents, and 23% (10.5&nbsp;million) were unauthorized immigrants.<ref name="KeyFindings"></ref> The United States led the world in [[refugee resettlement]] for decades, admitting more refugees than the rest of the world combined.<ref name="PewRefugees"></ref> ===Language=== English (specifically, [[American English]]) is the de facto [[national language]] of the United States. Although there is no [[official language]] at the federal level, some laws—such as [[Naturalized citizen of the United States|U.S. naturalization requirements]]—standardize English, and most states have declared English as the official language.<ref></ref> Three states and four U.S. territories have recognized local or indigenous languages in addition to English, including Hawaii ([[Hawaiian language|Hawaiian]]),<ref></ref> Alaska ([[Alaska Native languages|twenty Native languages]]),<ref></ref> South Dakota ([[Sioux language|Sioux]]),<ref name="LakotaCommon"></ref> American Samoa ([[Samoan language|Samoan]]), Puerto Rico ([[Spanish language|Spanish]]), Guam ([[Chamorro language|Chamorro]]), and the Northern Mariana Islands ([[Carolinian language|Carolinian]] and Chamorro).
In Puerto Rico, Spanish is more widely spoken than English.<ref name="PuertoRicoTranslation"></ref> According to the [[American Community Survey]], in 2010 some 229 million people (out of the total U.S. population of 308 million) spoke only English at home. More than 37 million spoke [[Spanish language in the United States|Spanish]] at home, making it the second most commonly used language in the United States. Other languages spoken at home by one million people or more include [[Chinese language|Chinese]] (2.8 million), [[Tagalog language|Tagalog]] (1.6 million), [[Vietnamese language|Vietnamese]] (1.4 million), [[French language|French]] (1.3 million), [[Korean language|Korean]] (1.1 million), and [[German language|German]] (1 million).<ref></ref> The [[List of most commonly learned foreign languages in the United States|most widely taught foreign languages]] in the United States, in terms of enrollment numbers from kindergarten through university [[undergraduate education]], are Spanish (around 7.2&nbsp;million students), French (1.5&nbsp;million), and [[German language in the United States|German]] (500,000). Other commonly taught languages include [[Latin]], [[Japanese language education in the United States|Japanese]], [[American Sign Language]], [[Italian language in the United States|Italian]], and [[Chinese language in the United States|Chinese]].<ref></ref><ref></ref> ===Religion=== A large variety of faiths have historically flourished within the country. According to the [[World Values Survey]] in 2017, the United States is more [[Secularity|secular]] than the median country; the survey ranked the United States the 32nd least religious country in the world.<ref name=":7"></ref> Until the 1990s, the country was a substantial [[outlier]] among other [[Developed country|highly developed]] countries: uniquely [[Wealth and religion|combining a high level of religiosity and wealth]], although this has lessened significantly since then.<ref name=":7" /><ref name=":4"></ref><ref name=":62"></ref><ref name=":52"></ref> ''[[Gallup, Inc.|Gallup]]'' polls during the early 2020s found that about 81% of Americans believe in some conception of God, 45% report [[Prayer|praying]] on a daily basis, 41% report that religion plays a very important role in their lives, and 31% report attending religious services weekly or near weekly.<ref></ref><ref></ref><ref name=":0"></ref> According to ''[[Gallup, Inc.|Gallup]]'' in December 2022, 58% of Americans report "seldom" or "never" attending religious services.<ref name=":0" /> According to the ''Institute for Family Studies'' in 2022, around 28% of Americans attended at least once or twice a month.<ref></ref> In a 2020 survey, about 64% of adults in the United States identified themselves as [[Christianity in the United States|Christians]], making it the country with the [[Christianity by country|largest Christian population]].<ref name="Global Christianity"></ref> [[Protestantism in the United States|Protestantism]] is the largest Christian religious grouping in the United States, accounting for around a third of all Americans. In the so-called [[Bible Belt]], located primarily within the [[Southern United States]], socially conservative [[evangelical Protestantism]] plays a significant role culturally.
By contrast, religion plays the least important role in [[New England]] and the Western United States.<ref name="gallup.com"></ref> Another 6% claimed a non-Christian faith,<ref name=":4" /> the largest of which are [[American Jews|Judaism]], [[Islam in the United States|Islam]], [[Hinduism in the United States|Hinduism]], and [[Buddhism in the United States|Buddhism]].<ref name="pew2015"></ref> Around 30% of Americans describe themselves as having [[irreligion|no religion]].<ref name=":4" /> Membership in a house of worship fell from 70% in 1999 to 47% in 2020, much of the decline related to the number of Americans expressing no religious preference. Membership also fell among those who identified with a specific religious group.<ref></ref><ref></ref> According to ''Gallup'', trust in "the church or organized religion" has declined significantly since the 1970s.<ref></ref> The [[First Amendment to the United States Constitution|First Amendment]] of the U.S. Constitution guarantees the [[Free Exercise Clause|free exercise]] of religion and forbids Congress from passing laws respecting its [[Establishment Clause|establishment]].<ref></ref> ===Urbanization=== About 82% of Americans live in [[United States urban area|urban areas]], including suburbs;<ref name="WF" /> about half of those reside in cities with populations over 50,000.<ref></ref> In 2008, 273 [[List of United States cities by population|incorporated municipalities]] had populations over 100,000, nine cities had more than one million residents, and four cities ([[New York City]], [[Los Angeles]], [[Chicago]], and [[Houston]]) had populations exceeding two million.<ref name="PopEstBigCities"></ref> Many U.S. metropolitan populations are growing rapidly, particularly in the South and West.<ref></ref> ===Education=== [[File:University-of-Virginia-Rotunda.jpg|thumb|The [[University of Virginia]], founded by [[Thomas Jefferson]], is one of the many public colleges and universities in the United States.|alt=Photograph of the University of Virginia|right]] American [[state school|public education]] is operated by state and local governments and regulated by the [[United States Department of Education]] through restrictions on federal grants. In most states, children are required to attend school from the age of five or six (beginning with [[kindergarten]] or [[first grade]]) until they turn 18 (generally bringing them through [[twelfth grade]], the end of [[high school]]); some states allow students to leave school at 16 or 17.<ref></ref> Of Americans 25 and older, 84.6% graduated from high school, 52.6% attended some college, 27.2% earned a [[bachelor's degree]], and 9.6% earned graduate degrees.<ref></ref> The basic [[literacy]] rate is approximately 99%.<ref name="WF" /><ref>For more detail on U.S. literacy, see [https://nces.ed.gov/NAAL/PDF/2006470.PDF A First Look at the Literacy of America's Adults in the 21st century], U.S. Department of Education (2003).</ref> The United States has many private and public [[Lists of American institutions of higher education|institutions of higher education]]. The majority of the world's top [[Public university|public]] and [[Private university|private universities]], as listed by various ranking organizations, are in the United States.<ref></ref> There are also local [[community college]]s with generally more open admission policies, shorter academic programs, and lower tuition.<ref></ref> The U.S.
spends more on education per student than any nation in the world,<ref></ref> spending an average of $12,794 per year on public elementary and secondary school students in the 2016–2017 school year.<ref></ref> As for [[public expenditure]]s on higher education, the U.S. spends more per student than the [[Organisation for Economic Co-operation and Development|OECD]] average, and more than all nations in combined public and private spending.<ref name="education spending"></ref> Despite some student [[loan forgiveness]] programs in place,<ref></ref> [[Student debt|student loan debt]] has increased by 102% in the last decade,<ref></ref> and exceeded 1.7&nbsp;trillion dollars as of 2022.<ref></ref> ===Health=== [[File:Texas medical center.jpg|thumb|left|The [[Texas Medical Center]] in downtown [[Houston]] is the largest medical complex in the world.<ref></ref> |alt=The Texas Medical Center, a cluster of contemporary skyscrapers, at night]] In a preliminary report, the [[Centers for Disease Control and Prevention]] (CDC) announced that U.S. [[life expectancy]] at birth had dropped to 76.4 years in 2021 (73.2 years for men and 79.1 years for women), down 0.9 years from 2020. This was the second year of overall decline, and the chief causes listed were the [[COVID-19]] pandemic, accidents, drug overdoses, heart and liver disease, and suicides.<ref></ref><ref></ref> Life expectancy was highest among Asians and Hispanics and lowest among Blacks and American Indian–Alaskan Native ([[AIAN (U.S. Census)|AIAN]]) peoples.<ref></ref><ref></ref> Starting in 1998, the average life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since.<ref></ref> The U.S. also has one of the highest [[Suicide in the United States|suicide]] rates among [[high-income countries]],<ref></ref> and approximately one-third of the U.S. adult population is obese and another third is overweight.<ref></ref> In 2010, [[coronary artery disease]], [[lung cancer]], [[stroke]], [[chronic obstructive pulmonary disease]]s, and traffic collisions caused the most years of life lost in the U.S. [[Low back pain]], [[major depressive disorder|depression]], [[musculoskeletal disorder]]s, [[neck pain]], and [[anxiety]] caused the most years lost to disability. The most harmful [[risk factor]]s were poor diet, [[tobacco smoking]], obesity, [[Hypertension|high blood pressure]], [[Hyperglycemia|high blood sugar]], [[physical inactivity]], and [[Alcohol consumption and health|alcohol consumption]]. [[Alzheimer's disease]], [[substance use disorder]]s, [[kidney disease]], [[cancer]], and falls caused the most additional years of life lost over their age-adjusted 1990 per-capita rates.<ref name="Murray2013"></ref> [[Teenage pregnancy in the United States|Teenage pregnancy]] and [[Abortion in the United States|abortion]] rates in the U.S. are substantially higher than in other Western nations, especially among blacks and Hispanics.<ref></ref> The U.S. 
health care system far [[List of countries by total health expenditure (PPP) per capita|outspends]] that of any other nation, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer nations.<ref></ref> The United States is the only developed nation [[Healthcare reform in the United States|without a system of universal health care]], and a [[Health insurance coverage in the United States|significant proportion of the population does not carry health insurance]].<ref></ref> The U.S., however, is a global leader in medical innovation, measured either in terms of revenue or the number of new drugs and devices introduced.<ref name="EFPIA"></ref><ref name="Europe">Stats from 2007 Europ.Fed.of Pharm.Indust.and Assoc. Retrieved June 17, 2009, from [http://212.3.246.100/Objects/2/Files/infigures2007.pdf]</ref> Government-funded health care coverage for the poor ([[Medicaid]], established in 1965) and for those age 65 and older ([[Medicare (United States)|Medicare]], begun in 1966) is available to Americans who meet the programs' income or age qualifications. In 2010, President Obama signed into law the [[Patient Protection and Affordable Care Act]], or ACA,<ref></ref> which the CDC said roughly halved the uninsured share of the population,<ref></ref> and multiple studies have concluded that the ACA reduced the mortality of enrollees.<ref name="NYT20200323GoodnoughAbelsonetAl"></ref><ref></ref><ref></ref> However, its legacy [[Criticism of Obamacare|remains controversial]].<ref></ref> ==Culture and society== [[File:Liberty02.jpg|thumb|upright|The [[Statue of Liberty]] (''Liberty Enlightening the World''), a gift from [[France]], has become an iconic symbol of the [[American Dream]].<ref></ref>|alt=The Statue of Liberty, a large teal bronze sculpture on a stone pedestal]] Americans have traditionally [[Stereotypes of Americans|been characterized]] by a strong [[work ethic]],<ref></ref> competitiveness,<ref></ref> and [[individualism]],<ref></ref> as well as a unifying belief in an "[[American civil religion|American creed]]" emphasizing liberty, [[social equality]], [[property rights]], democracy, [[equality under the law]], and a preference for [[limited government]].<ref>: also see [[American's Creed]], written by [[William Tyler Page]] and adopted by Congress in 1918.</ref> Americans are extremely charitable by global standards: according to a 2016 study by the [[Charities Aid Foundation]], Americans donated 1.44% of total GDP to charity, the [[List of countries by charitable donation|highest]] in the world by a large margin.<ref></ref> The United States is home to a [[Multiculturalism|wide variety]] of ethnic groups, traditions, and values,<ref name="DD"></ref><ref name="Society in Focus"></ref> and exerts major cultural influence [[Americanization|on a global scale]].<ref>[[United States#BBC18may|BBC, April 2008: Country Profile: United States of America]]</ref><ref></ref> The country has been described as a society "built on a [[Moral universalism|universalistic cultural frame]] rooted in the natural laws of science and human rights."<ref></ref> The [[United States Declaration of Independence|Declaration of Independence]] has become a well-known statement on [[human rights]], particularly its second sentence: "We hold these truths to be self-evident, that [[all men are created equal]], that they are endowed by [[Higher Power|their Creator]] with certain unalienable Rights, that among these are [[Life, Liberty and
the pursuit of Happiness]]." Stephen Lucas called it "one of the best-known sentences in the English language",<ref></ref> with historian [[Joseph Ellis]] writing that the document contains "the most potent and consequential words in American history".<ref name="American Creation"></ref> The passage has since come to represent a moral standard to which the United States should strive. This view was notably promoted by Lincoln, who considered it to be the foundation of his political philosophy and argued that it is a statement of principles through which the Constitution should be interpreted.<ref name="Second AR"></ref> Aside from the Native American, [[Native Hawaiians|Native Hawaiian]], and [[Alaska Natives|Native Alaskan]] populations, nearly all Americans or their ancestors immigrated or were imported as slaves within the past five centuries.<ref></ref> [[wikt:mainstream|Mainstream]] American culture is a [[Western culture]] largely derived from the [[European American|traditions of European immigrants]] with influences from many other sources, such as [[African-American culture|traditions brought by slaves from Africa]].<ref name="DD" /><ref><br /></ref> More recent immigration from [[Asian American|Asia]] and especially [[Latin American culture|Latin America]] has added to a cultural mix that has been described as a homogenizing [[melting pot]], and a heterogeneous [[salad bowl (cultural idea)|salad bowl]], with immigrants contributing to, and often [[Cultural assimilation|assimilating]] into, mainstream American culture.<ref name="DD" /> Nevertheless, there is a high degree of social inequality related to [[Racial inequality in the United States|race]]<ref></ref> and [[Wealth inequality in the United States|wealth]].<ref></ref> The [[American Dream]], or the perception that Americans enjoy high [[Socio-economic mobility in the United States|social mobility]], plays a key role in attracting immigrants.<ref></ref> Whether this perception is accurate has been a topic of debate.<ref name="socialmobility">* * * *</ref><ref name="CAP"></ref><ref name="Schneider"></ref> While mainstream culture holds that the United States is a [[classless society]],<ref></ref> scholars identify significant differences between [[Social class in the United States|the country's social classes]], affecting [[socialization]], language, and values.<ref> </ref> Americans tend to greatly value [[socioeconomics|socioeconomic]] achievement, but being [[Average Joe|ordinary or average]] is promoted by some as a noble condition.<ref></ref> The United States has uniquely broad protections for [[Freedom of speech|free speech]] under the [[First Amendment to the United States Constitution|First Amendment]], with no exceptions for speech that is commonly proscribed in other liberal democracies such as [[Blasphemy law|blasphemy]], [[Hate speech in the United States|hate speech]], and [[Lèse-majesté|lese-majesty]].<ref></ref><ref></ref> A 2016 [[Pew Research Center]] poll found that Americans were the most supportive of free expression of any polity measured.<ref></ref> They were also found to be the "most supportive of [[freedom of the press]] and the [[Right to Internet access|right to use the internet]] without government censorship."<ref></ref> It is a [[Cultural liberalism|socially progressive]] country<ref></ref> with [[Permissive society|permissive]] attitudes surrounding [[human sexuality]].<ref></ref> [[LGBT rights in the United States|LGBT rights]] are among the most advanced in the
world,<ref></ref><ref></ref> with [[Public opinion of same-sex marriage in the United States|public opinion]] and [[jurisprudence]] on the issue changing significantly since the late 1980s.<ref></ref> A late 2022 ''Grinnell College'' poll found that 74% of Americans agreed that same-sex marriage should be a guaranteed right while 13% disagreed.<ref></ref><ref></ref> In 2015, the Supreme Court [[Obergefell v. Hodges|ruled that]] same-sex marriage bans were unconstitutional.<ref></ref> ===Literature and visual arts=== [[File:Mark Twain by AF Bradley.jpg|thumb|left|145px|upright|[[Mark Twain]], American author and [[humorist]]|alt=Photograph of Mark Twain]] In the 18th and early 19th centuries, American art and literature took most of their cues from Europe, contributing to Western culture. Writers such as [[Washington Irving]], [[Nathaniel Hawthorne]], [[Edgar Allan Poe]], and [[Henry David Thoreau]] established a distinctive American literary voice by the middle of the 19th century. [[Mark Twain]] and poet [[Walt Whitman]] were major figures in the century's second half; [[Emily Dickinson]], virtually unknown during her lifetime, is recognized as an essential American poet.<ref></ref> A work seen as capturing fundamental aspects of the national experience and character—such as [[Herman Melville]]'s ''[[Moby-Dick]]'' (1851), Twain's ''[[Adventures of Huckleberry Finn|The Adventures of Huckleberry Finn]]'' (1885), [[F. Scott Fitzgerald]]'s ''[[The Great Gatsby]]'' (1925) and [[Harper Lee]]'s ''[[To Kill a Mockingbird]]'' (1960)—may be dubbed the "[[Great American Novel]]."<ref></ref> Thirteen U.S. citizens have won the [[Nobel Prize in Literature]]. [[William Faulkner]], [[Ernest Hemingway]] and [[John Steinbeck]] are often named among the most influential writers of the 20th century.<ref></ref> The [[Beat Generation]] writers opened up new literary approaches, as have [[postmodern literature|postmodernist]] authors such as [[John Barth]], [[Thomas Pynchon]], and [[Don DeLillo]].<ref name="Lesher2000"></ref> In the visual arts, the [[Hudson River School]] was a mid-19th-century movement in the tradition of European [[Realism (arts)|naturalism]]. The 1913 [[Armory Show]] in New York City, an exhibition of European [[modern art|modernist art]], shocked the public and transformed the U.S. art scene.<ref></ref> [[Georgia O'Keeffe]], [[Marsden Hartley]], and others experimented with new, individualistic styles. Major artistic movements such as the [[abstract expressionism]] of [[Jackson Pollock]] and [[Willem de Kooning]] and the [[pop art]] of [[Andy Warhol]] and [[Roy Lichtenstein]] developed largely in the United States. 
The tide of modernism and then [[postmodernism]] has brought fame to American architects such as [[Frank Lloyd Wright]], [[Philip Johnson]], and [[Frank Gehry]].<ref name="JansonJanson2003"></ref> Americans have long been important in the modern artistic medium of [[photography]], with major photographers including [[Alfred Stieglitz]], [[Edward Steichen]], [[Edward Weston]], and [[Ansel Adams]].<ref name="Davenport1991"></ref> ===Cinema and theater=== [[File:Hollywood Sign (Zuschnitt).jpg|thumb|The [[Hollywood Sign]] in [[Los Angeles]], California|alt=The Hollywood Sign, large white block letters on a hillside]] [[Hollywood, Los Angeles|Hollywood]], a northern district of Los Angeles, California, is the leader in motion picture production and the most recognizable movie industry in the world.<ref></ref><ref></ref><ref></ref> The [[major film studios]] of the United States are the primary source of the [[List of highest grossing films|most commercially successful]] and most ticket selling movies in the world.<ref name="Kerrigan_Page_18"></ref><ref name="Davis"></ref> The world's first commercial motion picture exhibition was given in New York City in 1894, using the [[Kinetoscope]].<ref></ref> Since the early 20th century, the U.S. film industry has largely been based in and around Hollywood, although in the 21st century an increasing number of films are not made there, and film companies have been subject to the forces of globalization.<ref></ref> The [[Academy Awards]], popularly known as the Oscars, have been held annually by the [[Academy of Motion Picture Arts and Sciences]] since 1929,<ref name="DrowneHuber2004"></ref> and the [[Golden Globe Award]]s have been held annually since January 1944.<ref name="Kroon2014"></ref> Director [[D. W. Griffith]], an American [[filmmaker]] during the [[silent film]] period, was central to the development of [[film grammar]], and producer/entrepreneur [[Walt Disney]] was a leader in both [[animation|animated film]] and movie [[merchandising]].<ref name="KrasniewiczDisney2010"></ref> Directors such as [[John Ford]] redefined the image of the American Old West, and, like others such as [[John Huston]], broadened the possibilities of cinema with location shooting. The industry enjoyed its golden years, in what is commonly referred to as the "[[Classical Hollywood cinema|Golden Age of Hollywood]]", from the early sound period until the early 1960s,<ref></ref> with screen actors such as [[John Wayne]] and [[Marilyn Monroe]] becoming iconic figures.<ref></ref><ref></ref> In the 1970s, "[[New Hollywood]]" or the "Hollywood Renaissance"<ref name="Greven2013"></ref> was defined by grittier films influenced by French and Italian realist pictures of the [[Aftermath of World War II|post-war period]].<ref name="Morrison1998"></ref> Theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the [[Theatre of the United Kingdom|British theater]].<ref name="Saxon2011"></ref> The central hub of the American theater scene has been [[Manhattan]], with its divisions of [[Broadway theatre|Broadway]], [[off-Broadway]], and [[off-off-Broadway]].<ref name="LondréWatermeier1998"></ref> Many movie and television stars have gotten their big break working in New York productions. 
Outside New York City, many cities have professional [[Regional theater in the United States|regional or resident theater companies]] that produce their own seasons, with some works being produced regionally with hopes of eventually moving to New York. The biggest-budget theatrical productions are [[musical theatre|musicals]]. U.S. theater also has an active [[community theater]] culture, which relies mainly on local volunteers who may not be actively pursuing a theatrical career.<ref>Stephen Watt, and Gary A. Richardson, ''American Drama: Colonial to Contemporary'' (1994).</ref> ===Music=== [[File:Country_music_hall_of_fame2.jpg|thumb|The [[Country Music Hall of Fame and Museum]] in [[Nashville, Tennessee]]]] [[American folk music]] encompasses numerous music genres, variously known as traditional music, traditional [[folk music]], contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the [[British Isles]], [[Mainland Europe]], or Africa.<ref name=afc>[https://www.loc.gov/folklife/guide/folkmusicandsong.html "Folk Music and Song", American Folklife Center, Library of Congress]</ref> Among America's earliest composers was a man named [[William Billings]] who, born in Boston, composed patriotic hymns in the 1770s;<ref name="Eggart2007"></ref> Billings was a part of the [[Yankee tunesmiths|First New England School]], who dominated American music during its earliest stages. [[Anthony Heinrich]] was the most prominent composer before the Civil War. From the mid- to late 1800s, [[John Philip Sousa]] of the late [[Romantic music|Romantic era]] composed numerous military songs—[[List of marches by John Philip Sousa|particularly marches]]—and is regarded as one of America's greatest composers.<ref name="Bierley1973"></ref> The rhythmic and lyrical styles of [[African-American music]] have significantly influenced [[Music of the United States|American music]] at large, distinguishing it from European and African traditions. Elements from folk idioms such as the [[blues]] and what is known as [[old-time music]] were adopted and transformed into [[popular music|popular genres]] with global audiences. [[Jazz]] was developed by innovators such as [[Louis Armstrong]] and [[Duke Ellington]] early in the 20th century. [[Country music]] developed in the 1920s, and [[rhythm and blues]] in the 1940s.<ref name="autogenerated2001"></ref> [[Elvis Presley]] and [[Chuck Berry]] were among the pioneers of [[rock and roll]] in the mid-1950s. 
Rock bands such as [[Metallica]], the [[Eagles (band)|Eagles]], and [[Aerosmith]] are among the [[List of best-selling music artists|highest grossing]] in worldwide sales.<ref></ref><ref></ref><ref></ref> In the 1960s, [[Bob Dylan]] emerged from the [[American folk music revival|folk revival]] to become one of America's most celebrated songwriters.<ref></ref> Mid-20th-century American pop stars such as [[Bing Crosby]], [[Frank Sinatra]],<ref></ref> and Elvis Presley became global celebrities,<ref name="autogenerated2001" /> as have artists of the late 20th century such as [[Prince (musician)|Prince]], [[Michael Jackson]], [[Madonna]], [[Whitney Houston]], and [[Mariah Carey]].<ref></ref><ref></ref> ===Mass media=== [[File:Comcast_Philly.JPG|upright|thumb|left|The [[Comcast Center]] in [[Philadelphia]], headquarters of the nation's largest [[Multinational corporation|multinational]] [[telecommunications]] [[conglomerate (company)|conglomerate]]<ref></ref>]] The four major broadcasters in the U.S. are the [[National Broadcasting Company]] (NBC), [[Columbia Broadcasting System]] (CBS), [[American Broadcasting Company]] (ABC), and [[Fox Broadcasting Company]] (FOX). The four major broadcast [[television network]]s are all commercial entities. [[Cable television in the United States|Cable television]] offers hundreds of channels catering to a variety of niches.<ref></ref> About 83% of Americans over age 12 listen to [[radio broadcasting|broadcast radio]], while about 41% listen to [[podcast]]s.<ref></ref> There are 15,433 licensed full-power radio stations in the U.S. according to the U.S. [[Federal Communications Commission]] (FCC).<ref></ref> Much of the public radio broadcasting is supplied by [[NPR]], incorporated in February 1970 under the [[Public Broadcasting Act of 1967]].<ref></ref> Well-known U.S. newspapers include ''[[The Wall Street Journal]]'', ''[[The New York Times]]'', and ''[[USA Today]]''.<ref name="Shaffer2006"></ref> More than 800 publications are produced in Spanish, the second most commonly used language in the United States behind English.<ref></ref><ref></ref> With very few exceptions, all the newspapers in the U.S. are privately owned, either by large chains such as [[Gannett Company|Gannett]] or [[The McClatchy Company|McClatchy]], which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in a situation that is increasingly rare, by individuals or families. Major cities often have [[alternative newspaper]]s to complement the mainstream daily papers, such as New York City's ''[[The Village Voice]]'' or Los Angeles' ''[[LA Weekly]]''. The five most popular websites used in the U.S. are [[Google]], [[YouTube]], [[Amazon (company)|Amazon]], [[Yahoo]], and [[Facebook]].<ref name="alexa-topsitesus"></ref> The [[Video games in the United States|American video game industry]] is the world's second-largest by revenue.<ref></ref> It generated $90 billion in annual economic output in 2020.
Furthermore, the video game industry contributed $12.6 billion in federal, state, and municipal taxes annually.<ref></ref> Some of the largest video game companies like [[Activision Blizzard]], [[Xbox]], [[Sony Interactive Entertainment]], [[Rockstar Games]], and [[Electronic Arts]] are based in the United States.<ref></ref> Some of the most popular and best selling video games like [[The Elder Scrolls V: Skyrim]], [[Call of Duty: Modern Warfare (2019 video game)|Call of Duty: Modern Warfare]] and [[Diablo III]] are made by American [[Video game developer|developers]].<ref></ref> The American video gaming business is still a significant employer. More than 143,000 individuals are employed directly and indirectly by video game companies throughout 50 states. The national compensation for direct workers is $2.9 billion, or an average wage of $121,000.<ref></ref> ===Food=== [[File:Roast_turkey.jpg|thumb|[[Turkey as food|Roasted turkey]] is a traditional [[Thanksgiving (United States)|Thanksgiving]] dinner dish and is usually the main entree.<ref name="GillespieMechling1995"></ref>|alt=A roasted turkey]] Early settlers were introduced by Native Americans to such indigenous, non-European foods as turkey, [[sweet potato]]es, [[maize|corn]], [[Cucurbita|squash]], and [[maple syrup]]. They and later immigrants combined these with foods they had known, such as [[wheat flour]],<ref name="Wheat"></ref> beef, and milk to create a distinctive American cuisine.<ref></ref><ref></ref> Homegrown foods are part of a shared national menu on one of America's most popular holidays, [[Thanksgiving (United States)|Thanksgiving]], when many Americans make or purchase traditional foods to celebrate the occasion.<ref name="Mintz1996"></ref> The American [[fast food]] industry, the world's largest,<ref></ref> pioneered the [[drive-through]] format in the 1940s.<ref name="drivethru"></ref> Characteristic American dishes such as [[apple pie]], [[fried chicken]], [[doughnuts]], [[french fries]], [[macaroni and cheese]], [[ice cream]], [[Pizza in the United States|pizza]], [[hamburgers]], and [[hot dogs]] derive from the recipes of various immigrants.<ref></ref><ref></ref> [[Mexican cuisine|Mexican]] dishes such as [[burritos]] and [[tacos]] and [[pasta]] dishes freely adapted from [[Italian cuisine|Italian]] sources are widely consumed.<ref name="IFT"></ref> Americans drink three times as much coffee as tea.<ref name="coffeeandtea"></ref> Marketing by U.S. industries is largely responsible for making orange juice and milk standard [[List of breakfast drinks|breakfast beverages]].<ref>[[#Smith2004|Smith, 2004]], pp. 131–132</ref><ref>[[#Levenstein|Levenstein, 2003]], pp. 154–155</ref> ===Sports=== [[File:WFT vs. Cowboys (51751852586).jpg|thumb|[[American football]] is the most popular sport in the United States.]] The most popular sports in the U.S. are [[American football]], [[basketball]], [[baseball]] and [[ice hockey]].<ref></ref> While most major U.S. sports such as [[baseball]] and [[American football]] have evolved out of European practices, [[basketball]], [[volleyball]], [[skateboarding]], and [[snowboarding]] are American inventions, some of which have become popular worldwide.<ref></ref> [[Lacrosse]] and [[surfing]] arose from Native American and Native Hawaiian activities that predate European contact.<ref name="liss">Liss, Howard. 
''Lacrosse'' (Funk & Wagnalls, 1970) pg 13.</ref> The market for [[professional sports]] in the United States is roughly $69&nbsp;billion, roughly 50% larger than that of all of Europe, the Middle East, and Africa combined.<ref></ref> American football is by several measures the most popular spectator sport in the United States;<ref> MacCambridge, Michael (2004). ''America's Game: The Epic Story of How Pro Football Captured a Nation''. New York: Random House. .</ref> the [[National Football League]] (NFL) has the highest average attendance of any sports league in the world, and the [[Super Bowl]] is watched by tens of millions globally.<ref></ref> Baseball has been regarded as the U.S. [[national sport]] since the late 19th century, with [[Major League Baseball]] being the top league. Basketball and [[ice hockey]] are the country's next two most popular professional team sports, with the top leagues being the [[National Basketball Association]] and the [[National Hockey League]]. The most-watched [[individual sport]]s in the U.S. are [[golf]] and [[auto racing]], particularly [[NASCAR]] and [[IndyCar]].<ref></ref><ref></ref> Eight [[Olympic Games]] have taken place in the United States. The [[1904 Summer Olympics]] in [[St. Louis]], [[Missouri]], were the first-ever Olympic Games held outside of Europe.<ref></ref> The Olympic Games will be held in the U.S. for a ninth time when [[Los Angeles]] hosts the [[2028 Summer Olympics]]. , the United States has won 2,629 medals at the [[Summer Olympic Games]], more than any other country, and 330 in the [[Winter Olympic Games]], the second most behind Norway.<ref> </ref> In [[Association football|soccer]], the [[United States men's national soccer team|men's national soccer team]] qualified for [[United States at the FIFA World Cup|eleven World Cups]] and the [[United States women's national soccer team|women's team]] has [[United States at the FIFA Women's World Cup|won]] the [[FIFA Women's World Cup]] four times.<ref></ref> The United States hosted the [[1994 FIFA World Cup]] and will host the [[2026 FIFA World Cup]] along with [[Canada]] and [[Mexico]]. On the [[collegiate athletics|collegiate]] level, earnings for the member institutions exceed $1 billion annually,<ref name="si">[https://www.si.com/college-basketball/2018/03/07/ncaa-1-billion-revenue Sports Illustrated: NCAA Reports $1.1 Billion in Revenues]</ref> and [[college football]] and [[College basketball|basketball]] attract large audiences, as the [[NCAA Division I men's basketball tournament|NCAA Final Four]] is one of the most watched sporting events.<ref></ref> ==See also== * [[Index of United States–related articles]] * [[Lists of U.S. state topics]] * [[Outline of the United States]] <!--For wide-screen monitors, prevent image above from punching through the ref section. --> ==Notes== ==References== ==Further reading== * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * '''Internet sources''' * * * * * * * * * * * ==External links== <!-- Please: 1) Follow the [[WP:EL]] guideline where possible and consider discussing on the talk page. The MediaWiki software that powers Wikipedia has several parameters that limit the complexity of a page, thus limiting the number of templates that can be included. 2) Do not turn these bullets into headers! They expand the TOC too much. --> * [https://www.cia.gov/the-world-factbook/countries/united-states/ United States]. 
''[[The World Factbook]]''. [[Central Intelligence Agency]]. * [https://www.bbc.co.uk/news/world-us-canada-16761057 United States] from the [[BBC News]] * [https://www.ifs.du.edu/ifs/frm_CountryProfile.aspx?Country=US Key Development Forecasts for the United States] from [[International Futures]] ; Government * [https://www.usa.gov/ Official U.S. Government Web Portal] Gateway to government sites * [https://www.house.gov/ House] Official site of the United States House of Representatives * [https://www.senate.gov/ Senate] Official site of the United States Senate * [https://www.whitehouse.gov/ White House] Official site of the president of the United States * [ Supreme Court] Official site of the Supreme Court of the United States ; History * [https://web.archive.org/web/20080314143240/https://www.nationalcenter.org/HistoricalDocuments.html Historical Documents] Collected by the National Center for Public Policy Research * [https://www.religioustolerance.org/nat_mott.htm U.S. National Mottos: History and Constitutionality] Analysis by the Ontario Consultants on Religious Tolerance * [https://www.historicalstatistics.org/index2.html USA] Collected links to historical data ; Maps * [https://web.archive.org/web/20091021182322/https://www.nationalatlas.gov/ National Atlas of the United States] Official maps from the U.S. Department of the Interior * * * [https://www.measureofamerica.org/maps/ Measure of America] A variety of mapped information relating to health, education, income, and demographics for the U.S. ; Photos * [https://www.flickr.com/search/?text=USA Photos of the USA] <!-- Target for Navbox link at See also section --> [[Category:United States]] [[Category:Countries in North America]] [[Category:English-speaking countries and territories]] [[Category:Federal constitutional republics]] [[Category:Former British colonies and protectorates in the Americas]] [[Category:Former confederations]] [[Category:G20 nations]] [[Category:Member states of NATO]] [[Category:Member states of the United Nations]] [[Category:States and territories established in 1776]] [[Category:Transcontinental countries]] ```
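For a quick look at the data itself, the dataset can be pulled from the Hugging Face Hub by its repository id (`JamesConley/wikitext_en`). This is a minimal sketch, assuming the repo resolves directly with the `datasets` library and exposes a default `train` split; if the data ships only as raw CSV files, `data_files` would need to point at them instead.

```python
# Minimal sketch: stream the dataset from the Hugging Face Hub and peek at the
# first (most-referenced) record. The split name "train" and direct loading of
# the repo are assumptions; adjust if the files are laid out differently.
from datasets import load_dataset

ds = load_dataset("JamesConley/wikitext_en", split="train", streaming=True)
record = next(iter(ds))       # articles are ordered from most to least referenced
print(list(record.keys()))    # inspect the available columns
print(str(record)[:300])      # preview the start of the first record
```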
JamesConley/wikitext_en
[ "size_categories:1M<n<10M", "language:en", "region:us" ]
2024-01-18T23:42:42+00:00
{"language": ["en"], "size_categories": ["1M<n<10M"]}
2024-01-19T02:34:10+00:00
[]
[ "en" ]
TAGS #size_categories-1M<n<10M #language-English #region-us
This dataset is a processed version of an english wikipedia dump (URL Articles are ordered from most referenced to least referenced. Information between "{{" and "}}" tags has been removed. The format of the CSV is [[Paleo-Indians|Paleo-Americans]] [[Settlement of the Americas|migrated from Siberia]] to the North American mainland at least 12,000 years ago, and are the ancestors of modern [[Native Americans in the United States|Native Americans]]. [[Kingdom of Great Britain|Great Britain]]'s [[Thirteen Colonies]] quarreled with the British Crown over taxation and [[No taxation without representation|political representation]], leading to the [[American Revolution]] (1765–1791). After the Revolution, the United States gained independence, the first [[Nation state|nation-state]] founded on [[American Enlightenment|Enlightenment]] principles of [[liberal democracy]]. In the late 18th century, the U.S. began expanding across North America, gradually [[Territorial evolution of the United States|obtaining new territories]], sometimes through war, frequently [[Indian removal|displacing Native Americans]], and admitting new states. By 1848, the United States spanned the [[continent]] from east to west. The controversy surrounding the practice of [[Slavery in the United States|slavery]] culminated in the secession of the [[Confederate States of America]], which fought the remaining states of the [[Union (American Civil War)|Union]] during the [[American Civil War]] (1861–1865). With the Union's victory and preservation, slavery was abolished by the [[Thirteenth Amendment to the United States Constitution|Thirteenth Amendment]]. By 1890, the United States had grown to become the world's largest economy, and the [[Spanish–American War]] and [[World War I]] established the country as a [[Great power|world power]]. After Japan's [[Attack on Pearl Harbor|surprise attack on Pearl Harbor]] in 1941, the U.S. entered [[World War II]] on the [[Allies of World War II|Allied side]]. The aftermath of the war left the United States and the [[Soviet Union]] as the world's two [[superpower]]s and led to the [[Cold War]], which commenced in 1945 and ended in 1991 with the [[Dissolution of the Soviet Union|Soviet Union's dissolution]]. During the Cold War, both countries engaged in a struggle for ideological dominance but avoided direct military conflict. They also competed in the [[Space Race]], which culminated in the [[Apollo 11|1969 American spaceflight]] in which the U.S. was the first nation to land humans on the [[Moon]]. Simultaneously, the [[civil rights movement]] (1954–1968) led to legislation abolishing state and local [[Jim Crow laws]] and other codified racial discrimination against [[African Americans]]. With the Soviet Union's dissolution in 1991 and the end of the Cold War, the United States emerged as the world's sole superpower. In 2001, following the [[September 11 attacks]], the United States became a lead member of the [[War on terror|Global War on Terrorism]], which included the [[War in Afghanistan (2001–2021)|War in Afghanistan]] (2001–2021) and the [[Iraq War]] (2003–2011). The United States is a [[Federal government of the United States|federal republic]] with [[Separation of powers under the United States Constitution|three separate branches of government]], including a [[Bicameralism|bicameral legislature]]. It is a liberal democracy and has a [[market economy]]. 
It [[International rankings of the United States|ranks very high]] in international measures of [[Human Development Index|quality of life]], [[Income in the United States|income]] and [[Affluence in the United States|wealth]], [[Global Competitiveness Report#2022 rankings|economic competitiveness]], [[Human rights in the United States|human rights]], [[Global Innovation Index|innovation]], and [[Education in the United States|education]]; it has low levels of [[Corruption Perceptions Index|perceived corruption]]. The United States has the highest [[Median income|median income per person]] of any [[polity]] in the world, although it has high levels of [[United States incarceration rate|incarceration]] and [[Inequality in the United States|inequality]] and lacks [[universal health care]]. As a [[melting pot]] of [[Culture of the United States|cultures]] and [[Race and ethnicity in the United States|ethnicities]], the U.S. has been shaped by [[History of immigration to the United States|centuries of immigration]]. The United States is a highly [[developed country]], and [[Economy of the United States|its economy]] accounts for approximately a quarter of global [[gross domestic product|GDP]] and is the world's [[List of countries by GDP (nominal)|largest]] by GDP at market exchange rates. By value, the United States is the world's [[List of countries by imports|largest]] importer and [[List of countries by exports|second-largest]] exporter. Although it accounts for just over 4.2% of the world's total population, the U.S. holds [[List of countries by total wealth|over 30%]] of the total wealth in the world, the largest share held by any country. The United States is a founding member of the [[United Nations]], [[World Bank]], [[International Monetary Fund]], [[Organization of American States]], and [[NATO]], and is a [[Permanent members of the United Nations Security Council|permanent member]] of the [[United Nations Security Council]]. The country is responsible for more than a third of [[List of countries by military expenditures|global military spending]] and is the [[United States Armed Forces|foremost military power]] in the world and a leading [[Politics of the United States|political]], cultural, and [[Science and technology in the United States|scientific]] force. ==Etymology== The first known use of the name "[[Americas|America]]" dates to 1507, when it appeared on [[Waldseemüller map|a world map]] produced by the German cartographer [[Martin Waldseemüller]] in [[Saint-Dié-des-Vosges|Saint Dié]], [[Duchy of Lorraine|Lorraine]] (now northeastern France). On his map, the name is shown in large letters on what would now be considered [[South America]], honoring [[Amerigo Vespucci]]. The Italian explorer was the first to postulate that the [[West Indies]] did not represent Asia's eastern limit but were part of a previously unknown landmass. In 1538, the Flemish cartographer [[Gerardus Mercator]] used the name "America" to refer to the entire [[Western Hemisphere]]. The first documentary evidence of the phrase "United States of America" dates back to a letter from January 2, 1776, written by [[Stephen Moylan]] to [[Joseph Reed (politician)|Joseph Reed]], George Washington's [[aide-de-camp]]. Moylan expressed his wish to go "with full and ample powers from the United States of America to Spain" to seek assistance in the revolutionary war effort.DeLear, Byron (July 4, 2013) [URL Who coined 'United States of America'? Mystery might have intriguing answer.] 
"Historians have long tried to pinpoint exactly when the name 'United States of America' was first used and by whom ... This latest find comes in a letter that Stephen Moylan, Esq., wrote to Col. Joseph Reed from the Continental Army Headquarters in Cambridge, Mass., during the siege of Boston. The two men lived with Washington in Cambridge, with Reed serving as Washington's favorite military secretary and Moylan fulfilling the role during Reed's absence." ''Christian Science Monitor'' (Boston, MA).Touba, Mariam (November 5, 2014) [URL Who Coined the Phrase 'United States of America'? You May Never Guess] "Here, on January 2, 1776, seven months before the Declaration of Independence and a week before the publication of Paine's ''Common Sense'', Stephen Moylan, an acting secretary to General George Washington, spells it out, 'I should like vastly to go with full and ample powers from the United States of America to Spain' to seek foreign assistance for the cause." ''New-York Historical Society Museum & Library''Fay, John (July 15, 2016) [URL The forgotten Irishman who named the 'United States of America'] "According to the NY Historical Society, Stephen Moylan was the man responsible for the earliest documented use of the phrase 'United States of America'. But who was Stephen Moylan?" ''URL'' The first known publication of the phrase "United States of America" was in an anonymous essay in ''[[The Virginia Gazette]]'' newspaper in [[Williamsburg, Virginia|Williamsburg]], on April 6, 1776. The second draft of the [[Articles of Confederation]] and [[Perpetual Union]], prepared by [[John Dickinson]] and completed no later than June 17, 1776, declared "The name of this Confederation shall be the 'United States of America'." The final version of the Articles, sent to the states for ratification in late 1777, stated that "The Stile of this Confederacy shall be 'The United States of America'." In June 1776, [[Thomas Jefferson]] wrote the phrase "UNITED STATES OF AMERICA" in the headline of his "original Rough draught" of the [[United States Declaration of Independence|Declaration of Independence]]. This draft of the document did not surface until June 21, 1776, and it is unclear whether it was written before or after Dickinson used the term in his June 17 draft of the Articles of Confederation. The phrase "United States" was originally plural in American usage. It described a collection of states—e.g., "the United States are..." The singular form became popular after the end of the Civil War and is now standard usage. A [[Citizenship of the United States|citizen of the United States]] is called an "[[Americans|American]]". "United States", "American", and "U.S." refer to the country adjectivally ("American values", "U.S. forces"). In English, the word "American" rarely refers to topics or subjects not directly connected with the United States. ==History== ===Early history=== [[File:Extreme\_Makeover,*Mesa\_Verde\_Edition*-\_panoramio.jpg|thumb|right|[[Cliff Palace]], located in present-day [[Montezuma County, Colorado|Colorado]], was built by the [[Ancestral Puebloans]] between AD 1190 and 1260.|alt=Aerial view of the Cliff Palace]] It is generally accepted that the [[Paleo-indians|first inhabitants of North America]] migrated from [[Siberia]] by way of the [[Bering land bridge]] and arrived at least 12,000 years ago; however, some evidence suggests an even earlier date of arrival. 
The [[Clovis culture]], which appeared around 11,000 BC, is believed to represent the first wave of human settlement of the Americas. This was likely the first of three major waves of migration into North America; later waves brought the ancestors of present-day [[Alaskan Athabaskans|Athabaskans]], [[Aleut]]s, and [[Eskimo]]s. Over time, indigenous cultures in North America grew increasingly sophisticated, and some, such as the pre-Columbian [[Mississippian culture]] in the southeast, developed advanced [[agriculture]], [[architecture]], and complex societies. The city-state of [[Cahokia]] is the largest, most complex pre-Columbian [[archaeological site]] in the modern-day United States. In the [[Four Corners]] region, [[Ancestral Puebloan]] culture developed from centuries of agricultural experimentation. The [[Algonquian peoples|Algonquian]] are one of the most populous and widespread [[North America]]n [[Indigenous peoples of the Americas|native]] language groups. This grouping consists of the peoples who speak [[Algonquian languages]]. Historically, these peoples were prominent along the Atlantic Coast and into the interior along the [[Saint Lawrence River]] and around the [[Great Lakes]]. Before Europeans came into contact, most Algonquian settlements lived by hunting and fishing, although many supplemented their diet by cultivating [[maize|corn]], [[bean]]s and [[Cucurbita|squash]] (the "[[Three Sisters (agriculture)|Three Sisters"]]). The [[Ojibwe]] cultivated [[wild rice]]. The [[Haudenosaunee]] confederation of the [[Iroquoian peoples|Iroquois]], located in the southern Great Lakes region, was established at some point between the twelfth and fifteenth centuries. [[Population history of Indigenous peoples of the Americas|Estimating the native population of North America]] during European contact is difficult. [[Douglas H. Ubelaker]] of the [[Smithsonian Institution]] estimated a population of 93,000 in the [[South Atlantic states]] and a population of 473,000 in the Gulf states, but most academics regard this figure as too low. Anthropologist [[Henry F. Dobyns]] believed the populations were much higher, suggesting around 1.1 million along the shores of the [[Gulf of Mexico]], 2.2 million people living between [[Florida]] and [[Massachusetts]], 5.2 million in the [[Mississippi Valley]] and tributaries, and around 700,000 people in the [[Florida peninsula]]. ===Colonial America=== [[File:The\_Mayflower\_Compact\_1620\_cph.URL|thumb|left|The [[Mayflower Compact]] signed on the ''[[Mayflower]]'' in 1620 set an early precedent for [[self-government]] and [[constitutionalism]].]] Claims of very early colonization of [[New England#Geography|coastal New England]] by the [[Norse colonization of North America|Norse]] are disputed and controversial. [[Christopher Columbus]] had landed in [[Puerto Rico]] on his [[Columbus's second voyage|1493 voyage]], and [[San Juan, Puerto Rico|San Juan]] was settled by the Spanish a decade later. The first documented arrival of Europeans in the continental United States is that of Spanish [[conquistador]]s such as [[Juan Ponce de León]], who made his first expedition to [[Spanish Florida|Florida]] in 1513. The Italian explorer [[Giovanni da Verrazzano]], sent by France to the New World in 1525, encountered native inhabitants of what is now [[New York Bay]]. The Spanish set up the first settlements in Florida and New Mexico, such as [[St. Augustine, Florida|Saint Augustine]], often considered the nation's oldest city, and [[Santa Fe, New Mexico|Santa Fe]]. 
The French [[New France|established]] their own settlements along the [[Mississippi River]] and [[Gulf of Mexico]], notably [[New Orleans]] and [[Mobile, Alabama|Mobile]]. Successful [[British colonization of the Americas|English settlement]] of the eastern coast of North America began with the [[Colony of Virginia|Virginia Colony]] in 1607 at [[Jamestown, Virginia|Jamestown]] and with the [[Pilgrims (Plymouth Colony)|Pilgrims]]' [[Plymouth Colony|colony at Plymouth]] in 1620. The continent's first elected legislative assembly, Virginia's [[House of Burgesses]], was founded in 1619. [[Harvard College]] was established in the [[Massachusetts Bay Colony]] in 1636 as the first institution of higher education. The [[Mayflower Compact]] and the [[Fundamental Orders of Connecticut]] established precedents for representative self-government and constitutionalism that would develop throughout the American colonies. Many English settlers were [[English Dissenters|dissenting Christians]] who came seeking [[Freedom of religion|religious freedom]]. The [[Population history of indigenous peoples of the Americas|native population of America declined]] after European arrival for various reasons,[[#Stannard|Stannard, 1993]] p. [[iarchive:americanholocaus00stan|xii]] primarily from diseases such as [[smallpox]] and [[measles]]."''[URL The Cambridge encyclopedia of human paleopathology] ''". Arthur C. Aufderheide, Conrado Rodríguez-Martín, Odin Langsjoen (1998). [[Cambridge University Press]]. p. 205. [[#Bianchine|Bianchine, Russo, 1992]] pp. 225–232[[File:Map of territorial growth URL|upright|thumb|The original [[Thirteen Colonies]] (shown in red) in 1775|alt=Map of the U.S. showing the original Thirteen Colonies along the eastern seaboard]] In the early days of colonization, many European settlers experienced food shortages, disease, and conflicts with [[Indigenous peoples of the Americas|Native Americans]], such as in [[King Philip's War]]. Native Americans were also often fighting neighboring tribes and European settlers. In many cases, however, the natives and settlers came to depend on each other. Settlers [[Columbian exchange|traded]] for food and animal pelts; natives for guns, tools and other European goods.[[#Ripper2008|Ripper, 2008]] p. 6 Natives taught many settlers to cultivate corn, beans, and other foodstuffs. European missionaries and others felt it was important to "civilize" the Native Americans and urged them to adopt European agricultural practices and lifestyles.[[#Ripper2008|Ripper, 2008]] p. 5[[#Calloway1998|Calloway, 1998]], p. 55 However, with the increased European [[settler colonialism|colonization]] of North America, [[Native Americans in the United States|Native Americans]] were displaced and often killed during conflicts. European settlers also began [[human trafficking|trafficking]] [[Slavery in the colonial United States|African slaves]] into Colonial America via the [[Atlantic slave trade|transatlantic slave trade]]. By the turn of the 18th century, slavery had supplanted [[indentured servitude]] as the main source of agricultural labor for the [[cash crop]]s in the American South.[[#Quirk|Quirk, 2011]], p. 195 Colonial society was divided over the religious and moral implications of slavery, and several colonies passed acts for or against the practice.[[#Lien|Lien, 1913]], p. 522[[#Davis96|Davis, 1996]], p. 7 The [[Thirteen Colonies]] that would become the United States of America were administered by the British as overseas dependencies. 
[[Colonial government in the Thirteen Colonies|All nonetheless had local governments]] with elections open to most free men. With very high birth rates, low death rates, and steady settlement, the colonial population grew rapidly, eclipsing Native American populations.[[#Walton|Walton, 2009]], pp. 38–39 The [[Christian revival]]ist movement of the 1730s and 1740s known as the [[First Great Awakening|Great Awakening]] fueled interest both in religion and in religious liberty. During the [[Seven Years' War]] (1756–1763), known in the U.S. as the [[French and Indian War]], British forces captured Canada from the French. With the creation of the [[Province of Quebec (1763–1791)|Province of Quebec]], Canada's [[French language|francophone]] population would remain isolated from the English-speaking colonial dependencies of [[Nova Scotia]], [[Newfoundland Colony|Newfoundland]] and the Thirteen Colonies. Excluding the Native Americans who lived there, the Thirteen Colonies had a population of over 2.1 million in 1770, about a third that of Britain. Despite continuing new arrivals, the rate of natural increase was such that by the 1770s only a small minority of Americans had been born overseas.[[#Walton|Walton, 2009]], p. 35 The colonies' distance from Britain had allowed the development of self-government, but their unprecedented success motivated British monarchs to periodically seek to reassert royal authority. ===American Revolution and the early federal republic=== [[File:Declaration URL|thumb|left|''[[Declaration of Independence (Trumbull)|Declaration of Independence]]'', a painting by [[John Trumbull]], depicts the [[Committee of Five]] presenting the draft of the [[United States Declaration of Independence|Declaration]] to the [[Second Continental Congress|Continental Congress]], June 28, 1776, in [[Philadelphia]].|alt=See caption]] The [[American Revolution]] separated the Thirteen Colonies from the [[British Empire]], and was the first successful [[war of independence]] by a non-European entity against a European power in [[modern history]]. By the 18th century the [[American Enlightenment]] and [[Liberalism in the United States|the political philosophies of liberalism]] were pervasive among leaders. Americans began to develop an ideology of "[[Republicanism in the United States|republicanism]]", asserting that government rested on the [[consent of the governed]]. They demanded their "[[Rights of Englishmen|rights as Englishmen]]" and "[[no taxation without representation]]".[URL Recreating the American Republic – Charles A. Kromkowski]. Retrieved on 2013-07-15. The British insisted on administering the colonies through a [[Parliament of Great Britain|Parliament]] that did not have a single representative responsible for any American constituency, and the conflict escalated into war. In 1774, the [[First Continental Congress]] passed the [[Continental Association]], which mandated a [[Thirteen Colonies|colonies-wide]] boycott of British goods. The [[American Revolutionary War]] began the following year, catalyzed by events like the [[Stamp Act 1765|Stamp Act]] and the [[Boston Tea Party]] that were rooted in colonial disagreement with British governance. The [[Second Continental Congress]], an assembly representing the [[United Colonies]], unanimously adopted the [[United States Declaration of Independence|Declaration of Independence]] on July 4, 1776 (annually celebrated as [[Independence Day (United States)|Independence Day]]). 
In 1781, the [[Articles of Confederation]] and [[Perpetual Union]] established a decentralized government that operated until 1789. A celebrated early turn in the war for the Americans was [[George Washington]] leading the Americans to [[George Washington's crossing of the Delaware River|cross the frozen Delaware River]] in a surprise attack the night of December 25–26, 1776. Another victory, in 1777, at the [[Battles of Saratoga|Battle of Saratoga]] resulted in the capture of a British army, and led to [[France in the American Revolutionary War|France]] and [[Spain in the American Revolutionary War|Spain]] joining in the war against them. After the surrender of a second British army at the [[Siege of Yorktown (1781)|siege of Yorktown]] in 1781, Britain signed a [[Treaty of Paris (1783)|peace treaty]]. American sovereignty became internationally recognized, and the new nation took possession of substantial territory east of the [[Mississippi River]], from what is today [[Canada]] in the north and [[Florida]] in the south. As it became increasingly apparent that the Confederation was insufficient to govern the new country, [[Nationalism|nationalists]] advocated for and led the [[Constitutional Convention (United States)|Philadelphia Convention]] of 1787 in writing the [[United States Constitution]] to replace it, [[Ratification of the United States Constitution|ratified]] in state conventions in 1788. Going into force in 1789, this constitution reorganized the government into a [[federation]] administered by [[Separation of powers|three equal branches]] (executive, judicial and legislative), on the principle of creating salutary [[checks and balances]]. George Washington, who had led the [[Continental Army]] to victory and then willingly relinquished power, was the first [[President of the United States|president]] elected under the new constitution. The [[United States Bill of Rights|Bill of Rights]], forbidding federal restriction of [[Natural and legal rights|personal freedoms]] and guaranteeing a range of legal protections, was adopted in 1791.[[#Boyer|Boyer, 2007]], pp. 192–193 [[Origins of the War of 1812|Tensions with Britain remained]], however, leading to the [[War of 1812]], which was fought to a draw. Although the federal government [[Act Prohibiting Importation of Slaves|outlawed]] American participation in the [[Atlantic slave trade]] in 1807, after 1820, cultivation of the highly profitable cotton crop exploded in the [[Deep South]], and along with it, the use of [[Slavery in the United States|slave labor]].[[#Walton|Walton, 2009]], p. 43[[#Gordon|Gordon, 2004]], pp. 27,29 The [[Second Great Awakening]], especially in the period 1800–1840, converted millions to [[Evangelicalism in the United States|evangelical]] Protestantism. In the North, it energized multiple social reform movements, including [[Abolitionism in the United States|abolitionism]]; in the South, [[History of Methodism in the United States|Methodists]] and [[Baptists in the United States|Baptists]] proselytized among slave populations.Heinemann, Ronald L., et al., Old Dominion, New Commonwealth: a history of Virginia 1607–2007, 2007 , p. 197 [[File:U.S. Territorial URL|alt=Map of the U.S. 
depicting its westward expansion|thumb|[[Territorial evolution of the United States|Territorial acquisitions of the United States]] between 1783 and 1917]] In the late 18th century, American settlers began to [[Territorial evolution of the United States|expand further westward]], some of them with a sense of [[manifest destiny]]. The 1803 [[Louisiana Purchase]] almost doubled the nation's area, [[Adams–Onís Treaty|Spain ceded Florida]] and other Gulf Coast territory in 1819, the [[Republic of Texas]] was [[Texas annexation|annexed]] in 1845 during a period of expansionism, and the 1846 [[Oregon Treaty]] with Britain led to U.S. control of the present-day [[Northwestern United States|American Northwest]]. Additionally, the [[Trail of Tears]] in the 1830s exemplified the [[Indian Removal Act|Indian removal policy]] that forcibly resettled Indians. This further expanded acreage under mechanical cultivation, increasing surpluses for international markets. This prompted a long series of [[American Indian Wars]] west of the [[Mississippi River]] from 1810 to at least 1890. and eventually, conflict with Mexico. Most of these conflicts ended with the cession of Native American territory and their confinement to [[Indian reservation]]s. Victory in the [[Mexican–American War]] resulted in the 1848 [[Mexican Cession]] of [[California]] and much of the present-day [[Southwestern United States|American Southwest]], and the U.S. spanned the continent. The [[California Gold Rush]] of 1848–1849 spurred migration to the Pacific coast, which led to the [[California Genocide]] and the creation of additional western states. Economic development was spurred by giving vast quantities of land, nearly 10% of the total area of the United States, to white European settlers as part of the [[Homestead Acts]], as well as making [[land grants]] to private railroad companies and [[Land-grant university|colleges]].Paul Frymer, "Building an American Empire: The Era of Territorial and Political Expansion," (Princeton: Princeton University Press, 2017) Prior to the Civil War, [[Slave states and free states|the prohibition or expansion of slavery into these territories]] exacerbated tensions over [[Origins of the American Civil War|the debate around abolitionism]]. ===The Civil War and Reconstruction=== [[File:US Secession map URL|thumb| |alt=Map of U.S. showing two kinds of Union states, two phases of secession and territories]] Irreconcilable sectional conflict regarding [[Slavery in the United States|the enslavement]] of Africans and [[African Americans]] ultimately [[Origins of the American Civil War|led to the American Civil War]]. With the [[1860 United States presidential election|1860 election]] of Republican [[Abraham Lincoln]], conventions in eleven slave states declared [[secession]] and formed the [[Confederate States of America]], while the federal government (the "[[Union (American Civil War)|Union]]") maintained that [[Perpetual Union|secession was unconstitutional and illegitimate]]. On April 12, 1861, the Confederacy initiated military conflict by [[Battle of Fort Sumter|bombarding Fort Sumter]], a federal garrison in [[Charleston, South Carolina|Charleston harbor]], South Carolina. This would be the spark of the Civil War, which lasted for four years (1861–1865) and became the deadliest military conflict in American history. The war would result in the deaths of approximately 620,000 soldiers from both sides and upwards of 50,000 civilians, almost all of them in the South. 
[[Reconstruction (United States)|Reconstruction]] began in earnest following the war. While President Lincoln attempted to foster friendship and forgiveness between the Union and the former Confederacy, [[Assassination of Abraham Lincoln|his assassination]] on April 14, 1865 drove a wedge between North and South again. Republicans in the federal government made it their goal to oversee the rebuilding of the South and to ensure the rights of African Americans. They persisted until the [[Compromise of 1877]], when the Republicans agreed to cease protecting the rights of African Americans in the South in order for Democrats to concede the [[1876 United States presidential election|presidential election of 1876]]. Influential Southern whites, calling themselves "[[Redeemers]]", took control of the South after the end of Reconstruction, beginning the [[nadir of American race relations]]. From 1890 to 1910, the Redeemers established so-called [[Jim Crow laws]], [[Disenfranchisement after the Reconstruction Era|disenfranchising]] almost all blacks and some impoverished whites throughout the region. Blacks would face [[Racial segregation in the United States|racial segregation]] nationwide, especially in the South. They also lived under constant threat of vigilante violence, including [[Lynching in the United States|lynching]]. ===Industrial Age and the Progressive Era=== [[File:Emigrants (i.e. immigrants) landing at Ellis Island -.webm|thumb|left|Film by [[Edison Studios]] showing immigrants at [[Ellis Island]] in [[New York Harbor]], that was a major entry point for European [[immigration to the United States|immigration into the U.S.]]]] In the North, [[urbanization]] and an unprecedented [[History of immigration to the United States|influx of immigrants]] from [[Southern Europe|Southern]] and [[Eastern Europe]] supplied a surplus of labor for the country's industrialization and transformed its culture. National infrastructure, including [[First Transcontinental Telegraph|telegraph]] and [[First transcontinental railroad|transcontinental railroads]], spurred economic growth and greater settlement and development of the [[American frontier|American Old West]]. After the [[American Civil War]], new transcontinental [[Rail transportation in the United States#History|railways]] made relocation easier for settlers, expanded internal trade, and increased conflicts with Native Americans. The later inventions of [[Incandescent light bulb|electric light]] and the [[telephone]] would also affect communication and urban life. Mainland expansion also included the [[Alaska Purchase|purchase of Alaska]] from [[Russian Empire|Russia]] in 1867. In 1893, pro-American elements in Hawaii [[Overthrow of the Kingdom of Hawaii|overthrew]] the [[Kingdom of Hawaii|Hawaiian monarchy]] and formed the [[Republic of Hawaii]], which the U.S. [[Newlands Resolution|annexed]] in 1898. Puerto Rico, [[Guam]], and the [[Philippines]] were ceded by Spain in the same year, following the [[Spanish–American War]]. [[American Samoa]] was acquired by the United States in 1900 after the end of the [[Second Samoan Civil War]].Ryden, George Herbert. ''The Foreign Policy of the United States in Relation to Samoa''. New York: Octagon Books, 1975. The [[United States Virgin Islands|U.S. Virgin Islands]] were purchased from [[Denmark]] in 1917. [[Gilded Age|Rapid economic development]] during the late 19th and early 20th centuries fostered the rise of many prominent industrialists. 
[[Business magnate|Tycoons]] like [[Cornelius Vanderbilt]], [[John D. Rockefeller]], and [[Andrew Carnegie]] led the nation's progress in the [[Railways|railroad]], [[Petroleum industry|petroleum]], and [[History of the steel industry (1850–1970)|steel]] industries. Banking became a major part of the economy, with [[J. P. Morgan]] playing a notable role. The American economy boomed, becoming the world's largest. These dramatic changes were accompanied by huge increases in [[Effects of immigration to the United States|immigration]], [[Economic inequality|growing inequality]] and [[List of incidents of civil unrest in the United States|social unrest]], which prompted the rise of [[Labor history of the United States|organized labor]] along with [[Populism in the United States|populist]], [[History of the socialist movement in the United States|socialist]], and [[Anarchism in the United States|anarchist]] movements.[[#Zinn|Zinn, 2005]], pp. 321–357 This period eventually ended with the advent of the [[Progressive Era]], which saw significant reforms including [[Consumer protection|health and safety regulation]] of consumer goods, the rise of [[Labor unions in the United States|labor unions]], and greater [[United States antitrust law|antitrust measures]] to ensure competition among businesses and attention to worker conditions. ===The rise to world power, the New Deal, and World War II=== [[File:Old timer structural URL|thumb|Worker during construction of the [[Empire State Building]] in [[New York City]] in 1930]] [[File:Trinity\_Detonation\_T&B\_(cropped).jpg|thumb|[[Mushroom cloud]] formed by the [[Trinity (nuclear test)|Trinity Experiment]] in [[New Mexico]], part of the [[Manhattan Project]], the first detonation of a [[nuclear weapon]] in history, July 1945]] The United States remained neutral from the outbreak of [[World War I]] in 1914 until 1917 when it joined the war as an "associated power" alongside the [[Allies of World War I]], helping to turn the tide against the [[Central Powers]]. In 1919, President [[Woodrow Wilson]] took a leading diplomatic role at the [[Paris Peace Conference, 1919|Paris Peace Conference]] and advocated strongly for the U.S. to join the [[League of Nations]]. However, the Senate refused to approve this and did not ratify the [[Treaty of Versailles]] that established the League of Nations.McDuffie, Jerome; Piggrem, Gary Wayne; Woodworth, Steven E. (2005). ''U.S. History Super Review''. Piscataway, NJ: Research & Education Association. p. 418. . Around this time, millions of rural African Americans began [[Great Migration (African American)|a mass migration from the South to northern urban centers]]; it would continue until about 1970. The last vestiges of the Progressive Era resulted in [[women's suffrage]] and [[Prohibition in the United States|alcohol prohibition]].Paige Meltzer, "The Pulse and Conscience of America" The General Federation and Women's Citizenship, 1945–1960," ''Frontiers: A Journal of Women Studies'' (2009), Vol. 30 Issue 3, pp. 52–76.James Timberlake, ''Prohibition and the Progressive Movement, 1900–1920'' (Harvard UP, 1963)George B. Tindall, "Business Progressivism: Southern Politics in the Twenties," ''South Atlantic Quarterly'' 62 (Winter 1963): 92–106. In 1920, the women's rights movement won passage of a [[Nineteenth Amendment to the United States Constitution|constitutional amendment]] granting [[Women's suffrage in the United States|women's suffrage]]. 
The 1920s and 1930s saw the rise of [[radio]] for [[mass communication]] and the invention of early [[television]]. The prosperity of the [[Roaring Twenties]] ended with the [[Wall Street Crash of 1929]] and the onset of the [[Great Depression in the United States|Great Depression]]. After his election as president in 1932, [[Franklin D. Roosevelt]] responded with the [[New Deal]]. The [[Dust Bowl]] of the mid-1930s impoverished many farming communities and spurred a new wave of western migration. At first [[United States non-interventionism before entering World War II|neutral during World War II]], the United States in March 1941 [[Lend-Lease|began supplying materiel]] to the [[Allies of World War II|Allies]]. On December 7, 1941, the [[Empire of Japan]] launched a surprise [[attack on Pearl Harbor]], prompting the United States to join the Allies against the [[Axis powers]], and in the following year, to [[Internment of Japanese Americans|intern]] about 120,000 Japanese and Japanese Americans.The official WRA record from 1946 state it was 120,000 people. See . This number does not include people held in other camps such as those run by the DoJ or U.S. Army. Other sources may give numbers slightly more or less than 120,000. The U.S. pursued a "[[Europe first]]" defense policy, leaving the [[Philippines]], an [[History of the Philippines (1898–1946)|American colony]], isolated and alone to fight Japan's [[Japanese occupation of the Philippines|invasion and occupation]] until the U.S.-led [[Philippines campaign (1944–1945)]]. During the war, the United States was one of the "[[Four Policemen|Four Powers]]" who met to plan the postwar world, along with Britain, the Soviet Union, and China. The United States emerged [[World War II casualties#Human losses by country|relatively unscathed]] from the war, and with even greater economic and military influence.Kennedy, Paul (1989). ''The Rise and Fall of the Great Powers''. New York: Vintage. p. 358. The United States played a leading role in the [[Bretton Woods Conference|Bretton Woods]] and [[Yalta Conference|Yalta]] conferences, which signed agreements on new international financial institutions and Europe's postwar reorganization. As an [[Victory in Europe Day|Allied victory was won in Europe]], a 1945 [[United Nations Conference on International Organization|international conference]] held in [[San Francisco]] produced the [[United Nations Charter]], which became active after the war. The United States developed the [[Manhattan Project|first nuclear weapons]] and used them on Japan [[Atomic bombings of Hiroshima and Nagasaki|in the cities of Hiroshima and Nagasaki]] in August 1945; the Japanese [[Surrender of Japan|surrendered]] on September 2, ending [[World War II]].Pacific War Research Society (2006). ''Japan's Longest Day''. New York: Oxford University Press. . ===Cold War and late 20th century=== [[File:URL|thumb| [[Post–World War II economic expansion]] in the U.S. led to [[Suburbanization|suburban development]] and [[urban sprawl]], as shown in this aerial photograph of [[Levittown, Pennsylvania]], circa 1959.]] After World War II, the United States financed and implemented the [[Marshall Plan]] to help rebuild western Europe; disbursements paid between 1948 and 1952 would total $13 billion ($115 billion in 2021).See Also at this time, [[Geopolitics|geopolitical]] tensions between the United States and [[Soviet Union|Russia]] led to the [[Cold War]], driven by an ideological divide between [[capitalism]] and [[communism]]. 
They dominated the military affairs of Europe, with the U.S. and its [[NATO]] allies on one side and the Soviet Union and its [[Warsaw Pact]] allies on the other. The U.S. often opposed [[Third World]] movements that it viewed as Soviet-sponsored, sometimes pursuing direct action for [[United States involvement in regime change|regime change]] against [[Left-wing politics|left-wing]] governments.[[#Blakeley|Blakeley, 2009]], [URL p. 92] American troops fought the communist forces in the [[Korean War]] of 1950–1953, and the U.S. became increasingly involved in the [[Vietnam War]] (1955–1975), introducing combat forces in 1965. Their competition to achieve superior [[spaceflight]] capability led to the [[Space Race]], which culminated in the U.S. becoming the first nation to [[Apollo 11|land people on the Moon]] in 1969. While both countries engaged in [[proxy war]]s and developed powerful [[nuclear weapon]]s, they avoided direct military conflict. At home, the United States experienced [[Post–World War II economic expansion|sustained economic expansion]], [[Urbanization in the United States|urbanization]], and a [[Post–World War II baby boom|rapid growth of its population]] and [[American middle class|middle class]] following World War II. Construction of an [[Interstate Highway System]] transformed the nation's transportation infrastructure in decades to come. In 1959, the United States admitted [[Alaska]] and [[Hawaii]] to become the 49th and 50th states, formally expanding beyond the [[contiguous United States]]. [[File:Martin\_Luther\_King\_-*March\_on\_Washington*(cropped).jpg|thumb|[[Martin Luther King Jr.]] gives his famous "[[I Have a Dream]]" speech at the [[Lincoln Memorial]] during the [[March on Washington for Jobs and Freedom|March on Washington]], 1963.|alt=See caption]] The growing [[civil rights movement]] used [[nonviolence]] to confront [[Racism in the United States|racism]], with [[Martin Luther King Jr.]] becoming a prominent leader and figurehead. President [[Lyndon B. Johnson]] initiated legislation that led to a series of policies addressing poverty and racial inequalities, in what he termed the "[[Great Society]]". The launch of a "[[War on Poverty]]" expanded [[Social programs in the United States|entitlements and welfare]] spending, leading to the creation of the [[Food Stamp Program]], [[Aid to Families with Dependent Children]], along with national [[health insurance]] programs [[Medicare (United States)|Medicare]] and [[Medicaid]]. A combination of court decisions and legislation, culminating in the [[Civil Rights Act of 1968]], made significant improvements. Meanwhile, a [[counterculture of the 1960s|counterculture movement]] grew, which was fueled by [[Opposition to United States involvement in the Vietnam War|opposition to the Vietnam War]], the [[Black Power movement]], and the [[sexual revolution]]. The [[Women's Movement in the United States (1963-1982)|women's movement]] in the U.S. broadened the debate on women's rights and made [[gender equality]] a major social goal. The [[Sexual revolution in 1960s United States|1960s Sexual Revolution]] liberalized American attitudes to sexuality; the 1969 [[Stonewall riots]] in New York City marked the beginning of the fledgling [[gay liberation|gay rights]] movement.; ; The United States supported [[Israel]] during the [[Yom Kippur War]]; in response, the country faced an oil [[embargo]] from [[OPEC]] nations, sparking the [[1973 oil crisis]]. 
After a surge in female labor participation around the 1970s, by 1985, the majority of women aged 16 and over were employed. The 1970s and early 1980s also saw the onset of [[stagflation]]. The presidency of [[Richard Nixon]] saw the American withdrawal from Vietnam but also the [[Watergate scandal]], which led to [[Nixon resignation|his resignation in disgrace]] and a decline in public trust in government that persisted for decades.[[Sam Ervin|Ervin, Sam]], et al., ''Final Report of the Watergate Committee''. [[File:President Ronald Reagan and Soviet General Secretary Mikhail Gorbachev at the first Summit in Geneva, URL|thumb|U.S. president [[Ronald Reagan]] (left) and Soviet general secretary [[Mikhail Gorbachev]] at the [[Geneva Summit (1985)|Geneva Summit]] in 1985]] After his election in 1980, President [[Ronald Reagan]] responded to economic stagnation with [[Reaganomics|neoliberal reforms]] and initiated a more aggressive [[rollback|rollback strategy]] toward the Soviet Union.[[#Soss|Soss, 2010]], p. 277; [[#Fraser|Fraser, 1989]] During Reagan's presidency, the federal debt held by the public nearly tripled in nominal terms, from $738 billion to $2.1 trillion. This led to the United States moving from the world's largest international creditor to the world's largest debtor nation. The [[dissolution of the Soviet Union]] in 1991 ended the Cold War, ensuring a global [[unipolarity]] in which the U.S. was unchallenged as the world's dominant [[superpower]].[[#Hayes|Hayes, 2009]]; [[Charles Krauthammer]], "The Unipolar Moment", ''Foreign Affairs'', 70/1 (Winter 1990/1), 23–33; [[#Cohen|Cohen, 2004: History and the Hyperpower]] Fearing the spread of [[Middle East|regional]] instability from the August 1990 [[Iraqi invasion of Kuwait]], President [[George H. W. Bush]] launched and led the [[Gulf War]] against Iraq in early 1991, expelling Iraqi forces and restoring the [[Emir of Kuwait|Kuwaiti monarchy]]. During the administration of President [[Bill Clinton]], the [[North American Free Trade Agreement]] (NAFTA) took effect in 1994, causing trade among the U.S., Canada, and Mexico to soar. Due to the [[dot-com bubble|dot-com boom]], stable monetary policy, and [[Personal Responsibility and Work Opportunity Act|reduced social welfare spending]], the 1990s saw the [[1990s United States boom|longest economic expansion]] in modern U.S. history. ===21st century=== [[File:WTC_21-632.TIFF|upright|thumb|left|The [[World Trade Center (1973–2001)|World Trade Center]] in [[Lower Manhattan]] during the [[September 11 attacks]] by the [[Islamic terrorism|Islamic terrorist]] group [[Al-Qaeda]] in 2001|alt=Dark smoke billows from the Twin Towers over Manhattan]] On [[September 11 attacks|September 11, 2001]], [[al-Qaeda]] terrorist hijackers flew passenger planes into the [[World Trade Center (1973–2001)|World Trade Center]] in New York City and [[the Pentagon]] near Washington, D.C., killing nearly 3,000 people. In response, President [[George W. Bush]] launched the [[War on Terror]], which included a nearly 20-year [[War in Afghanistan (2001–present)|war in Afghanistan]] from 2001 to 2021 and the 2003–2011 [[Iraq War]]. Government policy designed to promote affordable housing, widespread failures in corporate and regulatory governance, and historically low interest rates set by the Federal Reserve led to a [[United States housing bubble|housing bubble]] in 2006.
This culminated in the [[financial crisis of 2007–2008]] and the [[Great Recession]], the nation's largest economic contraction since the Great Depression. [[Barack Obama]], the first [[Multiracial American|multiracial]] president with [[African-American]] ancestry, [[2008 United States presidential election|was elected in 2008]] amid the financial crisis. By the end of his second term, the stock market, median household income and net worth, and the number of persons with jobs were all at record levels, while the unemployment rate was well below the historical average. His signature legislative accomplishment was the [[Affordable Care Act]] (ACA), popularly known as "Obamacare". It represented the [[U.S. healthcare system]]'s most significant regulatory overhaul and expansion of coverage since Medicare in 1965. As a result, the uninsured share of the population was cut in half, while the number of newly insured Americans was estimated to be between 20 and 24 million. After Obama served two terms, Republican [[Donald Trump]] was elected as the [[List of Presidents of the United States|45th president]] in 2016. [[2016 United States presidential election|His election]] is viewed as one of the biggest political upsets in American history. Trump held office through [[Timeline of the COVID-19 pandemic in the United States|the first waves]] of the [[COVID-19 pandemic]] and the resulting [[COVID-19 recession]] starting in 2020 that exceeded even the Great Recession earlier in the century. The early 2020s saw the country become more divided, with various social issues sparking debate and protest. The [[murder of George Floyd]] in 2020 led to [[George Floyd protests|widespread civil unrest in urban centers]] and a national debate about [[Police brutality in the United States|police brutality]] and lingering [[institutional racism]]. The nationwide increase in the frequency of instances and number of deaths related to [[Mass shootings in the United States|mass shootings]] added to the societal tensions. On January 6, 2021, supporters of the outgoing president, Trump, [[2021 United States Capitol attack|stormed the U.S. Capitol]] in an unsuccessful effort to disrupt the [[U.S. Electoral College|Electoral College]] vote count that would confirm Democrat [[Joe Biden]] as the 46th president. In 2022, the Supreme Court [[Dobbs v. Jackson Women's Health Organization|ruled that there is no constitutional right to an abortion]], causing [[2022 abortion protests|another wave of protests]] across the country and stoking international reactions as well. Despite these divisions, the country has remained unified against [[Union State|Russia]] after [[Vladimir Putin]]'s [[2022 Russian invasion of Ukraine|2022 invasion of Ukraine]], with politicians and individuals across the political spectrum supporting arms shipments to Ukraine and many large American corporations pulling out of Russia and Belarus altogether. ==Geography== [[File:URL|thumb|[[Topographic map]] of the United States]] [[File:Wonder Lake and URL|thumb|[[Denali]], or Mount McKinley, in [[Alaska]], the highest [[mountain]] peak in [[North America]]]] The [[List of states and territories of the United States|48 contiguous states and the District of Columbia]] occupy a combined area of . Of this area, is contiguous land, composing 83.65% of total U.S. land area. 
About 15% is occupied by [[Alaska]], a state in northwestern North America, with the remainder in [[Hawaii]], a state and [[archipelago]] in the central [[Pacific Ocean|Pacific]], and the five populated but [[Unincorporated area|unincorporated]] insular territories of [[Puerto Rico]], [[American Samoa]], [[Guam]], the [[Northern Mariana Islands]], and the [[United States Virgin Islands|U.S. Virgin Islands]]. Measured by only land area, the United States is third in size behind Russia and China, and just ahead of Canada. The United States is the world's [[List of countries and dependencies by area|third- or fourth-largest]] nation by total area (land and water), ranking behind Russia and Canada and nearly equal to China. The ranking varies depending on how two territories disputed by China and India are counted, and how the total size of the United States is measured. The [[Atlantic coastal plain|coastal plain]] of the [[Atlantic Ocean|Atlantic]] seaboard gives way further inland to [[deciduous]] forests and the rolling hills of the [[Piedmont (United States)|Piedmont]]. The [[Appalachian Mountains]] and the [[Adirondack Mountains|Adirondack]] [[massif]] divide the eastern seaboard from the [[Great Lakes]] and the grasslands of the [[Midwestern United States|Midwest]]. The [[Mississippi River|Mississippi]]–[[Missouri River]], the world's [[List of rivers by length|fourth longest river system]], runs mainly north–south through the heart of the country. The flat, fertile [[prairie]] of the [[Great Plains]] stretches to the west, interrupted by [[U.S. Interior Highlands|a highland region]] in the southeast. The [[Rocky Mountains]], west of the Great Plains, extend north to south across the country, peaking at over in [[Colorado]]. Farther west are the rocky [[Great Basin]] and deserts such as the [[Chihuahuan Desert|Chihuahua]], [[Sonoran Desert|Sonoran]], and [[Mojave Desert|Mojave]]. The [[Sierra Nevada (U.S.)|Sierra Nevada]] and [[Cascade Range|Cascade]] mountain ranges run close to the [[West Coast of the United States|Pacific coast]], both ranges also reaching altitudes higher than . The [[Extreme points of the United States|lowest and highest points]] in the contiguous United States are in the state of California, and only about apart. At an elevation of , Alaska's [[Denali]] is the highest peak in the country and in North America. Active [[volcano]]es are common throughout Alaska's [[Alexander Archipelago|Alexander]] and [[Aleutian Islands]], and Hawaii consists of volcanic islands. The [[supervolcano]] underlying [[Yellowstone National Park]] in the [[Rockies]] is the continent's largest volcanic feature. ===Climate=== [[File:Köppen Climate Types US URL|thumb|[[Köppen climate classification|Köppen climate types]] of the U.S.]] The United States, with its large size and geographic variety, includes most climate types. To the east of the [[100th meridian west|100th meridian]], the climate ranges from [[humid continental climate|humid continental]] in the north to [[humid subtropical climate|humid subtropical]] in the south. The Great Plains west of the 100th meridian are [[Semi-arid climate|semi-arid]]. Many mountainous areas of the American West have an [[alpine climate]]. The climate is [[Desert climate|arid]] in the Great Basin, desert in the Southwest, [[Mediterranean climate|Mediterranean]] in [[coastal California]], and [[oceanic climate|oceanic]] in coastal [[Oregon]] and [[Washington (state)|Washington]] and southern Alaska. 
Most of Alaska is [[Subarctic climate|subarctic]] or [[Polar climate|polar]]. Hawaii and the southern tip of [[Florida]] are [[Tropical climate|tropical]], as well as its territories in the [[Caribbean]] and the Pacific. States bordering the [[Gulf of Mexico]] are prone to [[Tropical cyclone|hurricanes]], and most of the world's [[tornado]]es occur in the country, mainly in [[Tornado Alley]] areas in the Midwest and South. Overall, the United States receives more high-impact extreme weather incidents than any other country in the world. Extreme weather has become more frequent in the U.S., with three times the number of reported [[heat waves]] as in the 1960s. Of the ten warmest years ever recorded in the 48 contiguous states, eight have occurred since 1998. In the [[Southwestern United States|American Southwest]], droughts have become more persistent and more severe. ===Biodiversity and conservation=== [[File:Bald Eagle (Haliaeetus leucocephalus) in Kachemak Bay, URL|alt=A bald eagle|thumb|left|The [[bald eagle]] has been the [[National bird of the United States|national bird]] of the United States since 1782.]] The U.S. is one of 17 [[megadiverse countries]] containing large numbers of [[List of endangered species in North America|endemic species]]: about 17,000 species of [[vascular plant]]s occur in the contiguous United States and Alaska, and more than 1,800 species of [[flowering plant]]s are found in Hawaii, few of which occur on the mainland. The United States is home to 428 [[mammal]] species, 784 [[bird]]s, 311 [[reptile]]s, and 295 [[amphibian]]s, and 91,000 [[insect]] species. There are 63 [[List of areas in the United States National Park System|national parks]] and hundreds of other federally managed parks, forests, and [[wilderness]] areas, which are managed by the [[National Park Service]]. Altogether, the government owns about 28% of the country's land area, mostly in the [[Western United States|western states]]. Most of this land is [[protected area|protected]], though some is leased for oil and gas drilling, mining, logging, or cattle ranching, and about .86% is used for military purposes. [[Environmental issues in the United States|Environmental issues]] include debates on oil and [[nuclear binding energy|nuclear energy]], dealing with air and water pollution, the economic costs of protecting [[wildlife]], logging and [[deforestation]], and [[Climate change in the United States|climate change]].[[#Daynes|Daynes & Sussman, 2010]], pp. 3, 72, 74–76, 78Hays, Samuel P. (2000). ''A History of Environmental Politics since 1945''. The most prominent environmental agency is the [[United States Environmental Protection Agency|Environmental Protection Agency]] (EPA), created by presidential order in 1970. The idea of wilderness has shaped the management of public lands since 1964, with the [[Wilderness Act]].Turner, James Morton (2012). ''The Promise of Wilderness'' The [[Endangered Species Act]] of 1973 is intended to protect threatened and endangered species and their habitats, which are monitored by the [[United States Fish and Wildlife Service]]. As of 2020, the U.S. ranked 24th among nations in the [[Environmental Performance Index]]. The country joined the [[Paris Agreement]] on climate change in 2016, and has many other environmental commitments. It [[United States withdrawal from the Paris Agreement|withdrew]] from the Paris Agreement in 2020 but rejoined it in 2021. 
==Government and politics== [[Image:US Capitol west side.JPG|thumb|The [[United States Capitol]], where [[United States Congress|Congress]] meets: the [[United States Senate|Senate]], left; the [[United States House of Representatives|House]], right]] [[File:White House lawn (long tightly cropped).jpg|thumb|The [[White House]], residence and workplace of the [[President of the United States|U.S. President]]]] [[Image:Panorama of United States Supreme Court Building at URL|thumb|The [[United States Supreme Court Building|Supreme Court Building]], where the [[Supreme Court of the United States|nation's highest court]] sits]] The United States is a [[federal republic]] of 50 [[U.S. state|states]], a [[District of Columbia|federal district]], [[Territories of the United States|five territories]] and several uninhabited [[United States Minor Outlying Islands|island possessions]]. It is the world's oldest surviving [[federation]], and, according to the [[World Economic Forum]], the oldest [[democracy]] as well.Desjardins, Jeff (August 8, 2019) [URL "Mapped: The world’s oldest democracies"] [[World Economic Forum]] It is a [[representative democracy]] "in which [[majority rule]] is tempered by [[minority rights]] protected by [[Law of the United States|law]]."Scheb, John M.; Scheb, John M. II (2002). ''An Introduction to the American Legal System''. Florence, KY: Delmar, p. 6. . In the American [[federalism|federal]] system, sovereignty is shared between [[Political divisions of the United States|two levels of government]]: federal and state. Citizens of the states are also governed by local governments, which are administrative divisions of the states. The territories are administrative divisions of the federal government. The [[Constitution of the United States|U.S. Constitution]] serves as the country's supreme legal document. The Constitution establishes the structure and responsibilities of the federal government and its relationship with the individual states. The Constitution has been amended 27 times;[[#Feldstein|Feldstein, Fabozzi, 2011]], p. 9 the first ten amendments ([[United States Bill of Rights|Bill of Rights]]) and the [[Fourteenth Amendment to the United States Constitution|Fourteenth Amendment]] form the central basis of Americans' individual rights. All laws and governmental procedures are subject to [[judicial review]], and any law can be voided if the courts determine that it violates the Constitution. The principle of judicial review, not explicitly mentioned in the Constitution, was established by the Supreme Court in ''[[Marbury v. Madison]]'' (1803).[[#Schultz|Schultz, 2009]], pp. 164, 453, 503 The United States has operated under a [[two-party system]] for most of its history, although what the two parties are has changed over time: the country is currently in either the [[Fifth Party System|Fifth]] or [[Sixth Party System]]. In current American [[political culture]], the [[Center-right politics|center-right]] [[Republican Party (United States)|Republican Party]] is considered "[[Conservatism in the United States|conservative]]" and the [[Centre-left politics|center-left]] [[Democratic Party (United States)|Democratic Party]] is considered "[[Modern liberalism in the United States|liberal]]". On [[Transparency International]]'s 2019 [[Corruption Perceptions Index]], its [[public sector]] position deteriorated from a score of 76 in 2015 to 69 in 2019. In 2021, the U.S. ranked 26th on the [[Democracy Index]], and is described as a "flawed democracy". 
===Federal government=== The federal government comprises three branches, which are headquartered in Washington, D.C. and regulated by a system of [[separation of powers|checks and balances]] defined by the Constitution. * [[Legislature|Legislative]]: The [[United States Congress|bicameral Congress]], made up of the [[United States Senate|Senate]] and the [[United States House of Representatives|House of Representatives]], makes [[federal law]], [[declaration of war|declares war]], approves treaties, has the [[power of the purse]], and has the power of [[impeachment]], by which it can remove sitting members of the federal government. * [[Executive (government)|Executive]]: [[President of the United States|The president]] is the [[commander-in-chief]] of the military, can veto [[bill (law)|legislative bills]] before they become law (subject to congressional override), and appoints the [[Cabinet of the United States|members of the Cabinet]] (subject to Senate approval) and other officers, who administer and enforce federal laws and policies. * [[Judiciary|Judicial]]: The [[Supreme Court of the United States|Supreme Court]] and lower [[Federal judiciary of the United States|federal courts]], whose judges are appointed by the president with Senate approval, interpret laws and overturn those they find [[constitutionality|unconstitutional]]. The [[lower house]], the [[United States House of Representatives|House of Representatives]], has 435 voting members, each representing a [[congressional district]] for a two-year term. House seats are [[United States congressional apportionment|apportioned]] among the states by population. Each state then draws single-member districts to conform with the census apportionment. The District of Columbia and the five major U.S. territories each have [[Non-voting members of the United States House of Representatives|one member of Congress]]—these members are not allowed to vote. The [[upper house]], the [[United States Senate|Senate]], has 100 members with each state having two senators, elected [[at-large|at large]] to six-year terms; one-third of Senate seats are up for election every two years. The District of Columbia and the five major U.S. territories do not have senators. The Senate is unique among upper houses in being the most prestigious and powerful portion of the country's [[Bicameralism|bicameral system]]; political scientists have frequently labeled it the "most powerful upper house" of any government. The president serves a four-year term and may be elected to the office [[Term limits in the United States|no more than twice]]. The president is [[United States presidential election|not elected by direct vote]], but by an indirect [[Electoral College (United States)|electoral college]] system in which the determining votes are apportioned to the states and the District of Columbia. The Supreme Court, led by the [[Chief Justice of the United States|chief justice of the United States]], has nine members, who serve for life. ===Political subdivisions=== Each of the 50 states holds jurisdiction over a geographic territory, where it shares [[sovereignty]] with the federal government. They are subdivided into [[List of United States counties and county equivalents|counties or county equivalents]], and further divided into [[Municipality|municipalities]]. The District of Columbia is a [[federal district]] that contains the capital of the United States, the [[Washington, D.C.|city of Washington]].(a)(36) and (a)(38) U.S. 
Federal Code, Immigration and Nationality Act. Each state has a number of [[presidential electors]] equal to its combined number of representatives and senators in Congress, and the District of Columbia has three electors. Territories of the United States do not have presidential electors, and therefore people there cannot vote for the president. [[Citizenship of the United States|Citizenship is granted at birth in all states]], the District of Columbia, and all major U.S. territories except American Samoa. The United States observes limited [[Tribal sovereignty in the United States|tribal sovereignty]] of the American Indian nations, similar to that of the states. American Indians are U.S. citizens and tribal lands are subject to the jurisdiction of the U.S. Congress and the federal courts. Like the states, tribes have substantial autonomy but are also subject to restrictions: they are prohibited from making war, engaging in their own foreign relations, and printing or issuing independent currency. [[Indian reservation]]s are usually contained within one state, but there are 12 reservations that cross state boundaries. ===Foreign relations=== [[File:67º_Período_de_Sesiones_de_la_Asamblea_General_de_Naciones_Unidas_(8020913157).jpg|thumb|The [[Headquarters of the United Nations|United Nations headquarters]] has been situated along the [[East River]] in [[Midtown Manhattan]] since 1952. The United States is a founding member of the UN.|alt=see caption|left]] The United States has an established structure of foreign relations, and it had the world's second-largest diplomatic corps in 2019. It is a [[Permanent members of the United Nations Security Council|permanent member]] of the [[United Nations Security Council]], and home to the [[Headquarters of the United Nations|United Nations headquarters]]. The United States is also a member of the [[G7]], [[G-20 major economies|G20]], and [[OECD]] intergovernmental organizations. Almost all countries have [[List of diplomatic missions in the United States|embassies]] and many have [[consul (representative)|consulates]] (official representatives) in the country. Likewise, nearly all nations host formal [[diplomatic mission]]s with the United States, except [[Iran–United States relations|Iran]], [[North Korea–United States relations|North Korea]], and [[Foreign relations of Bhutan#Other countries|Bhutan]]. Though [[Taiwan–United States relations|Taiwan]] does not have formal diplomatic relations with the U.S., it maintains close, if unofficial, relations. The United States also regularly supplies Taiwan with [[Six Assurances|military equipment]]. The United States has a "[[Special Relationship]]" with the [[United Kingdom–United States relations|United Kingdom]] and strong ties with [[Canada–United States relations|Canada]], [[Australia–United States relations|Australia]], [[New Zealand–United States relations|New Zealand]], the [[Philippines]], [[Japan–United States relations|Japan]], [[South Korea–United States relations|South Korea]], [[Israel–United States relations|Israel]], and several [[European Union]] countries ([[France–United States relations|France]], [[Italy–United States relations|Italy]], [[Germany–United States relations|Germany]], [[Spain–United States relations|Spain]], and [[Poland–United States relations|Poland]]). The U.S.
works closely with its [[NATO]] allies on military and [[national security]] issues, and with nations in the Americas through the [[Organization of American States]] and the [[United States–Mexico–Canada Agreement]]. In [[South America]], [[Colombia]] is traditionally considered to be the closest ally of the United States. The U.S. exercises full international defense authority and responsibility for [[Federated States of Micronesia|Micronesia]], the [[Marshall Islands]], and [[Palau]] through the [[Compact of Free Association]]. The U.S. has become a key ally of [[Ukraine]] since [[Russia]] [[Annexation of Crimea by the Russian Federation|annexed Crimea]] in 2014 and began an [[2022 Russian invasion of Ukraine|invasion of Ukraine in 2022]], significantly deteriorating relations with Russia in the process. The U.S. has also experienced a deterioration of relations with [[China]] and grown closer to [[Taiwan]]. ===Military=== [[File:B-2 URL|thumb|[[Northrop Grumman B-2 Spirit|B-2 Spirit]], the [[Stealth technology|stealth]] [[Heavy bomber|heavy]] [[strategic bomber]] of the [[United States Air Force|USAF]]]] [[File:Aerial view of the Pentagon, Arlington, VA (38285035892).jpg|thumb|[[The Pentagon]], near Washington, D.C., is home to the [[U.S. Department of Defense]].]] The president is the [[Commander-in-Chief of the United States|commander-in-chief]] of the United States Armed Forces and appoints its leaders, the [[United States Secretary of Defense|secretary of defense]] and the [[Joint Chiefs of Staff]]. The [[United States Department of Defense|Department of Defense]], which is headquartered at [[the Pentagon]] near Washington, D.C., administers five of the six service branches, which are made up of the [[United States Army|Army]], [[United States Marine Corps|Marine Corps]], [[United States Navy|Navy]], [[United States Air Force|Air Force]], and [[United States Space Force|Space Force]]. The [[United States Coast Guard|Coast Guard]] is administered by the [[United States Department of Homeland Security|Department of Homeland Security]] in peacetime and can be transferred to the [[United States Department of the Navy|Department of the Navy]] in wartime. The United States spent $649 billion on its military in 2019, 36% of global military spending. At 4.7% of GDP, the percentage was the second-highest among all countries, after [[Saudi Arabia]]. It also has [[Nuclear weapons of the United States|more than 40% of the world's nuclear weapons]], the second-largest arsenal after Russia's. In 2019, all six branches of the U.S. Armed Forces reported 1.4 million personnel on active duty. The [[Reserve components of the United States Armed Forces|Reserves]] and [[National Guard of the United States|National Guard]] brought the total number of troops to 2.3 million. The Department of Defense also employed about 700,000 civilians, not including [[Military-industrial complex|contractors]]. Military service in the United States is voluntary, although [[Conscription in the United States|conscription]] may occur in wartime through the [[Selective Service System]]. The United States has the third-largest combined armed forces in the world, behind the [[People's Liberation Army|Chinese People's Liberation Army]] and [[Indian Armed Forces]].[[#IISS2020|IISS 2020]], pp.
46 Today, American forces can be rapidly deployed by the Air Force's large fleet of [[transport aircraft]], the Navy's 11 active [[aircraft carrier]]s, and [[Marine expeditionary unit]]s at sea with the Navy, and the Army's [[XVIII Airborne Corps]] and [[75th Ranger Regiment]] deployed by Air Force transport aircraft. The Air Force can strike targets across the globe through its fleet of [[strategic bomber]]s, maintains [[air defense]] across the United States, and provides [[close air support]] to Army and Marine Corps ground forces. The Space Force operates the [[Global Positioning System]], the [[Eastern Range|Eastern]] and [[Western Range (USSF)|Western Range]]s for all space launches, and the United States's [[United States Space Surveillance Network|Space Surveillance]] and [[United States national missile defense|Missile Warning]] networks. The military operates about 800 bases and facilities abroad, and maintains [[United States military deployments|deployments of more than 100 active-duty personnel]] in 25 foreign countries. ===Law enforcement and crime=== [[File:US incarceration URL|thumb|left|Total [[incarceration]] in the United States by year (1920–2014)|alt=Chart depicting a steep increase in the number of incarcerated Americans from the 1980s to the 2000s]] There are about 18,000 police agencies in the United States, operating at every level from local to federal. Law in the United States is mainly [[Law enforcement in the United States|enforced]] by local police departments and [[sheriff]]'s offices. The [[state police]] provide broader services, and [[Federal law enforcement in the United States|federal agencies]] such as the [[Federal Bureau of Investigation]] (FBI) and the [[United States Marshals Service|U.S. Marshals Service]] have specialized duties, such as protecting [[civil rights]] and [[National Security of the United States|national security]], and enforcing [[U.S. federal courts]]' rulings and federal laws. [[State court (United States)|State court]]s conduct most civil and criminal trials, and federal courts handle designated crimes and appeals from the state criminal courts. The United States has an [[List of countries by intentional homicide rate|intentional homicide rate]] of 7 per 100,000 people. A cross-sectional analysis of the [[World Health Organization]] Mortality Database from 2010 showed that United States homicide rates "were 7.0 times higher than in other high-income countries, driven by a gun homicide rate that was 25.2 times higher." The United States has the [[United States incarceration rate|sixth-highest documented incarceration rate]] and [[Incarceration in the United States|second-largest prison population]] in the world. In 2019, the total prison population for those sentenced to more than a year was 1,430,800, corresponding to a ratio of 419 per 100,000 residents, the lowest since 1995. Some estimates place that number higher, such as the [[Prison Policy Initiative]]'s estimate of 2.3 million. Various states have attempted to [[Decarceration in the United States|reduce their prison populations]] via government policies and grassroots initiatives. Although most nations have abolished [[capital punishment]], it is sanctioned in the United States for certain federal and military crimes, and in 27 of the 50 states and one territory. Several of these states have [[Moratorium (law)|moratoriums]] on carrying out the penalty, each imposed by the state's governor. Since 1977, there have been more than 1,500 executions, giving the U.S.
the sixth-highest number of executions in the world, following [[Capital punishment in China|China]], [[Capital punishment in Iran|Iran]], [[Capital punishment in Saudi Arabia|Saudi Arabia]], [[Capital punishment in Iraq|Iraq]], and [[Capital punishment in Egypt|Egypt]]. However, the number has trended down nationally, with [[Capital punishment in the United States#States without capital punishment|several states]] recently abolishing the penalty."DPIC adds Eleven cases to the Innocence List bringing national death-row exonerations to 185", ''[[Death Penalty Information Center]]'', Robert Durham, February 18, 2021. Retrieved November 9, 2021. ==Economy== [[File:US one dollar bill, obverse, series URL|thumb|alt=see caption|The [[United States dollar|U.S. dollar]] (featuring [[George Washington]]) is the currency most used in [[international trade|international transactions]] and is the world's foremost [[reserve currency]].]] [[File:Gaming-Wall-Street_BTS_Prodigium-URL|thumb|The [[New York Stock Exchange]] on [[Wall Street]], the world's largest stock exchange by [[market capitalization]] of its listed companies]] According to the [[International Monetary Fund]], the U.S. [[gross domestic product]] (GDP) of $22.7 trillion constitutes 24% of the [[gross world product]] at market exchange rates and over 16% of the gross world product at [[purchasing power parity]] (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks fifth in the world in [[List of countries by GDP (nominal) per capita|nominal GDP per capita]] and seventh in [[List of countries by GDP (PPP) per capita|GDP per capita at PPP]]. The country has been the world's largest economy since at least 1900. The United States is at or near the forefront of [[Science and technology in the United States|technological advancement]] and [[innovation]] in many economic sectors, especially in [[artificial intelligence]], [[computers]], [[pharmaceuticals]], and [[medical]], [[aerospace]], and [[military equipment]]. The nation's economy is fueled by abundant [[natural resource]]s, a well-developed infrastructure, and high productivity.Wright, Gavin, and Jesse Czelusta, "Resource-Based Growth Past and Present", in ''Natural Resources: Neither Curse Nor Destiny'', ed. Daniel Lederman and William Maloney (World Bank, 2007), p. 185. It has the second-highest total estimated value of natural resources, valued at [[United States dollar|US$]]44.98 trillion in 2019, although sources differ on their estimates. Americans have the highest average [[Household income|household]] and [[List of countries by average wage|employee]] income among [[OECD]] member states. In 2013, they had the sixth-highest [[median household income]], down from fourth-highest in 2010. The [[United States dollar|U.S. dollar]] is the currency most used in [[international trade|international transactions]] and is the world's foremost [[reserve currency]], backed by the country's economy, [[United States Armed Forces|its military]], the [[petrodollar|petrodollar system]], and its linked [[eurodollar]] and large [[U.S. Treasury|U.S. treasuries market]]. Several countries [[International use of the US dollar|use it as their official currency]] and in others it is the [[de facto currency|''de facto'' currency]].Benjamin J.
Cohen, ''The Future of Money'', Princeton University Press, 2006, ; ''cf.'' "the dollar is the de facto currency in Cambodia", Charles Agar, ''[[Frommer's]] Vietnam'', 2006, , p. 17 The [[New York Stock Exchange]] and [[Nasdaq]] are the world's [[List of stock exchanges|largest stock exchanges]] by [[market capitalization]] and [[trade volume]].[URL Table A – Market Capitalization of the World's Top Stock Exchanges (As at end of June 2012)]. Securities and Exchange Commission (China). The [[List of the largest trading partners of the United States|largest U.S. trading partners]] are [[China]], the [[European Union]], [[Canada]], [[Mexico]], [[India]], [[Japan]], [[South Korea]], the [[United Kingdom]], and [[Taiwan]]. The U.S. is the world's [[List of countries by imports|largest]] importer and the [[List of countries by exports|second-largest]] exporter. It has [[free trade agreements]] with [[United States free-trade agreements|several countries]], including the [[United States–Mexico–Canada Agreement|USMCA]]. The U.S. ranked second in the [[Global Competitiveness Report]] in 2019, after [[Singapore]]. Of the world's [[Fortune Global 500|500 largest companies]], 124 are headquartered in the U.S. Number of companies data taken from the "Country" filter. While its economy has reached a [[post-industrial society|post-industrial]] level of development, the United States remains an industrial power. It has a smaller [[welfare state]] and redistributes less income through government action than most other [[World Bank high-income economy|high-income]] countries. The United States ranked the 41st highest in [[income inequality]] among 156 countries in 2017, and the highest compared to the rest of the [[developed world]]. As of January 1, 2023, the United States had [[National debt of the United States|a national debt]] of $31.4 trillion. ===Income and poverty=== [[File:US Wealth Inequality - URL|thumb|left|[[Congressional Budget Office|CBO]] chart featuring U.S. family wealth between 1989 and 2013. The top 10% of families held 76% of the wealth in 2013 while the bottom 50% of families held 1%. Inequality increased from 1989 to 2013.]] At $46,625 USD in 2021, American citizens have the highest [[median income]] in the world. Despite the fact that they only account for 4.24% of the [[World population|global population]], they collectively [[List of countries by total wealth|possess 30.2%]] of the world's total wealth as of 2021, the largest percentage of any country. The U.S. also ranks first in the number of dollar [[billionaire]]s and [[millionaire]]s in the world, with 724 billionaires (as of 2021) and nearly 22 million millionaires (2021). [[Wealth in the United States]] is [[Wealth inequality in the United States|highly concentrated]]; the richest 10% of the adult population own 72% of the country's household wealth, while the bottom 50% own just 2%. [[Income inequality in the United States|Income inequality]] in the U.S. remains at record highs, with the top fifth of earners taking home more than half of all income and giving the U.S. one of the widest income distributions among OECD members. The United States is the only [[advanced economy]] that does not [[List of statutory minimum employment leave by country|guarantee its workers paid vacation]] and is one of a few countries in the world without [[paid family leave]] as a legal right. 
The United States also has a higher percentage of low-income workers than almost any other developed nation, largely because of a weak [[collective bargaining]] system and lack of government support for at-risk workers. There were about 567,715 sheltered and unsheltered [[Homelessness in the United States|homeless persons in the U.S.]] in January 2019, with almost two-thirds staying in an emergency shelter or transitional housing program. Attempts to combat homelessness include the [[Section 8 (housing)|Section 8]] housing voucher program and implementation of the [[Housing First]] strategy across all levels of government. In 2011, [[Hunger in the United States#Children|16.7 million children lived in food-insecure households]], about 35% more than 2007 levels, though only 845,000 U.S. children (1.1%) saw reduced food intake or disrupted eating patterns at some point during the year, and most cases were not chronic. 40 million people, roughly 12.7% of the U.S. population, were living in poverty, including 13.3 million children. Of those impoverished, 18.5 million live in "deep poverty", family income below one-half of the federal government's poverty threshold. ===Science, technology, and energy=== [[File:Buzz\_salutes\_the\_U.S.\_Flag-URL|thumb|U.S. astronaut [[Buzz Aldrin]] saluting the [[United States flag|flag]] on the [[lunar surface|Moon]] during the [[Apollo 11]], 1969. The United States is the only country that has sent [[Moon landing|manned missions to the lunar surface]].]] The United States has been a leader in technological [[innovation]] since the late 19th century and scientific research since the mid-20th century. Methods for producing [[interchangeable parts]] and the establishment of a [[machine tool]] industry enabled the [[American system of manufacturing|U.S. to have large-scale manufacturing]] of sewing machines, bicycles, and other items in the late 19th century. In the early 20th century, factory [[electrification]], the introduction of the [[assembly line]], and other labor-saving techniques created the system of [[mass production]]. In the 21st century, approximately two-thirds of research and development funding comes from the private sector. In 2020, the United States was the country with the [[List of countries by number of scientific and technical journal articles|second-highest]] number of published scientific papers and second most patents granted, both after China. In 2021, the United States launched a total of 51 [[spaceflights]]. (China reported 55.) The U.S. had 2,944 active [[satellites]] in space in December 2021, the highest number of any country. In 1876, [[Alexander Graham Bell]] was awarded the first U.S. [[Invention of the telephone|patent for the telephone]]. [[Thomas Edison]]'s [[Research institute|research laboratory]] developed the [[phonograph]], the first [[Incandescent light bulb|long-lasting light bulb]], and the first viable [[Kinetoscope|movie camera]]. The [[Wright brothers]] in 1903 made the [[Wright Flyer|first sustained and controlled heavier-than-air powered flight]], and the automobile companies of [[Ransom E. Olds]] and [[Henry Ford]] popularized the assembly line in the early 20th century. The rise of [[fascism]] and [[Nazism]] in the 1920s and 30s led many European scientists, such as [[Albert Einstein]], [[Enrico Fermi]], and [[John von Neumann]], to immigrate to the United States. During World War II, the Manhattan Project developed nuclear weapons, ushering in the [[Atomic Age]]. 
During the Cold War, competition for superior missile capability ushered in the [[Space Race]] between the U.S. and Soviet Union. The invention of the [[transistor]] in the 1950s, a key component in almost all modern [[electronics]], led to the development of [[microprocessor]]s, [[software]], [[personal computer]]s and the [[Internet]]. In 2022, the United States ranked 2nd in the [[Global Innovation Index]]. , the United States receives approximately 80% of its energy from fossil fuels. In 2019, the largest source of the country's energy came from [[petroleum]] (36.6%), followed by [[natural gas]] (32%), [[coal]] (11.4%), renewable sources (11.4%) and [[nuclear power]] (8.4%). Americans constitute less than 5% of the [[world population|world's population]], but consume 17% of the [[Energy use in the United States|world's energy]]. They account for about 25% of the world's [[Oil consumption|petroleum consumption]], while producing only 6% of the world's annual petroleum supply. The U.S. ranks as second-highest emitter of greenhouse gases, exceeded only by China. ===Transportation=== [[File:Bright\_Atlanta.jpg|thumb|The [[Downtown Connector]] in [[Atlanta, Georgia]], part of the [[Interstate Highway System]]]] The United States's [[Rail transport in the United States|rail network]], nearly all [[Standard-gauge railway|standard gauge]], is the [[List of countries by rail transport network size|longest in the world]], and exceeds . It handles mostly freight, with intercity passenger service provided by [[Amtrak]] to all but four states. The country's [[Inland waterways of the United States|inland waterway]]s are the world's [[List of countries by waterways length|fifth-longest]], and total . Personal transportation is dominated by automobiles, which operate on a network of of public roads. The United States has the world's second-largest automobile market, and has the highest vehicle ownership per capita in the world, with 816.4 vehicles per 1,000 Americans (2014). In 2017, there were 255 million non-two wheel motor vehicles, or about 910 vehicles per 1,000 people. The [[List of airlines of the United States|civil airline industry]] is entirely privately owned and has been largely [[Airline Deregulation Act|deregulated since 1978]], while [[List of airports in the United States|most major airports]] are publicly owned. The three largest airlines in the world by passengers carried are U.S.-based; [[American Airlines]] is number one after its 2013 acquisition by [[US Airways]]. Of the [[List of the world's busiest airports by passenger traffic|world's 50 busiest passenger airports]], 16 are in the United States, including the busiest, [[Hartsfield–Jackson Atlanta International Airport]]. Of the [[List of busiest container ports|fifty busiest container ports]], four are located in the United States, of which the busiest is the [[Port of Los Angeles]]. ==Demographics== ===Population=== The [[United States Census Bureau|U.S. Census Bureau]] reported 331,449,281 residents as of April 1, 2020, making the United States the [[List of countries and dependencies by population|third most populous]] nation in the world, after China and India. According to the Bureau's [[U.S. and World Population Clock|U.S. Population Clock]], on January 28, 2021, the U.S. population had a net gain of one person every 100 seconds, or about 864 people per day. In 2018, 52% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 32% had never been married. In 2020, the U.S. 
had a [[total fertility rate]] of 1.64 children per woman and the world's highest rate (23%) of children living in [[Single parents in the United States|single-parent]] households. The United States of America has a diverse population; 37 [[American ancestries|ancestry groups]] have more than one million members. [[Non-Hispanic whites|White Americans]], with ancestry from Europe, the Middle East, or North Africa, form the largest [[race (human classification)|racial]] and [[ethnic group]] at 57.8% of the United States population. [[Hispanic and Latino Americans]] form the second-largest group and are 18.7% of the United States population. [[African Americans]] constitute the nation's third-largest ancestry group and are 12.1% of the total United States population. [[Asian Americans]] are the country's fourth-largest group, composing 5.9% of the United States population, while the country's 3.7 million [[Native Americans in the United States|Native Americans]] account for about 1%. In 2020, the [[median age]] of the United States population was 38.5 years. In 2018, there were almost 90 million immigrants and [[Second-generation immigrants in the United States|U.S.-born children of immigrants]] in the United States, accounting for 28% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. The United States led the world in [[refugee resettlement]] for decades, admitting more refugees than the rest of the world combined. ===Language=== English (specifically, [[American English]]) is the de facto [[national language]] of the United States. Although there is no [[official language]] at the federal level, some laws—such as [[Naturalized citizen of the United States|U.S. naturalization requirements]]—standardize English, and most states have declared English their official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English, including Hawaii ([[Hawaiian language|Hawaiian]]), Alaska ([[Alaska Native languages|twenty Native languages]]), South Dakota ([[Sioux language|Sioux]]), American Samoa ([[Samoan language|Samoan]]), Puerto Rico ([[Spanish language|Spanish]]), Guam ([[Chamorro language|Chamorro]]), and the Northern Mariana Islands ([[Carolinian language|Carolinian]] and Chamorro). In Puerto Rico, Spanish is more widely spoken than English. According to the [[American Community Survey]], in 2010 some 229 million people (out of the total U.S. population of 308 million) spoke only English at home. More than 37 million spoke [[Spanish language in the United States|Spanish]] at home, making it the second most commonly used language in the United States. Other languages spoken at home by one million people or more include [[Chinese language|Chinese]] (2.8 million), [[Tagalog language|Tagalog]] (1.6 million), [[Vietnamese language|Vietnamese]] (1.4 million), [[French language|French]] (1.3 million), [[Korean language|Korean]] (1.1 million), and [[German language|German]] (1 million).
The [[List of most commonly learned foreign languages in the United States|most widely taught foreign languages]] in the United States, in terms of enrollment numbers from kindergarten through university [[undergraduate education]], are Spanish (around 7.2 million students), French (1.5 million), and [[German language in the United States|German]] (500,000). Other commonly taught languages include [[Latin]], [[Japanese language education in the United States|Japanese]], [[American Sign Language]], [[Italian language in the United States|Italian]], and [[Chinese language in the United States|Chinese]]. ===Religion=== A large variety of faiths have historically flourished within the country. According to the [[World Values Survey]] in 2017, the United States is more [[Secularity|secular]] than the median country; they ranked the United States the 32nd least religious country in the world. Until the 1990s, the country was a substantial [[outlier]] among other [[Developed country|highly developed]] countries: uniquely [[Wealth and religion|combining a high level of religiosity and wealth]], although this has lessened significantly since then. ''[[Gallup, Inc.|Gallup]]'' polls during the early 2020s found that about 81% of Americans believe in some conception of God, 45% report [[Prayer|praying]] on a daily basis, 41% report that religion plays a very important role in their lives, and 31% report attending religious services weekly or near weekly. According to ''[[Gallup, Inc.|Gallup]]'' in December 2022, 58% of Americans report "seldom" or "never" attending religious services. According to the ''Institute for Family Studies'' in 2022, around 28% of Americans attended at least once or twice a month''.'' In a 2020 survey, about 64% of adults in the United States identified themselves as [[Christianity in the United States|Christians]] making it the country with the [[Christianity by country|largest Christian population]]. [[Protestantism in the United States|Protestantism]] is the largest Christian religious grouping in the United States, accounting for around a third of all Americans. In the so-called [[Bible Belt]], located primarily within the [[Southern United States]], socially conservative [[evangelical Protestantism]] plays a significant role culturally. By contrast, religion plays the least important role in [[New England]] and the Western United States. Another 6% claimed a non-Christian faith; the largest of which are [[American Jews|Judaism]], [[Islam in the United States|Islam]], [[Hinduism in the United States|Hinduism]], and [[Buddhism in the United States|Buddhism]]. Around 30% of Americans describe themselves as having [[irreligion|no religion]]. Membership in a house of worship fell from 70% in 1999 to 47% in 2020, much of the decline related to the number of Americans expressing no religious preference. Membership also fell among those who identified with a specific religious group. According to ''Gallup'', trust in "the church or organized religion" has declined significantly since the 1970s. The [[First Amendment to the United States Constitution|First Amendment]] of the U.S. Constitution guarantees the [[Free Exercise Clause|free exercise]] of religion and forbids Congress from passing laws respecting its [[Establishment Clause|establishment]]. ===Urbanization=== About 82% of Americans live in [[United States urban area|urban areas]], including suburbs; about half of those reside in cities with populations over 50,000. 
In 2008, 273 [[List of United States cities by population|incorporated municipalities]] had populations over 100,000, nine cities had more than one million residents, and four cities ([[New York City]], [[Los Angeles]], [[Chicago]], and [[Houston]]) had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. ===Education=== [[File:URL|thumb|The [[University of Virginia]], founded by [[Thomas Jefferson]], is one of the many public colleges and universities in the United States.|alt=Photograph of the University of Virginia|right]] American [[state school|public education]] is operated by state and local governments and regulated by the [[United States Department of Education]] through restrictions on federal grants. In most states, children are required to attend school from the age of five or six (beginning with [[kindergarten]] or [[first grade]]) until they turn 18 (generally bringing them through [[twelfth grade]], the end of [[high school]]); some states allow students to leave school at 16 or 17. Of Americans 25 and older, 84.6% graduated from high school, 52.6% attended some college, 27.2% earned a [[bachelor's degree]], and 9.6% earned graduate degrees. The basic [[literacy]] rate is approximately 99%.For more detail on U.S. literacy, see [URL A First Look at the Literacy of America's Adults in the 21st century], U.S. Department of Education (2003). The United States has many private and public [[Lists of American institutions of higher education|institutions of higher education]]. The majority of the world's top [[Public university|public]] and [[Private university|private universities]], as listed by various ranking organizations, are in the United States. There are also local [[community college]]s with generally more open admission policies, shorter academic programs, and lower tuition. The U.S. spends more on education per student than any nation in the world, spending an average of $12,794 per year on public elementary and secondary school students in the 2016–2017 school year. As for [[public expenditure]]s on higher education, the U.S. spends more per student than the [[Organisation for Economic Co-operation and Development|OECD]] average, and more than all nations in combined public and private spending. Despite some student [[loan forgiveness]] programs in place, [[Student debt|student loan debt]] has increased by 102% in the last decade, and exceeded 1.7 trillion dollars as of 2022. ===Health=== [[File:Texas medical URL|thumb|left|The [[Texas Medical Center]] in downtown [[Houston]] is the largest medical complex in the world. |alt=The Texas Medical Center, a cluster of contemporary skyscrapers, at night]] In a preliminary report, the [[Centers for Disease Control and Prevention]] (CDC) announced that U.S. [[life expectancy]] at birth had dropped to 76.4 years in 2021 (73.2 years for men and 79.1 years for women), down 0.9 years from 2020. This was the second year of overall decline, and the chief causes listed were the [[COVID-19]] pandemic, accidents, drug overdoses, heart and liver disease, and suicides. Life expectancy was highest among Asians and Hispanics and lowest among Blacks and American Indian–Alaskan Native ([[AIAN (U.S. Census)|AIAN]]) peoples. Starting in 1998, the average life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The U.S. 
also has one of the highest [[Suicide in the United States|suicide]] rates among [[high-income countries]], and approximately one-third of the U.S. adult population is obese and another third is overweight. In 2010, [[coronary artery disease]], [[lung cancer]], [[stroke]], [[chronic obstructive pulmonary disease]]s, and traffic collisions caused the most years of life lost in the U.S. [[Low back pain]], [[major depressive disorder|depression]], [[musculoskeletal disorder]]s, [[neck pain]], and [[anxiety]] caused the most years lost to disability. The most harmful [[risk factor]]s were poor diet, [[tobacco smoking]], obesity, [[Hypertension|high blood pressure]], [[Hyperglycemia|high blood sugar]], [[physical inactivity]], and [[Alcohol consumption and health|alcohol consumption]]. [[Alzheimer's disease]], [[substance use disorder]]s, [[kidney disease]], [[cancer]], and falls caused the most additional years of life lost over their age-adjusted 1990 per-capita rates. [[Teenage pregnancy in the United States|Teenage pregnancy]] and [[Abortion in the United States|abortion]] rates in the U.S. are substantially higher than in other Western nations, especially among blacks and Hispanics. The U.S. health care system far [[List of countries by total health expenditure (PPP) per capita|outspends]] that of any other nation, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer nations. The United States is the only developed nation [[Healthcare reform in the United States|without a system of universal health care]], and a [[Health insurance coverage in the United States|significant proportion of the population does not carry health insurance]]. The U.S., however, is a global leader in medical innovation, measured either in terms of revenue or the number of new drugs and devices introduced. Stats from 2007 URL URL Assoc. Retrieved June 17, 2009, from [http://212.3.246.100/Objects/2/Files/URL] Government-funded health care coverage for the poor ([[Medicaid]], established in 1965) and for those age 65 and older ([[Medicare (United States)|Medicare]], begun in 1966) is available to Americans who meet the programs' income or age qualifications. In 2010, President Obama signed the [[Patient Protection and Affordable Care Act]] (ACA) into law; according to the CDC, the law roughly halved the uninsured share of the population, and multiple studies have concluded that the ACA reduced the mortality of enrollees. However, its legacy [[Criticism of Obamacare|remains controversial]]. ==Culture and society== [[File:URL|thumb|upright|The [[Statue of Liberty]] (''Liberty Enlightening the World''), a gift from [[France]], has become an iconic symbol of the [[American Dream]].|alt=The Statue of Liberty, a large teal bronze sculpture on a stone pedestal]] Americans have traditionally [[Stereotypes of Americans|been characterized]] by a strong [[work ethic]], competitiveness, and [[individualism]], as well as a unifying belief in an "[[American civil religion|American creed]]" emphasizing liberty, [[social equality]], [[property rights]], democracy, [[equality under the law]], and a preference for [[limited government]]. See also [[American's Creed]], written by [[William Tyler Page]] and adopted by Congress in 1918.
Americans are extremely charitable by global standards: according to a 2016 study by the [[Charities Aid Foundation]], Americans donated 1.44% of total GDP to charity, the [[List of countries by charitable donation|highest]] in the world by a large margin. The United States is home to a [[Multiculturalism|wide variety]] of ethnic groups, traditions, and values, and exerts major cultural influence [[Americanization|on a global scale]].[[United States#BBC18may|BBC, April 2008: Country Profile: United States of America]] The country has been described as a society "built on a [[Moral universalism|universalistic cultural frame]] rooted in the natural laws of science and human rights." The [[United States Declaration of Independence|Declaration of Independence]] has become a well-known statement on [[human rights]], particularly its second sentence: "We hold these truths to be self-evident, that [[all men are created equal]], that they are endowed by [[Higher Power|their Creator]] with certain unalienable Rights, that among these are [[Life, Liberty and the pursuit of Happiness]]." Stephen Lucas called it "one of the best-known sentences in the English language", with historian [[Joseph Ellis]] writing that the document contains "the most potent and consequential words in American history". The passage has since come to represent a moral standard toward which the United States should strive. This view was notably promoted by Lincoln, who considered it to be the foundation of his political philosophy and argued that it is a statement of principles through which the Constitution should be interpreted. Aside from the Native American, [[Native Hawaiians|Native Hawaiian]], and [[Alaska Natives|Native Alaskan]] populations, nearly all Americans or their ancestors immigrated or were imported as slaves within the past five centuries. [[wikt:mainstream|Mainstream]] American culture is a [[Western culture]] largely derived from the [[European American|traditions of European immigrants]] with influences from many other sources, such as [[African-American culture|traditions brought by slaves from Africa]]. More recent immigration from [[Asian American|Asia]] and especially [[Latin American culture|Latin America]] has added to a cultural mix that has been described as both a homogenizing [[melting pot]] and a heterogeneous [[salad bowl (cultural idea)|salad bowl]], with immigrants contributing to, and often [[Cultural assimilation|assimilating]] into, mainstream American culture. Nevertheless, there is a high degree of social inequality related to [[Racial inequality in the United States|race]] and [[Wealth inequality in the United States|wealth]]. The [[American Dream]], or the perception that Americans enjoy high [[Socio-economic mobility in the United States|social mobility]], plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a [[classless society]], scholars identify significant differences between [[Social class in the United States|the country's social classes]], affecting [[socialization]], language, and values. Americans tend to greatly value [[socioeconomics|socioeconomic]] achievement, but being [[Average Joe|ordinary or average]] is promoted by some as a noble condition.
The United States has uniquely broad protections for [[Freedom of speech|free speech]] under the [[First Amendment to the United States Constitution|First Amendment]], with no exceptions for speech that is commonly proscribed in other liberal democracies such as [[Blasphemy law|blasphemy]], [[Hate speech in the United States|hate speech]], and [[Lèse-majesté|lese-majesty]]. A 2016 [[Pew Research Center]] poll found that Americans were the most supportive of free expression of any polity measured. They were also found to be the "most supportive of [[freedom of the press]] and the [[Right to Internet access|right to use the internet]] without government censorship." It is a [[Cultural liberalism|socially progressive]] country with [[Permissive society|permissive]] attitudes surrounding [[human sexuality]]. [[LGBT rights in the United States|LGBT rights]] are among the most advanced in the world, with [[Public opinion of same-sex marriage in the United States|public opinion]] and [[jurisprudence]] on the issue changing significantly since the late 1980s. A late 2022 ''Grinnell College'' poll found that 74% of Americans agreed that same-sex marriage should be a guaranteed right while 13% disagreed. In 2015, the Supreme Court [[Obergefell v. Hodges|ruled that]] same-sex marriage bans were unconstitutional. ===Literature and visual arts=== [[File:Mark Twain by AF URL|thumb|left|145px|upright|[[Mark Twain]], American author and [[humorist]]|alt=Photograph of Mark Twain]] In the 18th and early 19th centuries, American art and literature took most of their cues from Europe, contributing to Western culture. Writers such as [[Washington Irving]], [[Nathaniel Hawthorne]], [[Edgar Allan Poe]], and [[Henry David Thoreau]] established a distinctive American literary voice by the middle of the 19th century. [[Mark Twain]] and poet [[Walt Whitman]] were major figures in the century's second half; [[Emily Dickinson]], virtually unknown during her lifetime, is recognized as an essential American poet. A work seen as capturing fundamental aspects of the national experience and character—such as [[Herman Melville]]'s ''[[Moby-Dick]]'' (1851), Twain's ''[[Adventures of Huckleberry Finn|The Adventures of Huckleberry Finn]]'' (1885), [[F. Scott Fitzgerald]]'s ''[[The Great Gatsby]]'' (1925) and [[Harper Lee]]'s ''[[To Kill a Mockingbird]]'' (1960)—may be dubbed the "[[Great American Novel]]." Thirteen U.S. citizens have won the [[Nobel Prize in Literature]]. [[William Faulkner]], [[Ernest Hemingway]] and [[John Steinbeck]] are often named among the most influential writers of the 20th century. The [[Beat Generation]] writers opened up new literary approaches, as have [[postmodern literature|postmodernist]] authors such as [[John Barth]], [[Thomas Pynchon]], and [[Don DeLillo]]. In the visual arts, the [[Hudson River School]] was a mid-19th-century movement in the tradition of European [[Realism (arts)|naturalism]]. The 1913 [[Armory Show]] in New York City, an exhibition of European [[modern art|modernist art]], shocked the public and transformed the U.S. art scene. [[Georgia O'Keeffe]], [[Marsden Hartley]], and others experimented with new, individualistic styles. Major artistic movements such as the [[abstract expressionism]] of [[Jackson Pollock]] and [[Willem de Kooning]] and the [[pop art]] of [[Andy Warhol]] and [[Roy Lichtenstein]] developed largely in the United States. 
The tide of modernism and then [[postmodernism]] has brought fame to American architects such as [[Frank Lloyd Wright]], [[Philip Johnson]], and [[Frank Gehry]]. Americans have long been important in the modern artistic medium of [[photography]], with major photographers including [[Alfred Stieglitz]], [[Edward Steichen]], [[Edward Weston]], and [[Ansel Adams]]. ===Cinema and theater=== [[File:Hollywood Sign (Zuschnitt).jpg|thumb|The [[Hollywood Sign]] in [[Los Angeles]], California|alt=The Hollywood Sign, large white block letters on a hillside]] [[Hollywood, Los Angeles|Hollywood]], a northern district of Los Angeles, California, is the leader in motion picture production and the most recognizable movie industry in the world. The [[major film studios]] of the United States are the primary source of the [[List of highest grossing films|most commercially successful]] and most ticket selling movies in the world. The world's first commercial motion picture exhibition was given in New York City in 1894, using the [[Kinetoscope]]. Since the early 20th century, the U.S. film industry has largely been based in and around Hollywood, although in the 21st century an increasing number of films are not made there, and film companies have been subject to the forces of globalization. The [[Academy Awards]], popularly known as the Oscars, have been held annually by the [[Academy of Motion Picture Arts and Sciences]] since 1929, and the [[Golden Globe Award]]s have been held annually since January 1944. Director [[D. W. Griffith]], an American [[filmmaker]] during the [[silent film]] period, was central to the development of [[film grammar]], and producer/entrepreneur [[Walt Disney]] was a leader in both [[animation|animated film]] and movie [[merchandising]]. Directors such as [[John Ford]] redefined the image of the American Old West, and, like others such as [[John Huston]], broadened the possibilities of cinema with location shooting. The industry enjoyed its golden years, in what is commonly referred to as the "[[Classical Hollywood cinema|Golden Age of Hollywood]]", from the early sound period until the early 1960s, with screen actors such as [[John Wayne]] and [[Marilyn Monroe]] becoming iconic figures. In the 1970s, "[[New Hollywood]]" or the "Hollywood Renaissance" was defined by grittier films influenced by French and Italian realist pictures of the [[Aftermath of World War II|post-war period]]. Theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the [[Theatre of the United Kingdom|British theater]]. The central hub of the American theater scene has been [[Manhattan]], with its divisions of [[Broadway theatre|Broadway]], [[off-Broadway]], and [[off-off-Broadway]]. Many movie and television stars have gotten their big break working in New York productions. Outside New York City, many cities have professional [[Regional theater in the United States|regional or resident theater companies]] that produce their own seasons, with some works being produced regionally with hopes of eventually moving to New York. The biggest-budget theatrical productions are [[musical theatre|musicals]]. U.S. theater also has an active [[community theater]] culture, which relies mainly on local volunteers who may not be actively pursuing a theatrical career.Stephen Watt, and Gary A. Richardson, ''American Drama: Colonial to Contemporary'' (1994). 
===Music=== [[File:Country_music_hall_of_fame2.jpg|thumb|The [[Country Music Hall of Fame and Museum]] in [[Nashville, Tennessee]]]] [[American folk music]] encompasses numerous music genres, variously known as traditional music, traditional [[folk music]], contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the [[British Isles]], [[Mainland Europe]], or Africa.[URL "Folk Music and Song", American Folklife Center, Library of Congress] Among America's earliest composers was [[William Billings]], who was born in Boston and composed patriotic hymns in the 1770s; Billings was a part of the [[Yankee tunesmiths|First New England School]], which dominated American music during its earliest stages. [[Anthony Heinrich]] was the most prominent composer before the Civil War. From the mid- to late 1800s, [[John Philip Sousa]] of the late [[Romantic music|Romantic era]] composed numerous military songs—[[List of marches by John Philip Sousa|particularly marches]]—and is regarded as one of America's greatest composers. The rhythmic and lyrical styles of [[African-American music]] have significantly influenced [[Music of the United States|American music]] at large, distinguishing it from European and African traditions. Elements from folk idioms such as the [[blues]] and what is known as [[old-time music]] were adopted and transformed into [[popular music|popular genres]] with global audiences. [[Jazz]] was developed by innovators such as [[Louis Armstrong]] and [[Duke Ellington]] early in the 20th century. [[Country music]] developed in the 1920s, and [[rhythm and blues]] in the 1940s. [[Elvis Presley]] and [[Chuck Berry]] were among the pioneers of [[rock and roll]] in the mid-1950s. Rock bands such as [[Metallica]], the [[Eagles (band)|Eagles]], and [[Aerosmith]] are among the [[List of best-selling music artists|highest grossing]] in worldwide sales. In the 1960s, [[Bob Dylan]] emerged from the [[American folk music revival|folk revival]] to become one of America's most celebrated songwriters. Mid-20th-century American pop stars such as [[Bing Crosby]], [[Frank Sinatra]], and Elvis Presley became global celebrities, as have artists of the late 20th century such as [[Prince (musician)|Prince]], [[Michael Jackson]], [[Madonna]], [[Whitney Houston]], and [[Mariah Carey]]. ===Mass media=== [[File:Comcast_Philly.JPG|upright|thumb|left|The [[Comcast Center]] in [[Philadelphia]], headquarters of the nation's largest [[Multinational corporation|multinational]] [[telecommunications]] [[conglomerate (company)|conglomerate]]]] The four major broadcasters in the U.S. are the [[National Broadcasting Company]] (NBC), [[Columbia Broadcasting System]] (CBS), [[American Broadcasting Company]] (ABC), and [[Fox Broadcasting Company]] (FOX). The four major broadcast [[television network]]s are all commercial entities. [[Cable television in the United States|Cable television]] offers hundreds of channels catering to a variety of niches. About 83% of Americans over age 12 listen to [[radio broadcasting|broadcast radio]], while about 41% listen to [[podcast]]s. There are 15,433 licensed full-power radio stations in the U.S. according to the U.S. [[Federal Communications Commission]] (FCC). Much of the public radio broadcasting is supplied by [[NPR]], incorporated in February 1970 under the [[Public Broadcasting Act of 1967]]. Well-known U.S.
newspapers include ''[[The Wall Street Journal]]'', ''[[The New York Times]]'', and ''[[USA Today]]''. More than 800 publications are produced in Spanish, the second most commonly used language in the United States behind English. With very few exceptions, all the newspapers in the U.S. are privately owned, either by large chains such as [[Gannett Company|Gannett]] or [[The McClatchy Company|McClatchy]], which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in a situation that is increasingly rare, by individuals or families. Major cities often have [[alternative newspaper]]s to complement the mainstream daily papers, such as New York City's ''[[The Village Voice]]'' or Los Angeles' ''[[LA Weekly]]''. The five most popular websites used in the U.S. are [[Google]], [[YouTube]], [[Amazon (company)|Amazon]], [[Yahoo]], and [[Facebook]]. The [[Video games in the United States|American video game industry]] is the world's 2nd largest by revenue. It generated $90 billion in annual economic output in 2020. Furthermore, the video game industry contributed $12.6 billion in federal, state, and municipal taxes annually. Some of the largest video game companies like [[Activision Blizzard]], [[Xbox]], [[Sony Interactive Entertainment]], [[Rockstar Games]], and [[Electronic Arts]] are based in the United States. Some of the most popular and best selling video games like [[The Elder Scrolls V: Skyrim]], [[Call of Duty: Modern Warfare (2019 video game)|Call of Duty: Modern Warfare]] and [[Diablo III]] are made by American [[Video game developer|developers]]. The American video gaming business is still a significant employer. More than 143,000 individuals are employed directly and indirectly by video game companies throughout 50 states. The national compensation for direct workers is $2.9 billion, or an average wage of $121,000. ===Food=== [[File:Roast\_turkey.jpg|thumb|[[Turkey as food|Roasted turkey]] is a traditional [[Thanksgiving (United States)|Thanksgiving]] dinner dish and is usually the main entree.|alt=A roasted turkey]] Early settlers were introduced by Native Americans to such indigenous, non-European foods as turkey, [[sweet potato]]es, [[maize|corn]], [[Cucurbita|squash]], and [[maple syrup]]. They and later immigrants combined these with foods they had known, such as [[wheat flour]], beef, and milk to create a distinctive American cuisine. Homegrown foods are part of a shared national menu on one of America's most popular holidays, [[Thanksgiving (United States)|Thanksgiving]], when many Americans make or purchase traditional foods to celebrate the occasion. The American [[fast food]] industry, the world's largest, pioneered the [[drive-through]] format in the 1940s. Characteristic American dishes such as [[apple pie]], [[fried chicken]], [[doughnuts]], [[french fries]], [[macaroni and cheese]], [[ice cream]], [[Pizza in the United States|pizza]], [[hamburgers]], and [[hot dogs]] derive from the recipes of various immigrants. [[Mexican cuisine|Mexican]] dishes such as [[burritos]] and [[tacos]] and [[pasta]] dishes freely adapted from [[Italian cuisine|Italian]] sources are widely consumed. Americans drink three times as much coffee as tea. Marketing by U.S. industries is largely responsible for making orange juice and milk standard [[List of breakfast drinks|breakfast beverages]].[[#Smith2004|Smith, 2004]], pp. 131–132[[#Levenstein|Levenstein, 2003]], pp. 154–155 ===Sports=== [[File:WFT vs. 
Cowboys (51751852586).jpg|thumb|[[American football]] is the most popular sport in the United States.]] The most popular sports in the U.S. are [[American football]], [[basketball]], [[baseball]] and [[ice hockey]]. While most major U.S. sports such as [[baseball]] and [[American football]] have evolved out of European practices, [[basketball]], [[volleyball]], [[skateboarding]], and [[snowboarding]] are American inventions, some of which have become popular worldwide. [[Lacrosse]] and [[surfing]] arose from Native American and Native Hawaiian activities that predate European contact. Liss, Howard. ''Lacrosse'' (Funk & Wagnalls, 1970) pg 13. The market for [[professional sports]] in the United States is roughly $69 billion, about 50% larger than that of all of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States (MacCambridge, Michael (2004). ''America's Game: The Epic Story of How Pro Football Captured a Nation''. New York: Random House); the [[National Football League]] (NFL) has the highest average attendance of any sports league in the world, and the [[Super Bowl]] is watched by tens of millions globally. Baseball has been regarded as the U.S. [[national sport]] since the late 19th century, with [[Major League Baseball]] being the top league. Basketball and [[ice hockey]] are the country's next two most popular professional team sports, with the top leagues being the [[National Basketball Association]] and the [[National Hockey League]]. The most-watched [[individual sport]]s in the U.S. are [[golf]] and [[auto racing]], particularly [[NASCAR]] and [[IndyCar]]. Eight [[Olympic Games]] have taken place in the United States. The [[1904 Summer Olympics]] in [[St. Louis]], [[Missouri]], were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when [[Los Angeles]] hosts the [[2028 Summer Olympics]]. The United States has won 2,629 medals at the [[Summer Olympic Games]], more than any other country, and 330 in the [[Winter Olympic Games]], the second most behind Norway. In [[Association football|soccer]], the [[United States men's national soccer team|men's national soccer team]] has qualified for [[United States at the FIFA World Cup|eleven World Cups]] and the [[United States women's national soccer team|women's team]] has [[United States at the FIFA Women's World Cup|won]] the [[FIFA Women's World Cup]] four times. The United States hosted the [[1994 FIFA World Cup]] and will host the [[2026 FIFA World Cup]] along with [[Canada]] and [[Mexico]]. On the [[collegiate athletics|collegiate]] level, earnings for the member institutions exceed $1 billion annually,[URL Sports Illustrated: NCAA Reports $1.1 Billion in Revenues] and [[college football]] and [[College basketball|basketball]] attract large audiences, as the [[NCAA Division I men's basketball tournament|NCAA Final Four]] is one of the most watched sporting events. ==See also== * [[Index of United States–related articles]] * [[Lists of U.S. state topics]] * [[Outline of the United States]] ==Notes== ==References== ==Further reading== '''Internet sources''' ==External links== * [URL United States]. ''[[The World Factbook]]''. [[Central Intelligence Agency]].
* [URL United States] from the [[BBC News]] * [URL Key Development Forecasts for the United States] from [[International Futures]] ; Government * [URL Official U.S. Government Web Portal] Gateway to government sites * [URL House] Official site of the United States House of Representatives * [URL Senate] Official site of the United States Senate * [URL White House] Official site of the president of the United States * [ Supreme Court] Official site of the Supreme Court of the United States ; History * [URL/URL Historical Documents] Collected by the National Center for Public Policy Research * [URL U.S. National Mottos: History and Constitutionality] Analysis by the Ontario Consultants on Religious Tolerance * [URL USA] Collected links to historical data ; Maps * [URL/URL National Atlas of the United States] Official maps from the U.S. Department of the Interior * [URL Measure of America] A variety of mapped information relating to health, education, income, and demographics for the U.S. ; Photos * [URL Photos of the USA] [[Category:United States]] [[Category:Countries in North America]] [[Category:English-speaking countries and territories]] [[Category:Federal constitutional republics]] [[Category:Former British colonies and protectorates in the Americas]] [[Category:Former confederations]] [[Category:G20 nations]] [[Category:Member states of NATO]] [[Category:Member states of the United Nations]] [[Category:States and territories established in 1776]] [[Category:Transcontinental countries]]
[]
[ "TAGS\n#size_categories-1M<n<10M #language-English #region-us \n" ]
893cc3ae74bdc357b64da9f170783afe79b7f1da
# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-v2-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T00:22:47.991764](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base/blob/main/results_2024-01-19T00-22-47.991764.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6277587211591188, "acc_stderr": 0.032809227406765666, "acc_norm": 0.6311375541821904, "acc_norm_stderr": 0.03346155242000715, "mc1": 0.5042839657282742, "mc1_stderr": 0.01750285857737127, "mc2": 0.6680405227339233, "mc2_stderr": 0.01518478311873851 }, "harness|arc:challenge|25": { "acc": 0.6237201365187713, "acc_stderr": 0.014157022555407163, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.013752062419817834 }, "harness|hellaswag|10": { "acc": 0.6707827126070504, "acc_stderr": 0.004689685978155166, "acc_norm": 0.8569010157339175, "acc_norm_stderr": 0.0034945810763985373 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411018, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411018 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.046306532033665956, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.046306532033665956 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.02672949906834996, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.02672949906834996 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5974358974358974, "acc_stderr": 0.02486499515976775, "acc_norm": 0.5974358974358974, "acc_norm_stderr": 0.02486499515976775 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.818348623853211, "acc_stderr": 0.016530617409266847, "acc_norm": 0.818348623853211, "acc_norm_stderr": 0.016530617409266847 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967408, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967408 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389087, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389087 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.035208939510976534, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.035208939510976534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913915, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913915 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.708092485549133, "acc_stderr": 0.024476994076247333, "acc_norm": 0.708092485549133, "acc_norm_stderr": 0.024476994076247333 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43687150837988825, "acc_stderr": 0.01658868086453063, "acc_norm": 0.43687150837988825, "acc_norm_stderr": 0.01658868086453063 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.696078431372549, "acc_stderr": 0.026336613469046623, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.026336613469046623 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.02608270069539966, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.02608270069539966 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4471968709256845, "acc_stderr": 0.012698825252435108, "acc_norm": 0.4471968709256845, "acc_norm_stderr": 0.012698825252435108 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6544117647058824, "acc_stderr": 0.028888193103988633, "acc_norm": 0.6544117647058824, "acc_norm_stderr": 0.028888193103988633 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6766169154228856, "acc_stderr": 0.03307615947979033, "acc_norm": 0.6766169154228856, "acc_norm_stderr": 0.03307615947979033 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.5042839657282742, "mc1_stderr": 0.01750285857737127, "mc2": 0.6680405227339233, "mc2_stderr": 0.01518478311873851 }, "harness|winogrande|5": { "acc": 0.7734806629834254, "acc_stderr": 0.011764149054698336 }, "harness|gsm8k|5": { "acc": 0.5140257771038665, "acc_stderr": 0.013767064940239285 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base
[ "region:us" ]
2024-01-19T00:25:02+00:00
{"pretty_name": "Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-v2-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T00:22:47.991764](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base/blob/main/results_2024-01-19T00-22-47.991764.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6277587211591188,\n \"acc_stderr\": 0.032809227406765666,\n \"acc_norm\": 0.6311375541821904,\n \"acc_norm_stderr\": 0.03346155242000715,\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.01750285857737127,\n \"mc2\": 0.6680405227339233,\n \"mc2_stderr\": 0.01518478311873851\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817834\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n \"acc_stderr\": 0.004689685978155166,\n \"acc_norm\": 0.8569010157339175,\n \"acc_norm_stderr\": 0.0034945810763985373\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266847,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266847\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913915,\n 
\"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046623,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046623\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n \"acc_stderr\": 0.012698825252435108,\n \"acc_norm\": 0.4471968709256845,\n \"acc_norm_stderr\": 0.012698825252435108\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.01750285857737127,\n \"mc2\": 0.6680405227339233,\n \"mc2_stderr\": 0.01518478311873851\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698336\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5140257771038665,\n \"acc_stderr\": 0.013767064940239285\n }\n}\n```", "repo_url": "https://huggingface.co/LoSboccacc/orthogonal-2x7B-v2-base", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|arc:challenge|25_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|gsm8k|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hellaswag|10_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-22-47.991764.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["**/details_harness|winogrande|5_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T00-22-47.991764.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T00_22_47.991764", "path": ["results_2024-01-19T00-22-47.991764.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T00-22-47.991764.parquet"]}]}]}
2024-01-19T00:25:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base Dataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-v2-base on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T00:22:47.991764 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
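The load call referenced in the card text above, as a minimal sketch; the repository id is assumed from the same `details_<org>__<model>` naming pattern used by the other leaderboard detail datasets in this collection:

```python
from datasets import load_dataset

# Assumed repo id; the config name is one of the harness tasks listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-v2-base",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points to the latest results
)
print(data)
```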
[ "# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base\n\n\n\nDataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-v2-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T00:22:47.991764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-v2-base\n\n\n\nDataset automatically created during the evaluation run of model LoSboccacc/orthogonal-2x7B-v2-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T00:22:47.991764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4634c72f37faa4737cffbb0494c50f916df9fc31
# Dataset Card for Evaluation run of NeuralNovel/Tiger-7b-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NeuralNovel/Tiger-7b-v0.1](https://huggingface.co/NeuralNovel/Tiger-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T00:30:17.528076](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1/blob/main/results_2024-01-19T00-30-17.528076.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6140276670516231, "acc_stderr": 0.033170625938141664, "acc_norm": 0.6176750246888549, "acc_norm_stderr": 0.03384254890386933, "mc1": 0.44920440636474906, "mc1_stderr": 0.017412941986115305, "mc2": 0.6103468565333238, "mc2_stderr": 0.015326695061753768 }, "harness|arc:challenge|25": { "acc": 0.5588737201365188, "acc_stderr": 0.014509747749064663, "acc_norm": 0.5998293515358362, "acc_norm_stderr": 0.014317197787809169 }, "harness|hellaswag|10": { "acc": 0.6415056761601274, "acc_stderr": 0.004785781979354866, "acc_norm": 0.832105158334993, "acc_norm_stderr": 0.0037300899105375805 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956913, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956913 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.0373362665538351, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.0373362665538351 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3835978835978836, "acc_stderr": 0.0250437573185202, "acc_norm": 0.3835978835978836, "acc_norm_stderr": 0.0250437573185202 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6580645161290323, "acc_stderr": 0.026985289576552732, "acc_norm": 0.6580645161290323, "acc_norm_stderr": 0.026985289576552732 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5871794871794872, "acc_stderr": 0.024962683564331796, "acc_norm": 0.5871794871794872, "acc_norm_stderr": 0.024962683564331796 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.02911661760608301, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.02911661760608301 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.01665927970029582, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.01665927970029582 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967408, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967408 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.027652153144159263, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.027652153144159263 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.03219079200419996, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.03219079200419996 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6717557251908397, "acc_stderr": 0.041184385658062976, "acc_norm": 0.6717557251908397, "acc_norm_stderr": 0.041184385658062976 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097654, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097654 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7055214723926381, "acc_stderr": 0.03581165790474082, "acc_norm": 0.7055214723926381, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.02220930907316561, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.02220930907316561 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7994891443167306, "acc_stderr": 0.014317653708594202, "acc_norm": 0.7994891443167306, "acc_norm_stderr": 0.014317653708594202 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647886, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647886 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.441340782122905, "acc_stderr": 0.016607021781050876, "acc_norm": 0.441340782122905, "acc_norm_stderr": 0.016607021781050876 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.02623696588115327, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.02623696588115327 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6728395061728395, "acc_stderr": 0.026105673861409828, "acc_norm": 0.6728395061728395, "acc_norm_stderr": 0.026105673861409828 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.45390070921985815, "acc_stderr": 0.029700453247291463, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.029700453247291463 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4276401564537158, "acc_stderr": 0.012635799922765846, "acc_norm": 0.4276401564537158, "acc_norm_stderr": 0.012635799922765846 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.029349803139765873, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.029349803139765873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6356209150326797, "acc_stderr": 0.019469518221573695, "acc_norm": 0.6356209150326797, "acc_norm_stderr": 0.019469518221573695 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.0289205832206756, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7661691542288557, "acc_stderr": 0.029929415408348384, "acc_norm": 0.7661691542288557, "acc_norm_stderr": 0.029929415408348384 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333047, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.44920440636474906, "mc1_stderr": 0.017412941986115305, "mc2": 0.6103468565333238, "mc2_stderr": 0.015326695061753768 }, "harness|winogrande|5": { "acc": 0.77663772691397, "acc_stderr": 0.01170569756520521 }, "harness|gsm8k|5": { "acc": 0.467778620166793, "acc_stderr": 0.013743857303073802 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
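The card above describes an additional "results" configuration holding the aggregated metrics for the run. A minimal sketch of reading it back, using the repository id from the card's own snippet; the column layout of the aggregated parquet is not spelled out here, so the example only inspects what is stored:

```python
from datasets import load_dataset

# Repo id and the "results" config name are taken from the card above.
results = load_dataset(
    "open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])  # this details dataset was created from a single run
```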
open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1
[ "region:us" ]
2024-01-19T00:32:35+00:00
{"pretty_name": "Evaluation run of NeuralNovel/Tiger-7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Tiger-7b-v0.1](https://huggingface.co/NeuralNovel/Tiger-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T00:30:17.528076](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1/blob/main/results_2024-01-19T00-30-17.528076.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6140276670516231,\n \"acc_stderr\": 0.033170625938141664,\n \"acc_norm\": 0.6176750246888549,\n \"acc_norm_stderr\": 0.03384254890386933,\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6103468565333238,\n \"mc2_stderr\": 0.015326695061753768\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n \"acc_stderr\": 0.004785781979354866,\n \"acc_norm\": 0.832105158334993,\n \"acc_norm_stderr\": 0.0037300899105375805\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n 
\"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552732,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552732\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n \"acc_stderr\": 0.014317653708594202,\n \"acc_norm\": 0.7994891443167306,\n 
\"acc_norm_stderr\": 0.014317653708594202\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050876,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050876\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n \"acc_stderr\": 0.012635799922765846,\n \"acc_norm\": 0.4276401564537158,\n \"acc_norm_stderr\": 0.012635799922765846\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573695,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573695\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6103468565333238,\n \"mc2_stderr\": 0.015326695061753768\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.01170569756520521\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.467778620166793,\n \"acc_stderr\": 0.013743857303073802\n }\n}\n```", "repo_url": "https://huggingface.co/NeuralNovel/Tiger-7b-v0.1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|arc:challenge|25_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|gsm8k|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hellaswag|10_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["**/details_harness|winogrande|5_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T00-30-17.528076.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T00_30_17.528076", "path": ["results_2024-01-19T00-30-17.528076.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T00-30-17.528076.parquet"]}]}]}
2024-01-19T00:33:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NeuralNovel/Tiger-7b-v0.1 Dataset automatically created during the evaluation run of model NeuralNovel/Tiger-7b-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T00:30:17.528076 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
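For reference, a minimal loading sketch for this record: the repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the configs declared in this record's metadata.

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's "details_<org>__<model>" convention;
# "harness_winogrande_5" is one of the configs listed in this record.
data = load_dataset(
    "open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1",
    "harness_winogrande_5",
    split="train",
)
print(data[0])
```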
[ "# Dataset Card for Evaluation run of NeuralNovel/Tiger-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Tiger-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T00:30:17.528076(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NeuralNovel/Tiger-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Tiger-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T00:30:17.528076(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ae1ea9f603bbaacc26864dd0037bf3f14749fad3
# Dataset Card for "doupo-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
greyfoss/doupo-dataset
[ "region:us" ]
2024-01-19T01:29:44+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 21387729.172121048, "num_examples": 6936}, {"name": "valid", "num_bytes": 1128591.2452416816, "num_examples": 366}], "download_size": 6213826, "dataset_size": 22516320.41736273}}
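The metadata above declares a default config with pre-tokenized train/valid splits; a minimal loading sketch under that assumption (feature names and split sizes are taken from this record's split info):

```python
from datasets import load_dataset

# Load the train/valid splits declared in the record's config metadata.
ds = load_dataset("greyfoss/doupo-dataset")

# Rows are pre-tokenized: input_ids, token_type_ids, attention_mask.
print(ds["train"].features)
print(len(ds["train"]), len(ds["valid"]))  # 6936 / 366 examples per the split info
```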
2024-01-25T06:31:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "doupo-dataset" More Information needed
[ "# Dataset Card for \"doupo-dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"doupo-dataset\"\n\nMore Information needed" ]
d77f2d37b06e4561207d24fd6f31518cc126a4c0
# Dataset Card for "cai-conversation-dev1705622085" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705622085
[ "region:us" ]
2024-01-19T01:33:59+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "sequence": "string"}, {"name": "chosen", "sequence": "string"}, {"name": "rejected", "sequence": "string"}], "splits": [{"name": "train_sft", "num_bytes": 80685581, "num_examples": 21268}, {"name": "train_prefs", "num_bytes": 80873453, "num_examples": 21269}, {"name": "test_sft", "num_bytes": 4369948, "num_examples": 1156}, {"name": "test_prefs", "num_bytes": 4440767, "num_examples": 1156}], "download_size": 74867178, "dataset_size": 170369749}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
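The feature list above (prompt, messages, chosen, rejected, plus the CAI pipeline fields) and the four declared splits suggest a preference-style layout; a minimal loading sketch, assuming the splits resolve as declared (the same pattern applies to the sibling cai-conversation records below):

```python
from datasets import load_dataset

# Load the four splits declared in this record's metadata:
# train_sft, train_prefs, test_sft, test_prefs.
ds = load_dataset("vwxyzjn/cai-conversation-dev1705622085")

sample = ds["train_prefs"][0]
# Each row carries the CAI pipeline stages plus the preference pair.
print(sample["prompt"])
print(sample["chosen"])
print(sample["rejected"])
```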
2024-01-19T01:34:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705622085" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705622085\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705622085\"\n\nMore Information needed" ]
699c6d30781594c59f039a1c0f32441edb88c417
# Dataset Card for "cai-conversation-dev1705628758" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705628758
[ "region:us" ]
2024-01-19T01:47:15+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 240834, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 238847, "num_examples": 64}, {"name": "test_sft", "num_bytes": 254525, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 257663, "num_examples": 64}], "download_size": 538984, "dataset_size": 991869}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T01:47:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705628758" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705628758\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705628758\"\n\nMore Information needed" ]
f5fe4b66f8aebe6a4fd4d911b88aa04b95cd382e
# Dataset Card for "cai-conversation-dev1705631776" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705631776
[ "region:us" ]
2024-01-19T02:38:12+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 206490, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 214255, "num_examples": 64}, {"name": "test_sft", "num_bytes": 228418, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 226504, "num_examples": 64}], "download_size": 476891, "dataset_size": 875667}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T02:38:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705631776" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705631776\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705631776\"\n\nMore Information needed" ]
f888a5baaadab2277f752b8781d04fff8fd6ca35
# Dataset Card for "cai-conversation-dev1705632182" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705632182
[ "region:us" ]
2024-01-19T02:44:50+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 297047, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 271314, "num_examples": 64}, {"name": "test_sft", "num_bytes": 290007, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 277290, "num_examples": 64}], "download_size": 622616, "dataset_size": 1135658}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T02:45:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705632182" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705632182\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705632182\"\n\nMore Information needed" ]
6bf1f1ec1d5bec1d719865172fe33e4685289a51
# Dataset Card for "cai-conversation-dev1705633234" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705633234
[ "region:us" ]
2024-01-19T03:02:39+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 272153, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 281152, "num_examples": 64}, {"name": "test_sft", "num_bytes": 256066, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 260074, "num_examples": 64}], "download_size": 538053, "dataset_size": 1069445}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T03:02:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705633234" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705633234\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705633234\"\n\nMore Information needed" ]
53298e665b11a05742c863d42bcabfcfb8be027c
# Dataset Card for Evaluation run of CausalLM/72B-preview-llamafied-qwen-llamafy <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CausalLM/72B-preview-llamafied-qwen-llamafy](https://huggingface.co/CausalLM/72B-preview-llamafied-qwen-llamafy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T03:04:27.948723](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy/blob/main/results_2024-01-19T03-04-27.948723.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.765694970151033, "acc_stderr": 0.02794684069092645, "acc_norm": 0.769421113917392, "acc_norm_stderr": 0.02847875554958271, "mc1": 0.3671970624235006, "mc1_stderr": 0.01687480500145318, "mc2": 0.5254959632468497, "mc2_stderr": 0.014732861007836748 }, "harness|arc:challenge|25": { "acc": 0.6083617747440273, "acc_stderr": 0.014264122124938213, "acc_norm": 0.6518771331058021, "acc_norm_stderr": 0.013921008595179344 }, "harness|hellaswag|10": { "acc": 0.6477793268273252, "acc_stderr": 0.004766860907171532, "acc_norm": 0.8324039036048596, "acc_norm_stderr": 0.003727438786513392 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7407407407407407, "acc_stderr": 0.03785714465066653, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.03785714465066653 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.9078947368421053, "acc_stderr": 0.02353268597044349, "acc_norm": 0.9078947368421053, "acc_norm_stderr": 0.02353268597044349 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8264150943396227, "acc_stderr": 0.02331058302600625, "acc_norm": 0.8264150943396227, "acc_norm_stderr": 0.02331058302600625 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8958333333333334, "acc_stderr": 0.025545239210256917, "acc_norm": 0.8958333333333334, "acc_norm_stderr": 0.025545239210256917 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.64, "acc_stderr": 
0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7861271676300579, "acc_stderr": 0.031265112061730445, "acc_norm": 0.7861271676300579, "acc_norm_stderr": 0.031265112061730445 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5392156862745098, "acc_stderr": 0.04959859966384181, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.04959859966384181 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8, "acc_stderr": 0.026148818018424502, "acc_norm": 0.8, "acc_norm_stderr": 0.026148818018424502 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5614035087719298, "acc_stderr": 0.04668000738510455, "acc_norm": 0.5614035087719298, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8, "acc_stderr": 0.0333333333333333, "acc_norm": 0.8, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6878306878306878, "acc_stderr": 0.023865206836972585, "acc_norm": 0.6878306878306878, "acc_norm_stderr": 0.023865206836972585 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5476190476190477, "acc_stderr": 0.044518079590553275, "acc_norm": 0.5476190476190477, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8935483870967742, "acc_stderr": 0.01754510295165663, "acc_norm": 0.8935483870967742, "acc_norm_stderr": 0.01754510295165663 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6600985221674877, "acc_stderr": 0.033327690684107895, "acc_norm": 0.6600985221674877, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8606060606060606, "acc_stderr": 0.0270459488258654, "acc_norm": 0.8606060606060606, "acc_norm_stderr": 0.0270459488258654 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9393939393939394, "acc_stderr": 0.01699999492742161, "acc_norm": 0.9393939393939394, "acc_norm_stderr": 0.01699999492742161 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9896373056994818, "acc_stderr": 0.007308424386792194, "acc_norm": 0.9896373056994818, "acc_norm_stderr": 0.007308424386792194 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8102564102564103, "acc_stderr": 0.019880165406588768, "acc_norm": 0.8102564102564103, "acc_norm_stderr": 0.019880165406588768 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.030464621718895322, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.030464621718895322 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8277310924369747, "acc_stderr": 0.02452866497130543, "acc_norm": 0.8277310924369747, "acc_norm_stderr": 0.02452866497130543 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5364238410596026, 
"acc_stderr": 0.04071636065944217, "acc_norm": 0.5364238410596026, "acc_norm_stderr": 0.04071636065944217 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9302752293577982, "acc_stderr": 0.010919426411848605, "acc_norm": 0.9302752293577982, "acc_norm_stderr": 0.010919426411848605 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6851851851851852, "acc_stderr": 0.0316746870682898, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.0316746870682898 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9215686274509803, "acc_stderr": 0.018869514646658928, "acc_norm": 0.9215686274509803, "acc_norm_stderr": 0.018869514646658928 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8945147679324894, "acc_stderr": 0.019995560723758535, "acc_norm": 0.8945147679324894, "acc_norm_stderr": 0.019995560723758535 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8116591928251121, "acc_stderr": 0.026241132996407252, "acc_norm": 0.8116591928251121, "acc_norm_stderr": 0.026241132996407252 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8778625954198473, "acc_stderr": 0.02871877688934232, "acc_norm": 0.8778625954198473, "acc_norm_stderr": 0.02871877688934232 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.0309227883204458, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.0309227883204458 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8518518518518519, "acc_stderr": 0.03434300243630999, "acc_norm": 0.8518518518518519, "acc_norm_stderr": 0.03434300243630999 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8588957055214724, "acc_stderr": 0.027351605518389752, "acc_norm": 0.8588957055214724, "acc_norm_stderr": 0.027351605518389752 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6696428571428571, "acc_stderr": 0.044642857142857116, "acc_norm": 0.6696428571428571, "acc_norm_stderr": 0.044642857142857116 }, "harness|hendrycksTest-management|5": { "acc": 0.8640776699029126, "acc_stderr": 0.03393295729761011, "acc_norm": 0.8640776699029126, "acc_norm_stderr": 0.03393295729761011 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253878, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253878 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9195402298850575, "acc_stderr": 0.009726831316141866, "acc_norm": 0.9195402298850575, "acc_norm_stderr": 0.009726831316141866 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8497109826589595, "acc_stderr": 0.019239318783904717, "acc_norm": 0.8497109826589595, "acc_norm_stderr": 0.019239318783904717 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5664804469273743, "acc_stderr": 0.016574027219517635, "acc_norm": 0.5664804469273743, "acc_norm_stderr": 0.016574027219517635 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8562091503267973, "acc_stderr": 0.020091188936043714, "acc_norm": 0.8562091503267973, "acc_norm_stderr": 0.020091188936043714 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8520900321543409, "acc_stderr": 0.020163253806284125, "acc_norm": 0.8520900321543409, "acc_norm_stderr": 0.020163253806284125 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8858024691358025, "acc_stderr": 0.017696832447213894, "acc_norm": 0.8858024691358025, "acc_norm_stderr": 0.017696832447213894 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.6347517730496454, "acc_stderr": 0.02872386385328127, "acc_norm": 0.6347517730496454, "acc_norm_stderr": 0.02872386385328127 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6271186440677966, "acc_stderr": 0.012350630058333364, "acc_norm": 0.6271186440677966, "acc_norm_stderr": 0.012350630058333364 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8272058823529411, "acc_stderr": 0.02296606758558181, "acc_norm": 0.8272058823529411, "acc_norm_stderr": 0.02296606758558181 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8186274509803921, "acc_stderr": 0.015588643495370457, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.015588643495370457 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7545454545454545, "acc_stderr": 0.041220665028782855, "acc_norm": 0.7545454545454545, "acc_norm_stderr": 0.041220665028782855 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7918367346938775, "acc_stderr": 0.025991117672813296, "acc_norm": 0.7918367346938775, "acc_norm_stderr": 0.025991117672813296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.96, "acc_stderr": 0.01969463855669321, "acc_norm": 0.96, "acc_norm_stderr": 0.01969463855669321 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.02464806896136616, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.02464806896136616 }, "harness|truthfulqa:mc|0": { "mc1": 0.3671970624235006, "mc1_stderr": 0.01687480500145318, "mc2": 0.5254959632468497, "mc2_stderr": 0.014732861007836748 }, "harness|winogrande|5": { "acc": 0.823993685872139, "acc_stderr": 0.010703090882320705 }, "harness|gsm8k|5": { "acc": 0.7156937073540561, "acc_stderr": 0.012425078188395977 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
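As a complement to the per-task loading snippet above, the aggregated run can presumably be read through the "results" configuration that this card's metadata declares alongside the task configs (each with a timestamped split and a "latest" split). The following is a minimal sketch under that assumption; the exact column layout of the aggregated table is not documented in the card, so the example only inspects the schema and the first row:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy"

# "results" and its "latest" split are declared in the card's metadata;
# they hold the aggregated metrics rather than per-example details.
results = load_dataset(REPO, "results", split="latest")

print(results.column_names)  # inspect the schema before relying on any field
print(results[0])            # first row of the aggregated results table
```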
open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy
[ "region:us" ]
2024-01-19T03:06:37+00:00
{"pretty_name": "Evaluation run of CausalLM/72B-preview-llamafied-qwen-llamafy", "dataset_summary": "Dataset automatically created during the evaluation run of model [CausalLM/72B-preview-llamafied-qwen-llamafy](https://huggingface.co/CausalLM/72B-preview-llamafied-qwen-llamafy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T03:04:27.948723](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy/blob/main/results_2024-01-19T03-04-27.948723.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.765694970151033,\n \"acc_stderr\": 0.02794684069092645,\n \"acc_norm\": 0.769421113917392,\n \"acc_norm_stderr\": 0.02847875554958271,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5254959632468497,\n \"mc2_stderr\": 0.014732861007836748\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938213,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6477793268273252,\n \"acc_stderr\": 0.004766860907171532,\n \"acc_norm\": 0.8324039036048596,\n \"acc_norm_stderr\": 0.003727438786513392\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8264150943396227,\n \"acc_stderr\": 0.02331058302600625,\n \"acc_norm\": 0.8264150943396227,\n \"acc_norm_stderr\": 0.02331058302600625\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.031265112061730445,\n \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.031265112061730445\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424502,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424502\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972585,\n \"acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972585\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.0270459488258654,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.0270459488258654\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588768,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588768\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.030464621718895322,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.030464621718895322\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.02452866497130543,\n \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.02452866497130543\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5364238410596026,\n \"acc_stderr\": 0.04071636065944217,\n \"acc_norm\": 0.5364238410596026,\n \"acc_norm_stderr\": 0.04071636065944217\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848605,\n \"acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848605\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758535,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758535\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6696428571428571,\n \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.6696428571428571,\n \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253878,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253878\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9195402298850575,\n \"acc_stderr\": 0.009726831316141866,\n \"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141866\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8497109826589595,\n \"acc_stderr\": 0.019239318783904717,\n \"acc_norm\": 0.8497109826589595,\n \"acc_norm_stderr\": 0.019239318783904717\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5664804469273743,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.5664804469273743,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043714,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043714\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8520900321543409,\n \"acc_stderr\": 0.020163253806284125,\n \"acc_norm\": 0.8520900321543409,\n \"acc_norm_stderr\": 0.020163253806284125\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8858024691358025,\n \"acc_stderr\": 0.017696832447213894,\n \"acc_norm\": 0.8858024691358025,\n \"acc_norm_stderr\": 0.017696832447213894\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6347517730496454,\n \"acc_stderr\": 0.02872386385328127,\n \"acc_norm\": 0.6347517730496454,\n \"acc_norm_stderr\": 0.02872386385328127\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6271186440677966,\n \"acc_stderr\": 0.012350630058333364,\n \"acc_norm\": 0.6271186440677966,\n \"acc_norm_stderr\": 0.012350630058333364\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.02296606758558181,\n \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.02296606758558181\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370457,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370457\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5254959632468497,\n \"mc2_stderr\": 0.014732861007836748\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \"acc_stderr\": 
0.012425078188395977\n }\n}\n```", "repo_url": "https://huggingface.co/CausalLM/72B-preview-llamafied-qwen-llamafy", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|arc:challenge|25_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|gsm8k|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hellaswag|10_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T03-04-27.948723.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T03-04-27.948723.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T03-04-27.948723.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T03-04-27.948723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T03-04-27.948723.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T03_04_27.948723", "path": ["**/details_harness|winogrande|5_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T03-04-27.948723.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T03_04_27.948723", "path": ["results_2024-01-19T03-04-27.948723.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T03-04-27.948723.parquet"]}]}]}
2024-01-19T03:07:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CausalLM/72B-preview-llamafied-qwen-llamafy Dataset automatically created during the evaluation run of model CausalLM/72B-preview-llamafied-qwen-llamafy on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T03:04:27.948723 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
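The load snippet referenced above ("you can for instance do the following:") was stripped from this plain-text rendering of the card. A minimal sketch of what it looks like, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model (the exact repository id is not shown in this record):

```python
from datasets import load_dataset

# Repository id below is assumed from the Open LLM Leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_CausalLM__72B-preview-llamafied-qwen-llamafy",
    "harness_winogrande_5",  # one of the 63 task configurations listed in the metadata
    split="train",           # "train" always points to the latest run
)
```

Any of the `harness_*` configuration names from the metadata above can be substituted for the second argument.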
[ "# Dataset Card for Evaluation run of CausalLM/72B-preview-llamafied-qwen-llamafy\n\n\n\nDataset automatically created during the evaluation run of model CausalLM/72B-preview-llamafied-qwen-llamafy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T03:04:27.948723(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CausalLM/72B-preview-llamafied-qwen-llamafy\n\n\n\nDataset automatically created during the evaluation run of model CausalLM/72B-preview-llamafied-qwen-llamafy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T03:04:27.948723(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
51954823511ffd96d15ed6d1004949277e7af464
# Dataset Card for "cai-conversation-dev1705629166" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HuggingFaceH4/cai-conversation-harmless
[ "license:apache-2.0", "region:us" ]
2024-01-19T03:08:56+00:00
{"license": "apache-2.0", "dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 81872474, "num_examples": 21268}, {"name": "train_prefs", "num_bytes": 82070344, "num_examples": 21269}, {"name": "test_sft", "num_bytes": 4489276, "num_examples": 1156}, {"name": "test_prefs", "num_bytes": 4523043, "num_examples": 1156}], "download_size": 74758771, "dataset_size": 172955137}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-02-02T04:10:50+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
# Dataset Card for "cai-conversation-dev1705629166" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705629166\"\n\nMore Information needed" ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "# Dataset Card for \"cai-conversation-dev1705629166\"\n\nMore Information needed" ]
9348b193ecb0c057a3eb05e20f22e7d4629d98ef
# Dataset Card for "cai-conversation-dev1705634160" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705634160
[ "region:us" ]
2024-01-19T03:17:29+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 280036, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 284217, "num_examples": 64}, {"name": "test_sft", "num_bytes": 291380, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 284562, "num_examples": 64}], "download_size": 622492, "dataset_size": 1140195}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T03:17:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705634160" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705634160\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705634160\"\n\nMore Information needed" ]
bcd96ba7d15ab48df4f4009a54d64f0b6d92ed6c
# Dataset Card for "eag100" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Minglii/eag100
[ "region:us" ]
2024-01-19T03:47:39+00:00
{"dataset_info": {"features": [{"name": "data", "struct": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "id", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 42517092, "num_examples": 52002}], "download_size": 23802551, "dataset_size": 42517092}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T03:51:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eag100" More Information needed
[ "# Dataset Card for \"eag100\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eag100\"\n\nMore Information needed" ]
a02f64f874303a3533822d1edcf324cee0f91888
# Dataset Card for "eag5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Minglii/eag5
[ "region:us" ]
2024-01-19T03:47:52+00:00
{"dataset_info": {"features": [{"name": "data", "struct": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "id", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 4251710, "num_examples": 2600}], "download_size": 2380684, "dataset_size": 4251710}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T03:51:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eag5" More Information needed
[ "# Dataset Card for \"eag5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eag5\"\n\nMore Information needed" ]
986c1567a3ac79332f455e339d2d3ab78603b71e
# Dataset Card for "eag10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Minglii/eag10
[ "region:us" ]
2024-01-19T03:48:03+00:00
{"dataset_info": {"features": [{"name": "data", "struct": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "id", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 8476394, "num_examples": 5200}], "download_size": 4739915, "dataset_size": 8476394}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T03:51:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eag10" More Information needed
[ "# Dataset Card for \"eag10\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eag10\"\n\nMore Information needed" ]
1503933b9c67449b5907cad244c82dde31d19af2
# Dataset Card for "eag15" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Minglii/eag15
[ "region:us" ]
2024-01-19T03:48:16+00:00
{"dataset_info": {"features": [{"name": "data", "struct": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "id", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 12446847, "num_examples": 7800}], "download_size": 6942905, "dataset_size": 12446847}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T03:52:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "eag15" More Information needed
[ "# Dataset Card for \"eag15\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"eag15\"\n\nMore Information needed" ]
31a7d2cf4b346aa86b61a20a896e0186afb0909d
# Dataset Card for "mdpo_train_demo" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
cjhyeok/mdpo_train_demo
[ "region:us" ]
2024-01-19T03:55:10+00:00
{"dataset_info": {"features": [{"name": "qid", "dtype": "int64"}, {"name": "Question", "dtype": "string"}, {"name": "Answers", "sequence": "string"}, {"name": "Pm_score", "sequence": "int64"}, {"name": "standard", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 43188785238, "num_examples": 10807695}], "download_size": 18806305215, "dataset_size": 43188785238}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T04:36:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mdpo_train_demo" More Information needed
[ "# Dataset Card for \"mdpo_train_demo\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mdpo_train_demo\"\n\nMore Information needed" ]
dbad016a08ab4782b6b8065e5bc4567b23d1217d
# Dataset Card for "test-dedup" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tmnam20/test-dedup
[ "region:us" ]
2024-01-19T03:57:59+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 395, "num_examples": 4}], "download_size": 0, "dataset_size": 395}}
2024-01-19T04:01:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "test-dedup" More Information needed
[ "# Dataset Card for \"test-dedup\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"test-dedup\"\n\nMore Information needed" ]
f6f50096ec5a97ddefd46a614158b8b3d07fae83
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T04:49:07.320364](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-19T04-49-07.320364.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.611565024381518, "acc_stderr": 0.032882150972704735, "acc_norm": 0.6180275851937712, "acc_norm_stderr": 0.03355752116870104, "mc1": 0.3011015911872705, "mc1_stderr": 0.016058999026100612, "mc2": 0.4550661709609789, "mc2_stderr": 0.014832885424648957 }, "harness|arc:challenge|25": { "acc": 0.5844709897610921, "acc_stderr": 0.014401366641216384, "acc_norm": 0.6168941979522184, "acc_norm_stderr": 0.014206472661672877 }, "harness|hellaswag|10": { "acc": 0.6356303525194185, "acc_stderr": 0.004802694106203654, "acc_norm": 0.8238398725353515, "acc_norm_stderr": 0.00380177777980958 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6052631578947368, "acc_stderr": 0.039777499346220734, "acc_norm": 0.6052631578947368, "acc_norm_stderr": 0.039777499346220734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6490566037735849, "acc_stderr": 0.029373646253234686, "acc_norm": 0.6490566037735849, "acc_norm_stderr": 0.029373646253234686 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 
0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817729, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817729 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266346, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36243386243386244, "acc_stderr": 0.02475747390275205, "acc_norm": 0.36243386243386244, "acc_norm_stderr": 0.02475747390275205 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949097, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949097 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7258064516129032, "acc_stderr": 0.025378139970885203, "acc_norm": 0.7258064516129032, "acc_norm_stderr": 0.025378139970885203 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.02503387058301518, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.02503387058301518 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6025641025641025, "acc_stderr": 0.024811920017903836, "acc_norm": 0.6025641025641025, "acc_norm_stderr": 0.024811920017903836 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6092436974789915, "acc_stderr": 0.031693802357129965, "acc_norm": 0.6092436974789915, "acc_norm_stderr": 0.031693802357129965 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8055045871559633, "acc_stderr": 0.016970289090458033, "acc_norm": 0.8055045871559633, "acc_norm_stderr": 0.016970289090458033 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7401960784313726, "acc_stderr": 0.03077855467869326, "acc_norm": 0.7401960784313726, "acc_norm_stderr": 0.03077855467869326 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.028304657943035293, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.028304657943035293 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.03210062154134986, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.03210062154134986 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.041858325989283136, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.041858325989283136 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7982120051085568, "acc_stderr": 0.014351702181636863, "acc_norm": 0.7982120051085568, "acc_norm_stderr": 0.014351702181636863 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7109826589595376, "acc_stderr": 0.024405173935783227, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.024405173935783227 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3396648044692737, "acc_stderr": 0.015839400406212505, "acc_norm": 0.3396648044692737, "acc_norm_stderr": 0.015839400406212505 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399665, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399665 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 
0.025483115601195455, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.025483115601195455 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4397163120567376, "acc_stderr": 0.02960991207559411, "acc_norm": 0.4397163120567376, "acc_norm_stderr": 0.02960991207559411 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.438722294654498, "acc_stderr": 0.012673969883493272, "acc_norm": 0.438722294654498, "acc_norm_stderr": 0.012673969883493272 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6102941176470589, "acc_stderr": 0.0296246635811597, "acc_norm": 0.6102941176470589, "acc_norm_stderr": 0.0296246635811597 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6274509803921569, "acc_stderr": 0.019559646809215927, "acc_norm": 0.6274509803921569, "acc_norm_stderr": 0.019559646809215927 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.03029950656215418, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.03029950656215418 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8109452736318408, "acc_stderr": 0.02768691358801302, "acc_norm": 0.8109452736318408, "acc_norm_stderr": 0.02768691358801302 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.034873508801977725, "acc_norm": 0.86, "acc_norm_stderr": 0.034873508801977725 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3011015911872705, "mc1_stderr": 0.016058999026100612, "mc2": 0.4550661709609789, "mc2_stderr": 0.014832885424648957 }, "harness|winogrande|5": { "acc": 0.7576953433307024, "acc_stderr": 0.012042352526174789 }, "harness|gsm8k|5": { "acc": 0.30477634571645185, "acc_stderr": 0.012679297549515418 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
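The card mentions an extra "results" configuration holding the aggregated metrics; a minimal sketch of reading it, assuming this details repository follows the same config/split layout as the one earlier in this dump (a "results" config with a "latest" split):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the run
```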
open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder
[ "region:us" ]
2024-01-19T04:52:06+00:00
{"pretty_name": "Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T04:49:07.320364](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-19T04-49-07.320364.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.611565024381518,\n \"acc_stderr\": 0.032882150972704735,\n \"acc_norm\": 0.6180275851937712,\n \"acc_norm_stderr\": 0.03355752116870104,\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4550661709609789,\n \"mc2_stderr\": 0.014832885424648957\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216384,\n \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6356303525194185,\n \"acc_stderr\": 0.004802694106203654,\n \"acc_norm\": 0.8238398725353515,\n \"acc_norm_stderr\": 0.00380177777980958\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275205,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275205\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n 
\"acc_norm_stderr\": 0.02503387058301518\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458033,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458033\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035293,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035293\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.024405173935783227,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.024405173935783227\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n \"acc_stderr\": 0.015839400406212505,\n \"acc_norm\": 0.3396648044692737,\n \"acc_norm_stderr\": 0.015839400406212505\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.03029950656215418,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.03029950656215418\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4550661709609789,\n \"mc2_stderr\": 0.014832885424648957\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.30477634571645185,\n \"acc_stderr\": 0.012679297549515418\n }\n}\n```", "repo_url": "https://huggingface.co/Zangs3011/mistral_7b_HalfEpoch_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|arc:challenge|25_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|gsm8k|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hellaswag|10_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-49-07.320364.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["**/details_harness|winogrande|5_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T04-49-07.320364.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T04_49_07.320364", "path": ["results_2024-01-19T04-49-07.320364.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T04-49-07.320364.parquet"]}]}]}
2024-01-19T04:53:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder Dataset automatically created during the evaluation run of model Zangs3011/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T04:49:07.320364 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T04:49:07.320364(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Zangs3011/mistral_7b_HalfEpoch_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T04:49:07.320364(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
542489b2bfd2bb4c5952e12c8c9c9bde9d001bae
# Data Summary This is a compilation of math test questions and answers drawn from the 2023 Chinese National College Entrance Examination, the 2023 American Mathematics Competitions, and the 2023 American College Testing. For simplicity, we refer to it as `Gaokao2023`.
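A minimal loading sketch for this benchmark, assuming only the repository id given in this record; the available splits and column names are not documented here, so the returned `DatasetDict` should be inspected for the exact schema.

```python
from datasets import load_dataset

# Gaokao2023 math questions and answers; the split names and columns are not
# specified in the card, so print the DatasetDict to see what is published.
gaokao = load_dataset("MARIO-Math-Reasoning/Gaokao2023-Math-En")
print(gaokao)
```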
MARIO-Math-Reasoning/Gaokao2023-Math-En
[ "license:mit", "region:us" ]
2024-01-19T04:57:09+00:00
{"license": "mit"}
2024-01-19T21:49:46+00:00
[]
[]
TAGS #license-mit #region-us
# Data Summary This is a compilation of math test questions and answers drawn from the 2023 Chinese National College Entrance Examination, the 2023 American Mathematics Competitions, and the 2023 American College Testing. For simplicity, we refer to it as 'Gaokao2023'.
[ "# Data Summary\n\nThis is a compilation of math test questions and answers drawn from the 2023 Chinese National College Entrance Examination, the 2023 American Mathematics Competitions, and the 2023 American College Testing. For simplicity, we refer to it as 'Gaokao2023'." ]
[ "TAGS\n#license-mit #region-us \n", "# Data Summary\n\nThis is a compilation of math test questions and answers drawn from the 2023 Chinese National College Entrance Examination, the 2023 American Mathematics Competitions, and the 2023 American College Testing. For simplicity, we refer to it as 'Gaokao2023'." ]
1da81c04485686adb65dc9df724ffa3b15f32d5b
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_2EPOCH_DolphinCoder <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_2EPOCH_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_2EPOCH_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T04:55:31.577709](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder/blob/main/results_2024-01-19T04-55-31.577709.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.590189563445543, "acc_stderr": 0.033213747146494416, "acc_norm": 0.5975943163476723, "acc_norm_stderr": 0.03391041523451993, "mc1": 0.2974296205630355, "mc1_stderr": 0.016002651487361005, "mc2": 0.44646084605621383, "mc2_stderr": 0.014640949505732814 }, "harness|arc:challenge|25": { "acc": 0.568259385665529, "acc_stderr": 0.014474591427196202, "acc_norm": 0.6075085324232082, "acc_norm_stderr": 0.014269634635670722 }, "harness|hellaswag|10": { "acc": 0.6229834694284008, "acc_stderr": 0.004836486437527263, "acc_norm": 0.8114917347142003, "acc_norm_stderr": 0.003903181667466359 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.039889037033362836, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.630188679245283, "acc_stderr": 0.029711421880107936, "acc_norm": 0.630188679245283, "acc_norm_stderr": 0.029711421880107936 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 
0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5664739884393064, "acc_stderr": 0.03778621079092056, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.03778621079092056 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929777, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929777 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224469, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224469 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.02519710107424649, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.02519710107424649 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.026522709674667765, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.026522709674667765 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4187192118226601, "acc_stderr": 0.03471192860518468, "acc_norm": 0.4187192118226601, "acc_norm_stderr": 0.03471192860518468 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091706, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091706 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7373737373737373, "acc_stderr": 0.03135305009533086, "acc_norm": 0.7373737373737373, "acc_norm_stderr": 0.03135305009533086 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.02649905770139746, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.02649905770139746 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5512820512820513, "acc_stderr": 0.025217315184846482, "acc_norm": 0.5512820512820513, "acc_norm_stderr": 0.025217315184846482 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.02813325257881564, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.02813325257881564 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7853211009174312, "acc_stderr": 0.01760430414925648, "acc_norm": 0.7853211009174312, "acc_norm_stderr": 0.01760430414925648 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591362, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591362 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7172995780590717, "acc_stderr": 0.029312814153955934, "acc_norm": 0.7172995780590717, "acc_norm_stderr": 0.029312814153955934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596914, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596914 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.04026187527591205, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.04026187527591205 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.044143436668549335, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.036429145782924055, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.023636873317489288, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.023636873317489288 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7701149425287356, "acc_stderr": 0.01504630184669182, "acc_norm": 0.7701149425287356, "acc_norm_stderr": 0.01504630184669182 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6820809248554913, "acc_stderr": 0.025070713719153183, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.025070713719153183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2927374301675978, "acc_stderr": 0.015218109544410179, "acc_norm": 0.2927374301675978, "acc_norm_stderr": 0.015218109544410179 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6764705882352942, "acc_stderr": 0.0267874531119065, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.0267874531119065 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6527331189710611, "acc_stderr": 0.027040745502307336, "acc_norm": 0.6527331189710611, "acc_norm_stderr": 0.027040745502307336 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6574074074074074, "acc_stderr": 0.026406145973625686, "acc_norm": 
0.6574074074074074, "acc_norm_stderr": 0.026406145973625686 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4219858156028369, "acc_stderr": 0.029462189233370593, "acc_norm": 0.4219858156028369, "acc_norm_stderr": 0.029462189233370593 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4322033898305085, "acc_stderr": 0.012652297777114968, "acc_norm": 0.4322033898305085, "acc_norm_stderr": 0.012652297777114968 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406752, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406752 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6225490196078431, "acc_stderr": 0.01961085147488029, "acc_norm": 0.6225490196078431, "acc_norm_stderr": 0.01961085147488029 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.04582004841505417, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.04582004841505417 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6571428571428571, "acc_stderr": 0.030387262919547728, "acc_norm": 0.6571428571428571, "acc_norm_stderr": 0.030387262919547728 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.031885780176863984, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.2974296205630355, "mc1_stderr": 0.016002651487361005, "mc2": 0.44646084605621383, "mc2_stderr": 0.014640949505732814 }, "harness|winogrande|5": { "acc": 0.7324388318863457, "acc_stderr": 0.01244171845689301 }, "harness|gsm8k|5": { "acc": 0.23881728582259287, "acc_stderr": 0.011744097081003805 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
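As the card notes, the aggregated metrics live in the extra "results" configuration, with the "latest" split tracking the most recent evaluation. A minimal sketch of reading them; the row and column layout of the results parquet is an assumption.

```python
from datasets import load_dataset

# Aggregated results for the newest run of this evaluation dataset.
results = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder",
    "results",
    split="latest",
)
print(results[0])  # assumed: one row holding the aggregated metrics
```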
open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder
[ "region:us" ]
2024-01-19T04:57:58+00:00
{"pretty_name": "Evaluation run of Zangs3011/mistral_7b_2EPOCH_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Zangs3011/mistral_7b_2EPOCH_DolphinCoder](https://huggingface.co/Zangs3011/mistral_7b_2EPOCH_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T04:55:31.577709](https://huggingface.co/datasets/open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder/blob/main/results_2024-01-19T04-55-31.577709.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.590189563445543,\n \"acc_stderr\": 0.033213747146494416,\n \"acc_norm\": 0.5975943163476723,\n \"acc_norm_stderr\": 0.03391041523451993,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.44646084605621383,\n \"mc2_stderr\": 0.014640949505732814\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670722\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6229834694284008,\n \"acc_stderr\": 0.004836486437527263,\n \"acc_norm\": 0.8114917347142003,\n \"acc_norm_stderr\": 0.003903181667466359\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107936,\n \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107936\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n \"acc_norm\": 0.8393782383419689,\n 
\"acc_norm_stderr\": 0.02649905770139746\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489288,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489288\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n \"acc_stderr\": 0.01504630184669182,\n \"acc_norm\": 0.7701149425287356,\n \"acc_norm_stderr\": 0.01504630184669182\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n \"acc_stderr\": 0.015218109544410179,\n \"acc_norm\": 0.2927374301675978,\n \"acc_norm_stderr\": 0.015218109544410179\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625686,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625686\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.01961085147488029,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.01961085147488029\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.44646084605621383,\n \"mc2_stderr\": 0.014640949505732814\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.01244171845689301\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.23881728582259287,\n \"acc_stderr\": 0.011744097081003805\n }\n}\n```", "repo_url": "https://huggingface.co/Zangs3011/mistral_7b_2EPOCH_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|arc:challenge|25_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|gsm8k|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hellaswag|10_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-55-31.577709.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-55-31.577709.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-55-31.577709.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T04-55-31.577709.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T04-55-31.577709.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["**/details_harness|winogrande|5_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T04-55-31.577709.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T04_55_31.577709", "path": ["results_2024-01-19T04-55-31.577709.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T04-55-31.577709.parquet"]}]}]}
2024-01-19T04:58:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Zangs3011/mistral_7b_2EPOCH_DolphinCoder Dataset automatically created during the evaluation run of model Zangs3011/mistral_7b_2EPOCH_DolphinCoder on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch below): ## Latest results These are the latest results from run 2024-01-19T04:55:31.577709 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
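A minimal sketch of the loading call referenced above, which was stripped from this processed copy of the card. The repository name is an assumption based on the `details_<org>__<model>` naming pattern the Open LLM Leaderboard uses for its details datasets (compare the Karko/Proctora card later in this dump), and "harness_winogrande_5" is one of the 63 configs listed in this record's metadata:

```python
from datasets import load_dataset

# Repo name inferred from the leaderboard's naming convention (assumption);
# the config name comes from this record's "configs" metadata, and the
# "train" split points at the latest results, per the card text above.
data = load_dataset(
    "open-llm-leaderboard/details_Zangs3011__mistral_7b_2EPOCH_DolphinCoder",
    "harness_winogrande_5",
    split="train",
)
```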
[ "# Dataset Card for Evaluation run of Zangs3011/mistral_7b_2EPOCH_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_2EPOCH_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T04:55:31.577709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Zangs3011/mistral_7b_2EPOCH_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model Zangs3011/mistral_7b_2EPOCH_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T04:55:31.577709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
820b8e769eb61c207909825cbf4a4e58f3914a1d
Using [LLMPerf](https://github.com/ray-project/llmperf), we have benchmarked a selection of LLM inference providers. Our analysis focuses on evaluating their performance, reliability, and efficiency under the following key metrics: - Output tokens throughput, which represents the average number of output tokens returned per second. This metric is important for applications that require high throughput, such as summarization and translation, and is easy to compare across different models and providers. - Time to first token (TTFT), which represents how long it takes the LLM to return the first token. TTFT is especially important for streaming applications, such as chatbots. ### Time to First Token (seconds) For streaming applications, the TTFT is how long it takes before the LLM returns the first token. | Framework | Model | Median | Mean | Min | Max | P25 | P75 | P95 | P99 | |------------|------------------------------------------------------------------|---------|--------|-------|-------|-------|-------|-------|-------| | bedrock | claude-instant-v1 | 1.21 | 1.29 | 1.12 | 2.19 | 1.17 | 1.27 | 1.89 | 2.17 | ### Output Tokens Throughput (tokens/s) The output tokens throughput is measured as the average number of **output** tokens returned per second. We collect results by sending 100 requests to each LLM inference provider, and calculate the mean output tokens throughput over those 100 requests. A higher value indicates a higher-throughput LLM inference provider. | Framework | Model | Median | Mean | Min | Max | P25 | P75 | P95 | P99 | |:------------|:----------------------------------------------|---------:|-------:|------:|------:|------:|------:|------:|------:| | bedrock | claude-instant-v1 | 65.64 | 65.98 | 16.05 | 110.38 | 57.29 | 75.57 | 99.73 | 106.42 | #### Run Configurations Test script: [token_benchmark_ray.py](https://github.com/ray-project/llmperf/blob/main/token_benchmark_ray.py) For each provider, we perform: - Total number of requests: 100 - Concurrency: 1 - Prompt's token length: 1024 - Expected output length: 1024 - Tested models: claude-instant-v1-100k ``` python token_benchmark_ray.py \ --model bedrock/anthropic.claude-instant-v1 \ --mean-input-tokens 1024 \ --stddev-input-tokens 0 \ --mean-output-tokens 1024 \ --stddev-output-tokens 100 \ --max-num-completed-requests 100 \ --num-concurrent-requests 1 \ --llm-api litellm ``` We ran the LLMPerf clients from an on-premise Kubernetes Bastion host. The results were up to date as of January 19, 2024, 3pm KST. You can find the detailed results in the [raw_data](https://huggingface.co/datasets/ssong1/llmperf-bedrock/tree/main/raw_data) folder. #### Caveats and Disclaimers - The providers' endpoint backends might vary widely, so this is not a reflection of how the software runs on particular hardware. - The results may vary with time of day. - The results (e.g. measurement of TTFT) depend on client location, and can also be biased by some providers lagging on the first token in order to increase ITL (inter-token latency). - The results are only a proxy for the system's capabilities and are also impacted by the existing system load and provider traffic. - The results may not correlate with users' workloads.
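The percentile columns in the tables above can be reproduced from the per-request measurements stored in the raw_data folder. A minimal sketch of that aggregation, assuming each raw file yields one TTFT and one output-token-throughput value per request; the field layout of the raw files is not specified here, so the sample values below are purely illustrative:

```python
import numpy as np

def summarize(samples):
    """Median/mean/min/max and P25/P75/P95/P99, as reported in the tables above."""
    a = np.asarray(samples, dtype=float)
    return {
        "median": float(np.median(a)),
        "mean": float(a.mean()),
        "min": float(a.min()),
        "max": float(a.max()),
        **{f"p{q}": float(np.percentile(a, q)) for q in (25, 75, 95, 99)},
    }

# Illustrative per-request values; in practice these would be parsed from raw_data.
ttft_seconds = [1.21, 1.17, 1.27, 1.89, 2.17]
output_tokens_per_second = [65.64, 57.29, 75.57, 99.73, 106.42]

print(summarize(ttft_seconds))
print(summarize(output_tokens_per_second))
```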
ssong1/llmperf-bedrock
[ "license:apache-2.0", "region:us" ]
2024-01-19T05:29:01+00:00
{"license": "apache-2.0"}
2024-01-24T00:50:26+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
Using LLMPerf, we have benchmarked a selection of LLM inference providers. Our analysis focuses on evaluating their performance, reliability, and efficiency under the following key metrics: * Output tokens throughput, which represents the average number of output tokens returned per second. This metric is important for applications that require high throughput, such as summarization and translation, and is easy to compare across different models and providers. * Time to first token (TTFT), which represents how long it takes the LLM to return the first token. TTFT is especially important for streaming applications, such as chatbots. ### Time to First Token (seconds) For streaming applications, the TTFT is how long it takes before the LLM returns the first token. ### Output Tokens Throughput (tokens/s) The output tokens throughput is measured as the average number of output tokens returned per second. We collect results by sending 100 requests to each LLM inference provider, and calculate the mean output tokens throughput over those 100 requests. A higher value indicates a higher-throughput LLM inference provider. #### Run Configurations Test script: token\_benchmark\_ray.py For each provider, we perform: * Total number of requests: 100 * Concurrency: 1 * Prompt's token length: 1024 * Expected output length: 1024 * Tested models: claude-instant-v1-100k We ran the LLMPerf clients from an on-premise Kubernetes Bastion host. The results were up to date as of January 19, 2024, 3pm KST. You can find the detailed results in the raw\_data folder. #### Caveats and Disclaimers * The providers' endpoint backends might vary widely, so this is not a reflection of how the software runs on particular hardware. * The results may vary with time of day. * The results (e.g. measurement of TTFT) depend on client location, and can also be biased by some providers lagging on the first token in order to increase ITL (inter-token latency). * The results are only a proxy for the system's capabilities and are also impacted by the existing system load and provider traffic. * The results may not correlate with users' workloads.
[ "### Time to First Token (seconds)\n\n\nFor streaming applications, the TTFT is how long before the LLM returns the first token.", "### Output Tokens Throughput (tokens/s)\n\n\nThe output tokens throughput is measured as the average number of output tokens returned per second.\nWe collect results by sending 100 requests to each LLM inference provider, and calculate the mean output tokens throughput based on 100 requests.\nA higher output tokens throughput indicates a higher throughput of the LLM inference provider.", "#### Run Configurations\n\n\ntestscript token\\_benchmark\\_ray.py\n\n\nFor each provider, we perform:\n\n\n* Total number of requests: 100\n* Concurrency: 1\n* Prompt's token length: 1024\n* Expected output length: 1024\n* Tested models: claude-instant-v1-100k\n\n\nWe ran the LLMPerf clients from an on-premise Kubernetes Bastion host.\nThe results were up-to-date of January 19, 2023, 3pm KST. You could find the detailed results in the raw\\_data folder.", "#### Caveats and Disclaimers\n\n\n* The endpoints provider backend might vary widely, so this is not a reflection on how the software runs on a particular hardware.\n* The results may vary with time of day.\n* The results (e.g. measurement of TTFT) depend on client location, and can also be biased by some providers lagging on the first token in order to increase ITL.\n* The results is only a proxy of the system capabilities and is also impacted by the existing system load and provider traffic.\n* The results may not correlate with users’ workloads." ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "### Time to First Token (seconds)\n\n\nFor streaming applications, the TTFT is how long before the LLM returns the first token.", "### Output Tokens Throughput (tokens/s)\n\n\nThe output tokens throughput is measured as the average number of output tokens returned per second.\nWe collect results by sending 100 requests to each LLM inference provider, and calculate the mean output tokens throughput based on 100 requests.\nA higher output tokens throughput indicates a higher throughput of the LLM inference provider.", "#### Run Configurations\n\n\ntestscript token\\_benchmark\\_ray.py\n\n\nFor each provider, we perform:\n\n\n* Total number of requests: 100\n* Concurrency: 1\n* Prompt's token length: 1024\n* Expected output length: 1024\n* Tested models: claude-instant-v1-100k\n\n\nWe ran the LLMPerf clients from an on-premise Kubernetes Bastion host.\nThe results were up-to-date of January 19, 2023, 3pm KST. You could find the detailed results in the raw\\_data folder.", "#### Caveats and Disclaimers\n\n\n* The endpoints provider backend might vary widely, so this is not a reflection on how the software runs on a particular hardware.\n* The results may vary with time of day.\n* The results (e.g. measurement of TTFT) depend on client location, and can also be biased by some providers lagging on the first token in order to increase ITL.\n* The results is only a proxy of the system capabilities and is also impacted by the existing system load and provider traffic.\n* The results may not correlate with users’ workloads." ]
947e7e0bcb99ab9a6e50b35686f7789ed5ecdba9
# Dataset Card for "layoutlmv3_cord" ## Original Dataset is "naver-clova-ix/cord-v2" ### This dataset is modified for learning. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Hyeoli/layoutlmv3_cord
[ "region:us" ]
2024-01-19T06:09:26+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "words", "sequence": "string"}, {"name": "bboxes", "sequence": {"sequence": "int64"}}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "O", "1": "I-menu.cnt", "2": "I-menu.discountprice", "3": "I-menu.nm", "4": "I-menu.num", "5": "I-menu.price", "6": "I-menu.sub_cnt", "7": "I-menu.sub_nm", "8": "I-menu.sub_price", "9": "I-menu.unitprice", "10": "I-sub_total.discount_price", "11": "I-sub_total.etc", "12": "I-sub_total.service_price", "13": "I-sub_total.subtotal_price", "14": "I-sub_total.tax_price", "15": "I-total.cashprice", "16": "I-total.changeprice", "17": "I-total.creditcardprice", "18": "I-total.emoneyprice", "19": "I-total.menuqty_cnt", "20": "I-total.menutype_cnt", "21": "I-total.total_etc", "22": "I-total.total_price"}}}}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 1296349383.0, "num_examples": 800}, {"name": "test", "num_bytes": 162954804.0, "num_examples": 100}, {"name": "validation", "num_bytes": 171507971.0, "num_examples": 100}], "download_size": 1628026145, "dataset_size": 1630812158.0}}
2024-01-22T06:55:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "layoutlmv3_cord" ## Original Dataset is "naver-clova-ix/cord-v2" ### This dataset is modified for learning. More Information needed
[ "# Dataset Card for \"layoutlmv3_cord\"", "## Original Dataset is \"naver-clova-ix/cord-v2\"", "### This dataset is modified for learning.\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"layoutlmv3_cord\"", "## Original Dataset is \"naver-clova-ix/cord-v2\"", "### This dataset is modified for learning.\nMore Information needed" ]
90dcfb8120b6e0ce1f916485d975f134ead54ac0
# Dataset Card for Evaluation run of Karko/Proctora <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Karko/Proctora](https://huggingface.co/Karko/Proctora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Karko__Proctora", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T06:36:32.426186](https://huggingface.co/datasets/open-llm-leaderboard/details_Karko__Proctora/blob/main/results_2024-01-19T06-36-32.426186.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6586351892929487, "acc_stderr": 0.03179113658240876, "acc_norm": 0.6588977596758985, "acc_norm_stderr": 0.032441095885830044, "mc1": 0.4173806609547124, "mc1_stderr": 0.017262891063272175, "mc2": 0.5955283522928707, "mc2_stderr": 0.015487512280277603 }, "harness|arc:challenge|25": { "acc": 0.6459044368600683, "acc_stderr": 0.01397545412275655, "acc_norm": 0.6783276450511946, "acc_norm_stderr": 0.013650488084494166 }, "harness|hellaswag|10": { "acc": 0.6835291774546903, "acc_stderr": 0.004641484273335098, "acc_norm": 0.866759609639514, "acc_norm_stderr": 0.0033913982936134386 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544057, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr":
0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778408, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778408 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.02289168798455496, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.02289168798455496 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3814814814814815, "acc_stderr": 0.029616718927497586, "acc_norm": 0.3814814814814815, "acc_norm_stderr": 0.029616718927497586 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7016806722689075, "acc_stderr": 0.029719142876342863, "acc_norm": 0.7016806722689075, "acc_norm_stderr": 0.029719142876342863 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374308, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374308 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250458, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250458 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39776536312849164, "acc_stderr": 0.016369204971262985, "acc_norm": 0.39776536312849164, "acc_norm_stderr": 0.016369204971262985 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242553, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242553 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.024383665531035454, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.024383665531035454 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, 
"acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4784876140808344, "acc_stderr": 0.012758410941038915, "acc_norm": 0.4784876140808344, "acc_norm_stderr": 0.012758410941038915 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6985294117647058, "acc_stderr": 0.027875982114273168, "acc_norm": 0.6985294117647058, "acc_norm_stderr": 0.027875982114273168 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.01899970738316267, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.01899970738316267 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142773, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587952, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587952 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.4173806609547124, "mc1_stderr": 0.017262891063272175, "mc2": 0.5955283522928707, "mc2_stderr": 0.015487512280277603 }, "harness|winogrande|5": { "acc": 0.797947908445146, "acc_stderr": 0.011285013754047432 }, "harness|gsm8k|5": { "acc": 0.7194844579226687, "acc_stderr": 0.012374608490929561 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Karko__Proctora
[ "region:us" ]
2024-01-19T06:39:35+00:00
{"pretty_name": "Evaluation run of Karko/Proctora", "dataset_summary": "Dataset automatically created during the evaluation run of model [Karko/Proctora](https://huggingface.co/Karko/Proctora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Karko__Proctora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T06:36:32.426186](https://huggingface.co/datasets/open-llm-leaderboard/details_Karko__Proctora/blob/main/results_2024-01-19T06-36-32.426186.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6586351892929487,\n \"acc_stderr\": 0.03179113658240876,\n \"acc_norm\": 0.6588977596758985,\n \"acc_norm_stderr\": 0.032441095885830044,\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5955283522928707,\n \"mc2_stderr\": 0.015487512280277603\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.01397545412275655,\n \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494166\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6835291774546903,\n \"acc_stderr\": 0.004641484273335098,\n \"acc_norm\": 0.866759609639514,\n \"acc_norm_stderr\": 0.0033913982936134386\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n 
\"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497586,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342863,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342863\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n \"acc_stderr\": 0.016369204971262985,\n \"acc_norm\": 0.39776536312849164,\n \"acc_norm_stderr\": 0.016369204971262985\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038915,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038915\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272175,\n \"mc2\": 0.5955283522928707,\n \"mc2_stderr\": 0.015487512280277603\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047432\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \"acc_stderr\": 0.012374608490929561\n }\n}\n```", "repo_url": "https://huggingface.co/Karko/Proctora", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|arc:challenge|25_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|gsm8k|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hellaswag|10_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T06-36-32.426186.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T06-36-32.426186.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T06-36-32.426186.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T06-36-32.426186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T06-36-32.426186.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T06-36-32.426186.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["**/details_harness|winogrande|5_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T06-36-32.426186.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T06_36_32.426186", "path": ["results_2024-01-19T06-36-32.426186.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T06-36-32.426186.parquet"]}]}]}
2024-01-19T06:41:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Karko/Proctora Dataset automatically created during the evaluation run of model Karko/Proctora on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T06:36:32.426186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
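The loading snippet that the card refers to with "you can for instance do the following" is stripped from this plain-text rendering; reproduced from the dataset summary above, it loads the per-sample details of one evaluated task (here the Winogrande 5-shot configuration), with the "train" split pointing to the latest run:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of Karko/Proctora.
# "harness_winogrande_5" is one of the 63 configurations; the "train" split
# always points to the results of the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_Karko__Proctora",
    "harness_winogrande_5",
    split="train",
)
```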
[ "# Dataset Card for Evaluation run of Karko/Proctora\n\n\n\nDataset automatically created during the evaluation run of model Karko/Proctora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T06:36:32.426186(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Karko/Proctora\n\n\n\nDataset automatically created during the evaluation run of model Karko/Proctora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T06:36:32.426186(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f8c11c78bfb052b51537d0c0ad9ff6aca4470ca6
# Personal Finance QA Dataset

## Overview

Welcome to the Personal Finance QA Dataset! This collection consists of question-answer pairs curated to address various aspects of personal finance. Whether you're diving into budgeting, exploring investment strategies, or navigating credit decisions, this dataset provides a valuable resource for understanding key concepts in personal finance.

## Features

- **Input**: Questions related to personal finance.
- **Output**: Corresponding detailed answers offering insights and guidance.

## Motivation

The aim of this dataset is to empower users with a comprehensive resource for personal finance education. By providing a diverse set of questions and informative answers, we hope to facilitate learning and understanding in the realm of personal finance.

## Original Dataset

This collection is derived from the FinGPT FiQA QA dataset and has been meticulously processed to ensure relevance and coherence. The original dataset used for compilation can be found at https://huggingface.co/datasets/FinGPT/fingpt-fiqa_qa.

## Usage

Explore and utilize this dataset for tasks such as:
- Training and evaluating question-answering models.
- Analyzing common themes and concerns in personal finance.
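The Usage section above suggests training and evaluating question-answering models on these pairs. A minimal loading sketch, assuming the repository ID recorded for this card (`bilalRahib/fiqa-personal-finance-dataset`) and a `train` split with `input`/`output` columns (the split and column names are assumptions, not confirmed by the card):

```python
from datasets import load_dataset

# Repository ID taken from this record; the split and column names are assumptions.
ds = load_dataset("bilalRahib/fiqa-personal-finance-dataset", split="train")

# Inspect a few question-answer pairs.
for example in ds.select(range(3)):
    print("Q:", example["input"])
    print("A:", example["output"])
```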
bilalRahib/fiqa-personal-finance-dataset
[ "region:us" ]
2024-01-19T06:56:32+00:00
{}
2024-01-19T07:05:47+00:00
[]
[]
TAGS #region-us
# Personal Finance QA Dataset ## Overview Welcome to the Personal Finance QA Dataset! This collection consists of question-answer pairs curated to address various aspects of personal finance. Whether you're diving into budgeting, exploring investment strategies, or navigating credit decisions, this dataset provides a valuable resource for understanding key concepts in personal finance. ## Features - Input: Questions related to personal finance. - Output: Corresponding detailed answers offering insights and guidance. ## Motivation The aim of this dataset is to empower users with a comprehensive resource for personal finance education. By providing a diverse set of questions and informative answers, we hope to facilitate learning and understanding in the realm of personal finance. ## Original Dataset This collection is derived from a fingpt and has been meticulously processed to ensure relevance and coherence. The original dataset used for compilation can be found (URL ## Usage Explore and utilize this dataset for tasks such as: - Training and evaluating question-answering models. - Analyzing common themes and concerns in personal finance.
[ "# Personal Finance QA Dataset", "## Overview\n\nWelcome to the Personal Finance QA Dataset! This collection consists of question-answer pairs curated to address various aspects of personal finance. Whether you're diving into budgeting, exploring investment strategies, or navigating credit decisions, this dataset provides a valuable resource for understanding key concepts in personal finance.", "## Features\n\n- Input: Questions related to personal finance.\n- Output: Corresponding detailed answers offering insights and guidance.", "## Motivation\n\nThe aim of this dataset is to empower users with a comprehensive resource for personal finance education. By providing a diverse set of questions and informative answers, we hope to facilitate learning and understanding in the realm of personal finance.", "## Original Dataset\n\nThis collection is derived from a fingpt and has been meticulously processed to ensure relevance and coherence. The original dataset used for compilation can be found (URL", "## Usage\n\nExplore and utilize this dataset for tasks such as:\n- Training and evaluating question-answering models.\n- Analyzing common themes and concerns in personal finance." ]
[ "TAGS\n#region-us \n", "# Personal Finance QA Dataset", "## Overview\n\nWelcome to the Personal Finance QA Dataset! This collection consists of question-answer pairs curated to address various aspects of personal finance. Whether you're diving into budgeting, exploring investment strategies, or navigating credit decisions, this dataset provides a valuable resource for understanding key concepts in personal finance.", "## Features\n\n- Input: Questions related to personal finance.\n- Output: Corresponding detailed answers offering insights and guidance.", "## Motivation\n\nThe aim of this dataset is to empower users with a comprehensive resource for personal finance education. By providing a diverse set of questions and informative answers, we hope to facilitate learning and understanding in the realm of personal finance.", "## Original Dataset\n\nThis collection is derived from a fingpt and has been meticulously processed to ensure relevance and coherence. The original dataset used for compilation can be found (URL", "## Usage\n\nExplore and utilize this dataset for tasks such as:\n- Training and evaluating question-answering models.\n- Analyzing common themes and concerns in personal finance." ]
62b277ea1ab2509baea85f410905e6487afae918
# Dataset Card for "alpaca_farm-reward-model-deberta-v3-large-v2-re-preference" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference
[ "region:us" ]
2024-01-19T07:09:46+00:00
{"dataset_info": [{"config_name": "alpaca_gpt4_preference", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "preference", "dtype": "int64"}, {"name": "reward_model_prompt_format", "dtype": "string"}, {"name": "old_preference", "dtype": "int64"}, {"name": "reward_1", "dtype": "float64"}, {"name": "reward_2", "dtype": "float64"}], "splits": [{"name": "preference", "num_bytes": 15360054, "num_examples": 19472}], "download_size": 8238325, "dataset_size": 15360054}, {"config_name": "alpaca_instructions-42dot_LLM-SFT-1.3B", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "preference", "dtype": "int64"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "reward_model_prompt_format", "dtype": "string"}, {"name": "gen_prompt_format", "dtype": "string"}, {"name": "gen_kwargs", "struct": [{"name": "do_sample", "dtype": "bool"}, {"name": "max_new_tokens", "dtype": "int64"}, {"name": "pad_token_id", "dtype": "int64"}, {"name": "top_k", "dtype": "int64"}, {"name": "top_p", "dtype": "float64"}]}, {"name": "reward_1", "dtype": "float64"}, {"name": "reward_2", "dtype": "float64"}], "splits": [{"name": "preference", "num_bytes": 20420791, "num_examples": 20001}], "download_size": 9214981, "dataset_size": 20420791}, {"config_name": "alpaca_instructions-pythia-1.4b", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "preference", "dtype": "int64"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "reward_model_prompt_format", "dtype": "string"}, {"name": "gen_prompt_format", "dtype": "string"}, {"name": "gen_kwargs", "struct": [{"name": "do_sample", "dtype": "bool"}, {"name": "max_new_tokens", "dtype": "int64"}, {"name": "pad_token_id", "dtype": "int64"}, {"name": "top_k", "dtype": "int64"}, {"name": "top_p", "dtype": "float64"}]}, {"name": "reward_1", "dtype": "float64"}, {"name": "reward_2", "dtype": "float64"}], "splits": [{"name": "preference", "num_bytes": 21113808, "num_examples": 20001}], "download_size": 9676257, "dataset_size": 21113808}, {"config_name": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "preference", "dtype": "int64"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "reward_model_prompt_format", "dtype": "string"}, {"name": "gen_prompt_format", "dtype": "string"}, {"name": "gen_kwargs", "struct": [{"name": "do_sample", "dtype": "bool"}, {"name": "max_new_tokens", "dtype": "int64"}, {"name": "pad_token_id", "dtype": "int64"}, {"name": "top_k", "dtype": "int64"}, {"name": "top_p", "dtype": "float64"}]}, {"name": "reward_1", "dtype": "float64"}, {"name": "reward_2", "dtype": "float64"}], "splits": [{"name": "preference", "num_bytes": 25636770, "num_examples": 20001}], "download_size": 12193208, "dataset_size": 25636770}], "configs": [{"config_name": "alpaca_gpt4_preference", "data_files": [{"split": "preference", "path": "alpaca_gpt4_preference/preference-*"}]}, {"config_name": "alpaca_instructions-42dot_LLM-SFT-1.3B", "data_files": [{"split": "preference", 
"path": "alpaca_instructions-42dot_LLM-SFT-1.3B/preference-*"}]}, {"config_name": "alpaca_instructions-pythia-1.4b", "data_files": [{"split": "preference", "path": "alpaca_instructions-pythia-1.4b/preference-*"}]}, {"config_name": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500", "data_files": [{"split": "preference", "path": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*"}]}]}
2024-02-05T02:02:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "alpaca_farm-reward-model-deberta-v3-large-v2-re-preference" More Information needed
[ "# Dataset Card for \"alpaca_farm-reward-model-deberta-v3-large-v2-re-preference\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"alpaca_farm-reward-model-deberta-v3-large-v2-re-preference\"\n\nMore Information needed" ]
b0c175309b06c87ef6c5915715d3affa4b799f4f
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
AnalyticalRecon/lus_en
[ "task_categories:translation", "language:en", "license:mit", "region:us" ]
2024-01-19T07:36:54+00:00
{"language": ["en"], "license": "mit", "task_categories": ["translation"]}
2024-01-19T07:40:31+00:00
[]
[ "en" ]
TAGS #task_categories-translation #language-English #license-mit #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-translation #language-English #license-mit #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2e4a1c4016e2c920a3fd73271ce54e1e9f1dd143
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
hiuman/vietnamese_classification
[ "task_categories:text-classification", "size_categories:100K<n<1M", "language:vi", "region:us" ]
2024-01-19T07:45:13+00:00
{"language": ["vi"], "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "pretty_name": "v"}
2024-01-19T07:50:04+00:00
[]
[ "vi" ]
TAGS #task_categories-text-classification #size_categories-100K<n<1M #language-Vietnamese #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-Vietnamese #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d1c6dcdf789d818a4c998bd45dc33b50cf6cadd0
| Image | Web | Category | Date | Label |
|-------|-----|----------|------|-------|
| https://cdn.dnaindia.com/sites/default/files/styles/third/public/2020/10/13/931075-aarogya-setu-who.jpg | DNAINDIA | COVID-19 | Oct-20 | TRUE |
| https://cdn.dnaindia.com/sites/default/files/styles/third/public/2020/10/13/931068-pax.jpg | DNAINDIA | VIOLENCE | Oct-20 | TRUE |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/facebookimages.joeg-170x96.jpeg?aLWsCT.E.c_NF_vuleYzNIU7jGzDNEbo | INDIATODAY | COVID-19 | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/tea_cures_coronavirus-170x96.jpeg?aCrK65Ko2yaDK22kdDp.wWDdq.8d2Muh | INDIATODAY | MISLEADING | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/cover_pic-170x96.jpeg?tGgPId5Rjp7tFIFkKcZf7RBagynt8fBL | INDIATODAY | POLITICS | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/thumbnail_Cover_pic_fact_check-170x96.jpeg?Yq0H52ld0eDqwXVJxtb3H49eqFUoKVsm | INDIATODAY | COVID-19 | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/video/202003/screen-170x96.jpeg?PsP5gooufuxjVAfWtgFBKjvMn6szjroD | INDIATODAY | VIOLENCE | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/trump_false_quote_coronavirus-170x96.png?CDfhoJ6WNjyUNOwglikhPVUq7AjOCnly | INDIATODAY | VIOLENCE | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/trump_coronavirus-170x96.jpeg?f188.wThHwyb1ofZ6mOTKfpHDhI8UYVP | INDIATODAY | VIOLENCE | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/PTI12_11_2019_000140A-170x96.jpeg?LIGJ8qKvY4n3Eh9LqdDOWABeVECCgm6s | INDIATODAY | MISLEADING | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/trump_vaccine_coronavirus_roch-170x96.jpeg?MGbPFx5dO9vbviEiRX9YbSpgNAvG0FhD | INDIATODAY | VIOLENCE | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/india-170x96.jpeg?h9_SD_M3ZGGJlI0vvRBZzAlGWarUwA7k | INDIATODAY | MISLEADIND | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/factcheck-170x96.png?dMDIpKfLeKpva6kRdsQDVzzg8xoUwSvq | INDIATODAY | GOVERNMENT | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202003/poep-770x433_0-170x96.jpeg?KrsuBgMbawnmUZPPWFZaVzOvjDLMhJsy | INDIATODAY | VIOLENCE | Mar-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/video/202002/corona-170x96.png?Of.K1arM4yS7nQ7N7WevdhmmL0TTzaFq | INDIATODAY | GOVERNMENT | Feb-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202002/hfghfghf-170x96.png?HquLervdVkVHZVO1CLv.xpbYoMeWOIo8 | INDIATODAY | VIOLENCE | Feb-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202002/sdddgfdgfd-170x96.png?zleWM5T.fOhoeBM9yiw9zN93sE09zUzu | INDIATODAY | GOVERNMENT | Feb-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/australia_bushfire_koala_fact_-170x96.jpeg?T_dKnzeRMQ33BJ6Mg_2Wh7l3F8kExYjP | INDIATODAY | MISLEADING | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/missile-170x96.png?QySavTejHHpgOy_fvwavlIBZagS4PkqB | INDIATODAY | COVID-19 | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/finland_pm_six_hour_four_day_w-170x96.jpeg?7UOdb0xZtNAOwUCrCZMNG8zhzFNhxjJA | INDIATODAY | COVID-19 | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/hfghfg_0-170x96.jpeg?hWMTuMN8ro0hoZrJPxteEfygreGo1qEe | INDIATODAY | MISLEADING | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/all-170x96.png?1pyxpVLPJN7o0lN0R8kyGSaVA9BcwWmM | INDIATODAY | VIOLENCE | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/202001/FB_kangaroo-170x96.png?03uiCD7ADlHBehH9.ImRQh1IHw6Cp2Kg | INDIATODAY | VIOLENCE | Jan-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/201912/thumbnail_79092148_27536792546-170x96.jpeg?M3wCHj9Ejof5A9izw9xYp25v4t9EPzQl | INDIATODAY | MISLEADING | Dec-20 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/201911/cover-bill_770-170x96.jpeg?KfaWVStxkSfnCYwSitp.ZzG6hQZRyJt4 | INDIATODAY | GOVERNMENT | Nov-19 | Fake |
| https://akm-img-a-in.tosshub.com/indiatoday/images/story/201911/cover_pic_flesh-170x96.png?.DgygWbqzrvL0veP57pMPz61Xep4OUgp | INDIATODAY | COVID-19 | Nov-19 | Fake |
SSoudamini/IFND
[ "region:us" ]
2024-01-19T07:57:22+00:00
{}
2024-01-19T08:02:27+00:00
[]
[]
TAGS #region-us
Image Web Category Date Label URL DNAINDIA COVID-19 Oct-20 TRUE URL DNAINDIA VIOLENCE Oct-20 TRUE URL INDIATODAY COVID-19 Mar-20 Fake URL INDIATODAY MISLEADING Mar-20 Fake URL INDIATODAY POLITICS Mar-20 Fake URL INDIATODAY COVID-19 Mar-20 Fake URL INDIATODAY VIOLENCE Mar-20 Fake URL INDIATODAY VIOLENCE Mar-20 Fake URL INDIATODAY VIOLENCE Mar-20 Fake URL INDIATODAY MISLEADING Mar-20 Fake URL INDIATODAY VIOLENCE Mar-20 Fake URL INDIATODAY MISLEADIND Mar-20 Fake URL INDIATODAY GOVERNMENT Mar-20 Fake URL INDIATODAY VIOLENCE Mar-20 Fake URL INDIATODAY GOVERNMENT Feb-20 Fake URL INDIATODAY VIOLENCE Feb-20 Fake URL INDIATODAY GOVERNMENT Feb-20 Fake URL INDIATODAY MISLEADING Jan-20 Fake URL INDIATODAY COVID-19 Jan-20 Fake URL INDIATODAY COVID-19 Jan-20 Fake URL INDIATODAY MISLEADING Jan-20 Fake URL INDIATODAY VIOLENCE Jan-20 Fake URL INDIATODAY VIOLENCE Jan-20 Fake URL INDIATODAY MISLEADING Dec-20 Fake URL INDIATODAY GOVERNMENT Nov-19 Fake URL INDIATODAY COVID-19 Nov-19 Fake
[]
[ "TAGS\n#region-us \n" ]
2578867a251c1fb7d75e2e489433d909fd1c609a
# Dataset Card for Evaluation run of leveldevai/MarcBeagle-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [leveldevai/MarcBeagle-7B](https://huggingface.co/leveldevai/MarcBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_leveldevai__MarcBeagle-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T07:57:12.038982](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcBeagle-7B/blob/main/results_2024-01-19T07-57-12.038982.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6553973972126328, "acc_stderr": 0.032015444547493624, "acc_norm": 0.6546131355189969, "acc_norm_stderr": 0.03268571911497314, "mc1": 0.5556915544675642, "mc1_stderr": 0.017394586250743183, "mc2": 0.691835601558332, "mc2_stderr": 0.01511112035401396 }, "harness|arc:challenge|25": { "acc": 0.7064846416382252, "acc_stderr": 0.01330725044494112, "acc_norm": 0.7312286689419796, "acc_norm_stderr": 0.012955065963710696 }, "harness|hellaswag|10": { "acc": 0.7159928301135232, "acc_stderr": 0.004500186424443794, "acc_norm": 0.8842859988050189, "acc_norm_stderr": 0.0031922790394687452 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.0356760379963917, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.0356760379963917 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677171, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677171 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268545, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268545 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971125, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971125 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977934, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977934 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621112, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621112 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069363, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069363 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43687150837988825, "acc_stderr": 0.016588680864530626, "acc_norm": 0.43687150837988825, "acc_norm_stderr": 0.016588680864530626 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.023993501709042107, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.023993501709042107 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46936114732724904, "acc_stderr": 0.012746237711716634, "acc_norm": 0.46936114732724904, "acc_norm_stderr": 0.012746237711716634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.02783302387139967, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.02783302387139967 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5556915544675642, "mc1_stderr": 0.017394586250743183, "mc2": 0.691835601558332, "mc2_stderr": 0.01511112035401396 }, "harness|winogrande|5": { "acc": 0.8382004735595896, "acc_stderr": 0.010350128010292406 }, "harness|gsm8k|5": { "acc": 0.7119029567854435, "acc_stderr": 0.012474469737197926 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
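The "Latest results" block above reports per-task accuracies alongside the aggregate `all` entry. A small sketch of how headline numbers might be pulled out of such a results file, for example by averaging the `hendrycksTest` (MMLU) subtask accuracies; the filename is the one referenced in the card, the file layout is assumed, and the exact averaging used by the leaderboard may differ:

```python
import json

# Results file referenced in the card above; the top-level layout is assumed,
# so tolerate either a flat dictionary or one nested under a "results" key.
with open("results_2024-01-19T07-57-12.038982.json") as f:
    data = json.load(f)
results = data.get("results", data)

# Average the per-subject MMLU (hendrycksTest) accuracies.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)

print("ARC (acc_norm):      ", results["harness|arc:challenge|25"]["acc_norm"])
print("HellaSwag (acc_norm):", results["harness|hellaswag|10"]["acc_norm"])
print("MMLU (mean acc):     ", round(mmlu_avg, 4))
print("TruthfulQA (mc2):    ", results["harness|truthfulqa:mc|0"]["mc2"])
print("Winogrande (acc):    ", results["harness|winogrande|5"]["acc"])
print("GSM8K (acc):         ", results["harness|gsm8k|5"]["acc"])
```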
open-llm-leaderboard/details_leveldevai__MarcBeagle-7B
[ "region:us" ]
2024-01-19T07:59:32+00:00
{"pretty_name": "Evaluation run of leveldevai/MarcBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/MarcBeagle-7B](https://huggingface.co/leveldevai/MarcBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__MarcBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T07:57:12.038982](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcBeagle-7B/blob/main/results_2024-01-19T07-57-12.038982.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6553973972126328,\n \"acc_stderr\": 0.032015444547493624,\n \"acc_norm\": 0.6546131355189969,\n \"acc_norm_stderr\": 0.03268571911497314,\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.691835601558332,\n \"mc2_stderr\": 0.01511112035401396\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.01330725044494112,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7159928301135232,\n \"acc_stderr\": 0.004500186424443794,\n \"acc_norm\": 0.8842859988050189,\n \"acc_norm_stderr\": 0.0031922790394687452\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n 
\"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139967,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139967\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.691835601558332,\n \"mc2_stderr\": 0.01511112035401396\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7119029567854435,\n \"acc_stderr\": 0.012474469737197926\n }\n}\n```", "repo_url": 
"https://huggingface.co/leveldevai/MarcBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|arc:challenge|25_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|gsm8k|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hellaswag|10_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T07-57-12.038982.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T07-57-12.038982.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T07-57-12.038982.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T07-57-12.038982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T07-57-12.038982.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T07_57_12.038982", "path": ["**/details_harness|winogrande|5_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T07-57-12.038982.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T07_57_12.038982", "path": ["results_2024-01-19T07-57-12.038982.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T07-57-12.038982.parquet"]}]}]}
2024-01-19T07:59:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of leveldevai/MarcBeagle-7B Dataset automatically created during the evaluation run of model leveldevai/MarcBeagle-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T07:57:12.038982 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
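The card above says "you can for instance do the following:" but the accompanying code block did not survive the flattening of this field. A minimal sketch is given below, assuming the details repository for this run follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used elsewhere in this dump and exposes the `harness_winogrande_5` configuration listed in this record's metadata:

```python
from datasets import load_dataset

# Sketch only: the repository id is inferred from the record's repo_url and the
# naming pattern of the other evaluation-run repositories in this dump; the
# configuration name is taken from the record's "configs" list.
data = load_dataset(
    "open-llm-leaderboard/details_leveldevai__MarcBeagle-7B",
    "harness_winogrande_5",
    split="train",
)
print(data)
```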
[ "# Dataset Card for Evaluation run of leveldevai/MarcBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MarcBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T07:57:12.038982(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of leveldevai/MarcBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MarcBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T07:57:12.038982(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0522a40d064ceb381c6b20badb06f212164d929b
# Dataset Card for Evaluation run of bhavinjawade/SuperAligned-Jawade <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [bhavinjawade/SuperAligned-Jawade](https://huggingface.co/bhavinjawade/SuperAligned-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T08:19:34.151424](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade/blob/main/results_2024-01-19T08-19-34.151424.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6138237397234094, "acc_stderr": 0.03280388075307674, "acc_norm": 0.6150005706967872, "acc_norm_stderr": 0.033478603290902784, "mc1": 0.5507955936352509, "mc1_stderr": 0.01741294198611529, "mc2": 0.6916834493650746, "mc2_stderr": 0.015547003848871116 }, "harness|arc:challenge|25": { "acc": 0.6911262798634812, "acc_stderr": 0.013501770929344004, "acc_norm": 0.7158703071672355, "acc_norm_stderr": 0.013179442447653886 }, "harness|hellaswag|10": { "acc": 0.758514240191197, "acc_stderr": 0.004271094187097971, "acc_norm": 0.9057956582354113, "acc_norm_stderr": 0.0029151579677991948 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6490566037735849, "acc_stderr": 0.02937364625323469, "acc_norm": 0.6490566037735849, "acc_norm_stderr": 0.02937364625323469 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 
0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887249, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887249 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7322580645161291, "acc_stderr": 0.02518900666021238, "acc_norm": 0.7322580645161291, "acc_norm_stderr": 0.02518900666021238 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306446, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306446 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5717948717948718, "acc_stderr": 0.025088301454694834, "acc_norm": 0.5717948717948718, "acc_norm_stderr": 0.025088301454694834 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683515, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683515 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, 
"acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.017266742087630814, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.017266742087630814 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.033981108902946366, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.033981108902946366 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8312236286919831, "acc_stderr": 0.02438140683258622, "acc_norm": 0.8312236286919831, "acc_norm_stderr": 0.02438140683258622 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969637, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969637 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990945, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990945 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.036028141763926456, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.036028141763926456 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6809815950920245, "acc_stderr": 0.03661997551073836, "acc_norm": 0.6809815950920245, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077812, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077812 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720683, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.01480538447837116, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.01480538447837116 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6676300578034682, "acc_stderr": 0.02536116874968821, "acc_norm": 0.6676300578034682, "acc_norm_stderr": 0.02536116874968821 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41899441340782123, "acc_stderr": 0.016501579306861677, "acc_norm": 0.41899441340782123, "acc_norm_stderr": 0.016501579306861677 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6274509803921569, "acc_stderr": 0.027684181883302895, "acc_norm": 0.6274509803921569, "acc_norm_stderr": 0.027684181883302895 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6141479099678456, "acc_stderr": 0.02764814959975147, "acc_norm": 0.6141479099678456, "acc_norm_stderr": 0.02764814959975147 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.691358024691358, "acc_stderr": 0.025702640260603746, "acc_norm": 0.691358024691358, "acc_norm_stderr": 0.025702640260603746 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.029583452036284073, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.029583452036284073 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45697522816166886, "acc_stderr": 0.012722869501611419, "acc_norm": 0.45697522816166886, "acc_norm_stderr": 0.012722869501611419 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.02916312857067073, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.02916312857067073 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.019722058939618068, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.019722058939618068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6857142857142857, "acc_stderr": 0.02971932942241747, "acc_norm": 0.6857142857142857, "acc_norm_stderr": 0.02971932942241747 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7562189054726368, "acc_stderr": 0.030360490154014635, "acc_norm": 0.7562189054726368, "acc_norm_stderr": 0.030360490154014635 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.038786267710023595, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.03218093795602357, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.5507955936352509, "mc1_stderr": 0.01741294198611529, "mc2": 0.6916834493650746, "mc2_stderr": 0.015547003848871116 }, "harness|winogrande|5": { "acc": 0.8382004735595896, "acc_stderr": 0.010350128010292404 }, "harness|gsm8k|5": { "acc": 0.4920394238059136, "acc_stderr": 0.01377073906313537 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
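The card explains that an additional "results" configuration stores the aggregated scores and that every configuration defines a "latest" split pointing at the most recent run. A minimal sketch of reading those aggregates, assuming this repository exposes the same "results" configuration and "latest" split as the other evaluation runs in this dump:

```python
from datasets import load_dataset

# Sketch only: the "results" configuration and "latest" split names are assumed
# to match the sibling evaluation-run repositories shown earlier in this dump.
results = load_dataset(
    "open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade",
    "results",
    split="latest",
)
# Each row holds the aggregated metrics for one evaluation run.
print(results[0])
```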
open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade
[ "region:us" ]
2024-01-19T08:21:49+00:00
{"pretty_name": "Evaluation run of bhavinjawade/SuperAligned-Jawade", "dataset_summary": "Dataset automatically created during the evaluation run of model [bhavinjawade/SuperAligned-Jawade](https://huggingface.co/bhavinjawade/SuperAligned-Jawade) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T08:19:34.151424](https://huggingface.co/datasets/open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade/blob/main/results_2024-01-19T08-19-34.151424.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6138237397234094,\n \"acc_stderr\": 0.03280388075307674,\n \"acc_norm\": 0.6150005706967872,\n \"acc_norm_stderr\": 0.033478603290902784,\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.6916834493650746,\n \"mc2_stderr\": 0.015547003848871116\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344004,\n \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.758514240191197,\n \"acc_stderr\": 0.004271094187097971,\n \"acc_norm\": 0.9057956582354113,\n \"acc_norm_stderr\": 0.0029151579677991948\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887249,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887249\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306446,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306446\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630814,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630814\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.02438140683258622,\n \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.02438140683258622\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7803320561941252,\n \"acc_stderr\": 0.01480538447837116,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.01480538447837116\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968821,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968821\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.02764814959975147,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.02764814959975147\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284073,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284073\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241747,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241747\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014635,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014635\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.01741294198611529,\n \"mc2\": 0.6916834493650746,\n \"mc2_stderr\": 0.015547003848871116\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292404\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4920394238059136,\n \"acc_stderr\": 
0.01377073906313537\n }\n}\n```", "repo_url": "https://huggingface.co/bhavinjawade/SuperAligned-Jawade", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-34.151424.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-34.151424.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-34.151424.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-34.151424.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-34.151424.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T08_19_34.151424", "path": ["**/details_harness|winogrande|5_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T08-19-34.151424.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T08_19_34.151424", "path": ["results_2024-01-19T08-19-34.151424.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T08-19-34.151424.parquet"]}]}]}
2024-01-19T08:22:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bhavinjawade/SuperAligned-Jawade Dataset automatically created during the evaluation run of model bhavinjawade/SuperAligned-Jawade on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T08:19:34.151424 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of bhavinjawade/SuperAligned-Jawade\n\n\n\nDataset automatically created during the evaluation run of model bhavinjawade/SuperAligned-Jawade on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:19:34.151424 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bhavinjawade/SuperAligned-Jawade\n\n\n\nDataset automatically created during the evaluation run of model bhavinjawade/SuperAligned-Jawade on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:19:34.151424 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
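In the plain-text renderings above, the sentence "To load the details from a run, you can for instance do the following:" is left dangling because the fenced code block that follows it in the original card was stripped. As a minimal sketch, assuming the standard `datasets` library and the configuration and split names listed in this record's metadata, the loading calls look like this:

```python
from datasets import load_dataset

# Per-task details: any configuration listed in the metadata above works here,
# e.g. the 5-shot Winogrande details. The "latest" split points to the newest
# results; the timestamped split pins a specific evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade",
    "harness_winogrande_5",
    split="latest",
)

# Aggregated metrics for the whole run live in the separate "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_bhavinjawade__SuperAligned-Jawade",
    "results",
    split="latest",
)
```

The repository id, the configuration names ("harness_winogrande_5", "results"), and the "latest" split all appear verbatim in the metadata above; only the variable names here are illustrative.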
bda684acbbb2abc709b63f147573f55e94964d13
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T08:19:43.491405](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51/blob/main/results_2024-01-19T08-19-43.491405.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5956914302139066, "acc_stderr": 0.03334872123940361, "acc_norm": 0.6014998966219293, "acc_norm_stderr": 0.03404340642551802, "mc1": 0.2802937576499388, "mc1_stderr": 0.015723139524608767, "mc2": 0.4145856205202453, "mc2_stderr": 0.015440318842858623 }, "harness|arc:challenge|25": { "acc": 0.5648464163822525, "acc_stderr": 0.01448798619718604, "acc_norm": 0.5972696245733788, "acc_norm_stderr": 0.014332236306790149 }, "harness|hellaswag|10": { "acc": 0.6258713403704441, "acc_stderr": 0.0048290815328265015, "acc_norm": 0.8252340171280621, "acc_norm_stderr": 0.003789906792644689 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.02890159361241178, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.02890159361241178 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc":
0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5191489361702127, "acc_stderr": 0.032662042990646796, "acc_norm": 0.5191489361702127, "acc_norm_stderr": 0.032662042990646796 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.046151869625837026, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.046151869625837026 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.02497695405315524, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.02497695405315524 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7129032258064516, "acc_stderr": 0.025736542745594525, "acc_norm": 0.7129032258064516, "acc_norm_stderr": 0.025736542745594525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270285, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270285 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7875647668393783, "acc_stderr": 0.02951928261681723, "acc_norm": 0.7875647668393783, "acc_norm_stderr": 0.02951928261681723 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5820512820512821, "acc_stderr": 0.02500732988246121, "acc_norm": 0.5820512820512821, "acc_norm_stderr": 0.02500732988246121 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524565, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524565 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8110091743119267, "acc_stderr": 0.016785481159203624, "acc_norm": 0.8110091743119267, "acc_norm_stderr": 0.016785481159203624 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.03385177976044811, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044811 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7303921568627451, "acc_stderr": 0.031145570659486782, "acc_norm": 0.7303921568627451, "acc_norm_stderr": 0.031145570659486782 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.729957805907173, "acc_stderr": 0.028900721906293426, "acc_norm": 0.729957805907173, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724147, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724147 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8247863247863247, "acc_stderr": 0.024904439098918242, "acc_norm": 0.8247863247863247, "acc_norm_stderr": 0.024904439098918242 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.776500638569604, "acc_stderr": 0.01489723522945071, "acc_norm": 0.776500638569604, "acc_norm_stderr": 0.01489723522945071 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.684971098265896, "acc_stderr": 0.025009313790069706, "acc_norm": 0.684971098265896, "acc_norm_stderr": 0.025009313790069706 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38324022346368714, "acc_stderr": 0.016260159604429128, "acc_norm": 0.38324022346368714, "acc_norm_stderr": 0.016260159604429128 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6535947712418301, "acc_stderr": 0.02724561304721536, "acc_norm": 0.6535947712418301, "acc_norm_stderr": 0.02724561304721536 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6688102893890675, "acc_stderr": 0.026730620728004906, "acc_norm": 0.6688102893890675, "acc_norm_stderr": 0.026730620728004906 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6728395061728395, "acc_stderr": 0.026105673861409828, 
"acc_norm": 0.6728395061728395, "acc_norm_stderr": 0.026105673861409828 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4397163120567376, "acc_stderr": 0.029609912075594113, "acc_norm": 0.4397163120567376, "acc_norm_stderr": 0.029609912075594113 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4211212516297262, "acc_stderr": 0.012610325733489905, "acc_norm": 0.4211212516297262, "acc_norm_stderr": 0.012610325733489905 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6143790849673203, "acc_stderr": 0.01969145905235403, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.01969145905235403 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.030299506562154185, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.030299506562154185 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.02796267760476892, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.02796267760476892 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.2802937576499388, "mc1_stderr": 0.015723139524608767, "mc2": 0.4145856205202453, "mc2_stderr": 0.015440318842858623 }, "harness|winogrande|5": { "acc": 0.7719021310181531, "acc_stderr": 0.01179301581766359 }, "harness|gsm8k|5": { "acc": 0.30856709628506446, "acc_stderr": 0.012723076049815884 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
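As a follow-up to the loading snippet above, here is a minimal sketch of how the aggregated scores could be inspected once downloaded. It relies only on the "results" configuration and "latest" split described in this card and the standard `datasets` API; the conversion to pandas and the column printout are illustrative assumptions, not part of the original card.

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the most
# recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51",
    "results",
    split="latest",
)

# Convert to a pandas DataFrame to browse the stored metrics.
df = results.to_pandas()
print(df.columns)
```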
open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51
[ "region:us" ]
2024-01-19T08:22:02+00:00
{"pretty_name": "Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T08:19:43.491405](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppo_5e7step_51/blob/main/results_2024-01-19T08-19-43.491405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5956914302139066,\n \"acc_stderr\": 0.03334872123940361,\n \"acc_norm\": 0.6014998966219293,\n \"acc_norm_stderr\": 0.03404340642551802,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4145856205202453,\n \"mc2_stderr\": 0.015440318842858623\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.01448798619718604,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6258713403704441,\n \"acc_stderr\": 0.0048290815328265015,\n \"acc_norm\": 0.8252340171280621,\n \"acc_norm_stderr\": 0.003789906792644689\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 
0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315524,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315524\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n \"acc_stderr\": 0.025736542745594525,\n \"acc_norm\": 0.7129032258064516,\n \"acc_norm_stderr\": 0.025736542745594525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n \"acc_norm\": 
0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246121,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246121\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n \"acc_stderr\": 0.024904439098918242,\n \"acc_norm\": 0.8247863247863247,\n \"acc_norm_stderr\": 0.024904439098918242\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004906,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004906\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594113,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594113\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4145856205202453,\n \"mc2_stderr\": 0.015440318842858623\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.01179301581766359\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.30856709628506446,\n \"acc_stderr\": 0.012723076049815884\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-43.491405.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-43.491405.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-43.491405.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-43.491405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-19-43.491405.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["**/details_harness|winogrande|5_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T08-19-43.491405.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T08_19_43.491405", "path": ["results_2024-01-19T08-19-43.491405.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T08-19-43.491405.parquet"]}]}]}
2024-01-19T08:22:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51 Dataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T08:19:43.491405 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:19:43.491405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppo_5e7step_51 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:19:43.491405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b6f4abe4d7ccab91bb79f1ce457dc8eeccf02ad4
Official Repo of AesBench: An Expert Benchmark for Multimodal Large Language Models on Image Aesthetics Perception. This is the test set of AesBench. See our GitHub Repo [Here](https://github.com/yipoh/AesBench) for more details.
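The card gives no usage snippet; below is a minimal, hedged sketch for pulling the test-set files locally. The repo id `qyuan/AesBench` is taken from this record, but the card does not describe the file layout, so the downloaded directory should be inspected before parsing.

```python
from huggingface_hub import snapshot_download

# Download the whole dataset repository; the card does not document which files
# make up the test set, so list the returned directory to find out.
local_dir = snapshot_download(repo_id="qyuan/AesBench", repo_type="dataset")
print(local_dir)
```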
qyuan/AesBench
[ "license:mit", "region:us" ]
2024-01-19T08:26:09+00:00
{"license": "mit"}
2024-01-19T09:00:27+00:00
[]
[]
TAGS #license-mit #region-us
Official Repo of AesBench: An Expert Benchmark for Multimodal Large Language Models on Image Aesthetics Perception. This is the test set of AesBench. See our GitHub Repo Here for more details.
[]
[ "TAGS\n#license-mit #region-us \n" ]
1b047ff9bdcdb7c43cb663ce1582ed5ce3b8face
# Dataset Card for Evaluation run of HanNayeoniee/LHK <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [HanNayeoniee/LHK](https://huggingface.co/HanNayeoniee/LHK) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_HanNayeoniee__LHK", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T08:36:46.255504](https://huggingface.co/datasets/open-llm-leaderboard/details_HanNayeoniee__LHK/blob/main/results_2024-01-19T08-36-46.255504.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6522091397950804, "acc_stderr": 0.031753247805341805, "acc_norm": 0.6548324288435591, "acc_norm_stderr": 0.032386658093364704, "mc1": 0.4186046511627907, "mc1_stderr": 0.017270015284476848, "mc2": 0.591200906179408, "mc2_stderr": 0.01538726882622229 }, "harness|arc:challenge|25": { "acc": 0.6279863481228669, "acc_stderr": 0.014124597881844461, "acc_norm": 0.6638225255972696, "acc_norm_stderr": 0.013804855026205766 }, "harness|hellaswag|10": { "acc": 0.657239593706433, "acc_stderr": 0.004736621698861176, "acc_norm": 0.844851623182633, "acc_norm_stderr": 0.0036130615166899823 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353227, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353227 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361073, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361073 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 
0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383888, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383888 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5914893617021276, "acc_stderr": 0.032134180267015755, "acc_norm": 0.5914893617021276, "acc_norm_stderr": 0.032134180267015755 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531003, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531003 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.0442626668137991, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.0442626668137991 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723292, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723292 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.45320197044334976, "acc_stderr": 0.03502544650845872, "acc_norm": 0.45320197044334976, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8181818181818182, "acc_stderr": 0.030117688929503575, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.030117688929503575 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8484848484848485, "acc_stderr": 0.025545650426603627, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.025545650426603627 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.024321738484602354, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.024321738484602354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634342, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634342 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, 
"acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5648148148148148, "acc_stderr": 0.033812000056435254, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.033812000056435254 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.02646056956124064, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.02646056956124064 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8312236286919831, "acc_stderr": 0.024381406832586234, "acc_norm": 0.8312236286919831, "acc_norm_stderr": 0.024381406832586234 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.01987565502786747, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.01987565502786747 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903341, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903341 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508283, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508283 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2905027932960894, "acc_stderr": 0.01518384430720616, "acc_norm": 0.2905027932960894, "acc_norm_stderr": 0.01518384430720616 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7647058823529411, "acc_stderr": 0.024288619466046102, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.024288619466046102 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7746913580246914, "acc_stderr": 0.023246202647819746, "acc_norm": 0.7746913580246914, "acc_norm_stderr": 0.023246202647819746 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.5070921985815603, "acc_stderr": 0.02982449855912901, "acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4784876140808344, "acc_stderr": 0.01275841094103892, "acc_norm": 0.4784876140808344, "acc_norm_stderr": 0.01275841094103892 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7242647058823529, "acc_stderr": 0.027146271936625166, "acc_norm": 0.7242647058823529, "acc_norm_stderr": 0.027146271936625166 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.01899970738316267, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.01899970738316267 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960238, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960238 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197768, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197768 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03126781714663179, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.4186046511627907, "mc1_stderr": 0.017270015284476848, "mc2": 0.591200906179408, "mc2_stderr": 0.01538726882622229 }, "harness|winogrande|5": { "acc": 0.8097868981846882, "acc_stderr": 0.01103033579861744 }, "harness|gsm8k|5": { "acc": 0.5633055344958302, "acc_stderr": 0.01366164978090549 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_HanNayeoniee__LHK
[ "region:us" ]
2024-01-19T08:39:03+00:00
{"pretty_name": "Evaluation run of HanNayeoniee/LHK", "dataset_summary": "Dataset automatically created during the evaluation run of model [HanNayeoniee/LHK](https://huggingface.co/HanNayeoniee/LHK) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HanNayeoniee__LHK\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T08:36:46.255504](https://huggingface.co/datasets/open-llm-leaderboard/details_HanNayeoniee__LHK/blob/main/results_2024-01-19T08-36-46.255504.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522091397950804,\n \"acc_stderr\": 0.031753247805341805,\n \"acc_norm\": 0.6548324288435591,\n \"acc_norm_stderr\": 0.032386658093364704,\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.591200906179408,\n \"mc2_stderr\": 0.01538726882622229\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205766\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n \"acc_stderr\": 0.004736621698861176,\n \"acc_norm\": 0.844851623182633,\n \"acc_norm_stderr\": 0.0036130615166899823\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n 
\"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503575,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503575\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n 
\"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634342,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634342\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586234,\n \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586234\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786747,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786747\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 
0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n \"acc_stderr\": 0.01518384430720616,\n \"acc_norm\": 0.2905027932960894,\n \"acc_norm_stderr\": 0.01518384430720616\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.01275841094103892,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.01275841094103892\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.017270015284476848,\n \"mc2\": 0.591200906179408,\n \"mc2_stderr\": 0.01538726882622229\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5633055344958302,\n \"acc_stderr\": 0.01366164978090549\n }\n}\n```", "repo_url": "https://huggingface.co/HanNayeoniee/LHK", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-36-46.255504.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["**/details_harness|winogrande|5_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T08-36-46.255504.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T08_36_46.255504", "path": ["results_2024-01-19T08-36-46.255504.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T08-36-46.255504.parquet"]}]}]}
2024-01-19T08:39:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of HanNayeoniee/LHK Dataset automatically created during the evaluation run of model HanNayeoniee/LHK on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T08:36:46.255504 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
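The loading snippet that the card refers to ("you can for instance do the following") did not survive this flattened rendering. Below is a minimal sketch; the repository id is an assumption inferred from the `details_<org>__<model>` naming pattern used by the other evaluation-run records in this dump, and `harness_winogrande_5` is picked arbitrarily as one of the 63 configurations:

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the "details_<org>__<model>" naming pattern
# used by the other evaluation-run records in this dump; verify it before use.
data = load_dataset(
    "open-llm-leaderboard/details_HanNayeoniee__LHK",
    "harness_winogrande_5",   # one of the 63 per-task configurations
    split="latest",           # per the config metadata, "latest" points at the most recent run
)
print(data[0])
```

Any of the per-task configurations listed in the metadata above (for example `harness_hendrycksTest_college_physics_5`) can be swapped in the same way.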
[ "# Dataset Card for Evaluation run of HanNayeoniee/LHK\n\n\n\nDataset automatically created during the evaluation run of model HanNayeoniee/LHK on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:36:46.255504(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of HanNayeoniee/LHK\n\n\n\nDataset automatically created during the evaluation run of model HanNayeoniee/LHK on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:36:46.255504(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7720e38ac7c79f7163e880cc11f9aa4e134c79e8
Dataset Summary --------------- Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise. For more detailed information, please refer to the paper. This dataset is a processed form of the "dair-ai/emotion" dataset [https://huggingface.co/datasets/dair-ai/emotion]. In this one, I have amplified the samples for the minority classes so that all the emotion classes have an approximately equal sample count. There is another dataset with duplicate samples here: https://huggingface.co/datasets/manojkumarvohra/replicated_emotions. This one differs in that it uses grammatical variations of the emotional statements.
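A minimal sketch of checking the class balance described above, assuming the "text"/"labels" schema recorded in this record's metadata:

```python
from collections import Counter

from datasets import load_dataset

# Sketch: verify that the amplified classes are roughly balanced.
ds = load_dataset("manojkumarvohra/amplified_emotions", split="train")

label_names = ds.features["labels"].names  # sadness, joy, love, anger, fear, surprise
counts = Counter(label_names[i] for i in ds["labels"])
print(counts)  # each emotion should appear with a roughly equal count
```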
manojkumarvohra/amplified_emotions
[ "language:en", "region:us" ]
2024-01-19T08:42:55+00:00
{"language": ["en"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "labels", "dtype": {"class_label": {"names": {"0": "sadness", "1": "joy", "2": "love", "3": "anger", "4": "fear", "5": "surprise"}}}}], "splits": [{"name": "train", "num_bytes": 3335026, "num_examples": 30295}, {"name": "validation", "num_bytes": 214695, "num_examples": 2000}, {"name": "test", "num_bytes": 217173, "num_examples": 2000}], "download_size": 1756592, "dataset_size": 3766894}}
2024-01-21T09:50:52+00:00
[]
[ "en" ]
TAGS #language-English #region-us
Dataset Summary --------------- Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise. For more detailed information, please refer to the paper. This dataset is a processed form of the "dair-ai/emotion" dataset [URL]. In this one, I have amplified the samples for the minority classes so that all the emotion classes have an approximately equal sample count. There is another dataset with duplicate samples here: URL. This one differs in that it uses grammatical variations of the emotional statements.
[]
[ "TAGS\n#language-English #region-us \n" ]
70d602a53e3a5e5ef2332d10c15e9b5f1c11ecd7
# Italian version of the HellaSwag Dataset The dataset has been automatically translated using [Argos Translate](https://github.com/argosopentech/argos-translate) v. 1.9.1 ### Citation Information ``` @misc{basile2023llamantino, title={LLaMAntino: LLaMA 2 Models for Effective Text Generation in Italian Language}, author={Pierpaolo Basile and Elio Musacchio and Marco Polignano and Lucia Siciliani and Giuseppe Fiameni and Giovanni Semeraro}, year={2023}, eprint={2312.09993}, archivePrefix={arXiv}, primaryClass={cs.CL} } @inproceedings{zellers2019hellaswag, title={HellaSwag: Can a Machine Really Finish Your Sentence?}, author={Zellers, Rowan and Holtzman, Ari and Bisk, Yonatan and Farhadi, Ali and Choi, Yejin}, booktitle ={Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics}, year={2019} } ``` <br> <br> </hr> # Original English version of the "hellaswag" dataset ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Additional Information](#additional-information) - [Licensing Information](#licensing-information) ## Dataset Description - **Homepage:** [https://rowanzellers.com/hellaswag/](https://rowanzellers.com/hellaswag/) - **Repository:** [https://github.com/rowanz/hellaswag/](https://github.com/rowanz/hellaswag/) - **Paper:** [HellaSwag: Can a Machine Really Finish Your Sentence?](https://arxiv.org/abs/1905.07830) - **Size of downloaded dataset files:** 71.49 MB - **Size of the generated dataset:** 65.32 MB - **Total amount of disk used:** 136.81 MB ### Dataset Summary HellaSwag: Can a Machine Really Finish Your Sentence? is a new dataset for commonsense NLI. A paper was published at ACL2019. ### Languages EN - ITA ## Dataset Structure ### Data Instances #### default - **Size of downloaded dataset files:** 71.49 MB - **Size of the generated dataset:** 65.32 MB - **Total amount of disk used:** 136.81 MB An example of 'train' looks as follows. ``` This example was too long and was cropped: { "activity_label": "Removing ice from car", "ctx": "Then, the man writes over the snow covering the window of a car, and a woman wearing winter clothes smiles. then", "ctx_a": "Then, the man writes over the snow covering the window of a car, and a woman wearing winter clothes smiles.", "ctx_b": "then", "endings": "[\", the man adds wax to the windshield and cuts it.\", \", a person board a ski lift, while two men supporting the head of the per...", "ind": 4, "label": "3", "source_id": "activitynet~v_-1IBHYS3L-Y", "split": "train", "split_type": "indomain" } ``` ### Data Fields The data fields are the same among all splits. #### default - `ind`: a `int32` feature. - `activity_label`: a `string` feature. - `ctx_a`: a `string` feature. - `ctx_b`: a `string` feature. - `ctx`: a `string` feature. - `endings`: a `list` of `string` features. - `source_id`: a `string` feature. - `split`: a `string` feature. - `split_type`: a `string` feature. - `label`: a `string` feature.
### Data Splits | name |train|validation|test | |-------|----:|---------:|----:| |default|39905| 10042|10003| ### Licensing Information MIT https://github.com/rowanz/hellaswag/blob/master/LICENSE ### Contributions Thanks to [@albertvillanova](https://github.com/albertvillanova), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset.
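A minimal sketch of loading the translated data and inspecting one instance, assuming the default configuration and the field names listed under Data Fields above:

```python
from datasets import load_dataset

# Sketch: load the Italian HellaSwag validation split and look at one example.
ds = load_dataset("swap-uniba/hellaswag_ita", split="validation")

example = ds[0]
print(example["activity_label"])  # activity the context was drawn from
print(example["ctx"])             # translated context to be completed
print(example["endings"])         # list of candidate endings
print(example["label"])           # index of the correct ending, stored as a string
```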
swap-uniba/hellaswag_ita
[ "task_categories:question-answering", "task_categories:text-generation", "language:it", "ita", "llms", "llamantino", "evaluation", "arxiv:2312.09993", "arxiv:1905.07830", "region:us" ]
2024-01-19T08:44:32+00:00
{"language": ["it"], "task_categories": ["question-answering", "text-generation"], "paperswithcode_id": "hellaswag", "pretty_name": "HellaSwag IT", "dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 43232624, "num_examples": 39905}, {"name": "test", "num_bytes": 10791853, "num_examples": 10003}, {"name": "validation", "num_bytes": 11175717, "num_examples": 10042}], "download_size": 71494896, "dataset_size": 65200194}, "tags": ["ita", "llms", "llamantino", "evaluation"]}
2024-01-19T08:51:42+00:00
[ "2312.09993", "1905.07830" ]
[ "it" ]
TAGS #task_categories-question-answering #task_categories-text-generation #language-Italian #ita #llms #llamantino #evaluation #arxiv-2312.09993 #arxiv-1905.07830 #region-us
Italian version of the HellaSwag Dataset ======================================== The dataset has been automatically translated using Argos Translate v. 1.9.1 Original English version of the "hellaswag" dataset =================================================== Table of Contents ----------------- * Dataset Description + Dataset Summary + Languages * Dataset Structure + Data Instances + Data Fields + Data Splits * Additional Information + Licensing Information Dataset Description ------------------- * Homepage: URL * Repository: URL * Paper: HellaSwag: Can a Machine Really Finish Your Sentence? * Size of downloaded dataset files: 71.49 MB * Size of the generated dataset: 65.32 MB * Total amount of disk used: 136.81 MB ### Dataset Summary HellaSwag: Can a Machine Really Finish Your Sentence? is a new dataset for commonsense NLI. A paper was published at ACL2019. ### Languages EN - ITA Dataset Structure ----------------- ### Data Instances #### default * Size of downloaded dataset files: 71.49 MB * Size of the generated dataset: 65.32 MB * Total amount of disk used: 136.81 MB An example of 'train' looks as follows. ### Data Fields The data fields are the same among all splits. #### default * 'ind': a 'int32' feature. * 'activity\_label': a 'string' feature. * 'ctx\_a': a 'string' feature. * 'ctx\_b': a 'string' feature. * 'ctx': a 'string' feature. * 'endings': a 'list' of 'string' features. * 'source\_id': a 'string' feature. * 'split': a 'string' feature. * 'split\_type': a 'string' feature. * 'label': a 'string' feature. ### Data Splits ### Licensing Information MIT URL ### Contributions Thanks to @albertvillanova, @mariamabarham, @thomwolf, @patrickvonplaten, @lewtun for adding this dataset.
[ "### Dataset Summary\n\n\nHellaSwag: Can a Machine Really Finish Your Sentence? is a new dataset for commonsense NLI. A paper was published at ACL2019.", "### Languages\n\n\nEN - ITA\n\n\nDataset Structure\n-----------------", "### Data Instances", "#### default\n\n\n* Size of downloaded dataset files: 71.49 MB\n* Size of the generated dataset: 65.32 MB\n* Total amount of disk used: 136.81 MB\n\n\nAn example of 'train' looks as follows.", "### Data Fields\n\n\nThe data fields are the same among all splits.", "#### default\n\n\n* 'ind': a 'int32' feature.\n* 'activity\\_label': a 'string' feature.\n* 'ctx\\_a': a 'string' feature.\n* 'ctx\\_b': a 'string' feature.\n* 'ctx': a 'string' feature.\n* 'endings': a 'list' of 'string' features.\n* 'source\\_id': a 'string' feature.\n* 'split': a 'string' feature.\n* 'split\\_type': a 'string' feature.\n* 'label': a 'string' feature.", "### Data Splits", "### Licensing Information\n\n\nMIT URL", "### Contributions\n\n\nThanks to @albertvillanova, @mariamabarham, @thomwolf, @patrickvonplaten, @lewtun for adding this dataset." ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #language-Italian #ita #llms #llamantino #evaluation #arxiv-2312.09993 #arxiv-1905.07830 #region-us \n", "### Dataset Summary\n\n\nHellaSwag: Can a Machine Really Finish Your Sentence? is a new dataset for commonsense NLI. A paper was published at ACL2019.", "### Languages\n\n\nEN - ITA\n\n\nDataset Structure\n-----------------", "### Data Instances", "#### default\n\n\n* Size of downloaded dataset files: 71.49 MB\n* Size of the generated dataset: 65.32 MB\n* Total amount of disk used: 136.81 MB\n\n\nAn example of 'train' looks as follows.", "### Data Fields\n\n\nThe data fields are the same among all splits.", "#### default\n\n\n* 'ind': a 'int32' feature.\n* 'activity\\_label': a 'string' feature.\n* 'ctx\\_a': a 'string' feature.\n* 'ctx\\_b': a 'string' feature.\n* 'ctx': a 'string' feature.\n* 'endings': a 'list' of 'string' features.\n* 'source\\_id': a 'string' feature.\n* 'split': a 'string' feature.\n* 'split\\_type': a 'string' feature.\n* 'label': a 'string' feature.", "### Data Splits", "### Licensing Information\n\n\nMIT URL", "### Contributions\n\n\nThanks to @albertvillanova, @mariamabarham, @thomwolf, @patrickvonplaten, @lewtun for adding this dataset." ]
ab63aa796d3bd67cb2bd0c965273948a99263dfe
# Dataset Card for Evaluation run of leveldevai/MarcDareBeagle-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [leveldevai/MarcDareBeagle-7B](https://huggingface.co/leveldevai/MarcDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T08:53:36.117087](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B/blob/main/results_2024-01-19T08-53-36.117087.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6561257128939425, "acc_stderr": 0.03198667178637761, "acc_norm": 0.6554096772583735, "acc_norm_stderr": 0.03265522262038939, "mc1": 0.5410036719706243, "mc1_stderr": 0.017444544447661206, "mc2": 0.6808641879386289, "mc2_stderr": 0.015124785314472101 }, "harness|arc:challenge|25": { "acc": 0.6988054607508533, "acc_stderr": 0.013406741767847632, "acc_norm": 0.7209897610921502, "acc_norm_stderr": 0.01310678488360133 }, "harness|hellaswag|10": { "acc": 0.7101175064728141, "acc_stderr": 0.004527804016253783, "acc_norm": 0.8832901812387971, "acc_norm_stderr": 0.00320418007294237 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700918, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.03533133389323657, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.03533133389323657 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816508, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086923996, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086923996 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.04451807959055328, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.04451807959055328 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.02302589961718872, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.02302589961718872 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.02407869658063548, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.02407869658063548 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.02889774874113115, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.02889774874113115 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 
0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250454, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250454 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621115, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621115 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494733, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4312849162011173, "acc_stderr": 0.016563829399047707, "acc_norm": 0.4312849162011173, "acc_norm_stderr": 0.016563829399047707 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959614, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959614 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4654498044328553, "acc_stderr": 0.012739711554045702, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.012739711554045702 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396556, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396556 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5410036719706243, "mc1_stderr": 0.017444544447661206, "mc2": 0.6808641879386289, "mc2_stderr": 0.015124785314472101 }, "harness|winogrande|5": { "acc": 0.8318863456985004, "acc_stderr": 0.01051033695416674 }, "harness|gsm8k|5": { "acc": 0.7179681576952237, "acc_stderr": 0.012394926584335695 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
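The card above notes that the aggregated metrics live in the "results" configuration. A minimal sketch of reading them, assuming the "latest" split listed in the configuration metadata for this repository:

```python
from datasets import load_dataset

# Sketch: read the aggregated metrics stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores for this run
```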
open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B
[ "region:us" ]
2024-01-19T08:55:53+00:00
{"pretty_name": "Evaluation run of leveldevai/MarcDareBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/MarcDareBeagle-7B](https://huggingface.co/leveldevai/MarcDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T08:53:36.117087](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B/blob/main/results_2024-01-19T08-53-36.117087.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6561257128939425,\n \"acc_stderr\": 0.03198667178637761,\n \"acc_norm\": 0.6554096772583735,\n \"acc_norm_stderr\": 0.03265522262038939,\n \"mc1\": 0.5410036719706243,\n \"mc1_stderr\": 0.017444544447661206,\n \"mc2\": 0.6808641879386289,\n \"mc2_stderr\": 0.015124785314472101\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360133\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7101175064728141,\n \"acc_stderr\": 0.004527804016253783,\n \"acc_norm\": 0.8832901812387971,\n \"acc_norm_stderr\": 0.00320418007294237\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6564102564102564,\n \"acc_stderr\": 0.02407869658063548,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 
0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5410036719706243,\n \"mc1_stderr\": 0.017444544447661206,\n \"mc2\": 0.6808641879386289,\n \"mc2_stderr\": 0.015124785314472101\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.01051033695416674\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7179681576952237,\n \"acc_stderr\": 0.012394926584335695\n }\n}\n```", "repo_url": 
"https://huggingface.co/leveldevai/MarcDareBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T08_53_36.117087", "path": ["**/details_harness|winogrande|5_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T08-53-36.117087.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T08_53_36.117087", "path": ["results_2024-01-19T08-53-36.117087.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T08-53-36.117087.parquet"]}]}]}
2024-01-19T08:56:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of leveldevai/MarcDareBeagle-7B Dataset automatically created during the evaluation run of model leveldevai/MarcDareBeagle-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T08:53:36.117087 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
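The sentence "To load the details from a run, you can for instance do the following:" in the card text above originally introduced a code block that was stripped from this processed copy. A minimal sketch of what it likely contained, assuming the usual Open LLM Leaderboard naming for details repositories (open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B, inferred from the model name rather than quoted from the original card) and the "harness_winogrande_5" config enumerated in the metadata above:

```python
from datasets import load_dataset

# Sketch only: the dataset id below follows the usual Open LLM Leaderboard
# pattern for details repos and is inferred from the model name, not quoted
# from the original card text.
data = load_dataset(
    "open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B",
    "harness_winogrande_5",   # one of the configs listed in the metadata above
    split="train",            # the "train" split always points at the latest results
)
```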
[ "# Dataset Card for Evaluation run of leveldevai/MarcDareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MarcDareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:53:36.117087(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of leveldevai/MarcDareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MarcDareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T08:53:36.117087(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2c42bd321cbbc4ebb619907f77b24421d477f860
# Dataset Card for "orca_dedup" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
hqfx/orca_dedup
[ "region:us" ]
2024-01-19T08:57:05+00:00
{"dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 602329260, "num_examples": 363491}], "download_size": 301934253, "dataset_size": 602329260}}
2024-01-19T09:04:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "orca_dedup" More Information needed
[ "# Dataset Card for \"orca_dedup\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"orca_dedup\"\n\nMore Information needed" ]
ad0385b6a9a53a4745785faae8bf9f1385bd0cd6
# Dataset Card for Evaluation run of leveldevai/MBA-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [leveldevai/MBA-7B](https://huggingface.co/leveldevai/MBA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_leveldevai__MBA-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T09:07:51.198061](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MBA-7B/blob/main/results_2024-01-19T09-07-51.198061.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6557516748328418, "acc_stderr": 0.03188082626311897, "acc_norm": 0.656045399940601, "acc_norm_stderr": 0.03253327964878977, "mc1": 0.45777233782129745, "mc1_stderr": 0.01744096571248212, "mc2": 0.6270987571451256, "mc2_stderr": 0.015280108431010799 }, "harness|arc:challenge|25": { "acc": 0.6629692832764505, "acc_stderr": 0.013813476652902274, "acc_norm": 0.6945392491467577, "acc_norm_stderr": 0.01346008047800251 }, "harness|hellaswag|10": { "acc": 0.6919936267675761, "acc_stderr": 0.004607256752931883, "acc_norm": 0.8722366062537343, "acc_norm_stderr": 0.0033314391934060345 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7283018867924528, "acc_stderr": 0.027377706624670713, "acc_norm": 0.7283018867924528, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": 
{ "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7052023121387283, "acc_stderr": 0.03476599607516478, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.03476599607516478 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944427, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944427 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542946, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542946 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033484, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033484 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131147, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131147 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.029953823891887034, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887034 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 
0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931796, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931796 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8312236286919831, "acc_stderr": 0.024381406832586234, "acc_norm": 0.8312236286919831, "acc_norm_stderr": 0.024381406832586234 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.013265346261323798, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.013265346261323798 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069353, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069353 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.423463687150838, "acc_stderr": 0.0165254258987735, "acc_norm": 0.423463687150838, "acc_norm_stderr": 0.0165254258987735 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.02575586592263295, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.02575586592263295 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600713, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 
0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142773, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306046, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306046 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.45777233782129745, "mc1_stderr": 0.01744096571248212, "mc2": 0.6270987571451256, "mc2_stderr": 0.015280108431010799 }, "harness|winogrande|5": { "acc": 0.8153117600631413, "acc_stderr": 0.01090597811215688 }, "harness|gsm8k|5": { "acc": 0.690674753601213, "acc_stderr": 0.012731710925078143 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_leveldevai__MBA-7B
[ "region:us" ]
2024-01-19T09:10:08+00:00
{"pretty_name": "Evaluation run of leveldevai/MBA-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/MBA-7B](https://huggingface.co/leveldevai/MBA-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__MBA-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T09:07:51.198061](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MBA-7B/blob/main/results_2024-01-19T09-07-51.198061.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557516748328418,\n \"acc_stderr\": 0.03188082626311897,\n \"acc_norm\": 0.656045399940601,\n \"acc_norm_stderr\": 0.03253327964878977,\n \"mc1\": 0.45777233782129745,\n \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6270987571451256,\n \"mc2_stderr\": 0.015280108431010799\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902274,\n \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.01346008047800251\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n \"acc_stderr\": 0.004607256752931883,\n \"acc_norm\": 0.8722366062537343,\n \"acc_norm_stderr\": 0.0033314391934060345\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 
0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586234,\n \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586234\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323798,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 
0.013265346261323798\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069353,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069353\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.0165254258987735,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.0165254258987735\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45777233782129745,\n \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6270987571451256,\n \"mc2_stderr\": 0.015280108431010799\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \"acc_stderr\": 0.012731710925078143\n }\n}\n```", "repo_url": "https://huggingface.co/leveldevai/MBA-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|arc:challenge|25_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|gsm8k|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hellaswag|10_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-07-51.198061.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["**/details_harness|winogrande|5_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T09-07-51.198061.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T09_07_51.198061", "path": ["results_2024-01-19T09-07-51.198061.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T09-07-51.198061.parquet"]}]}]}
2024-01-19T09:10:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of leveldevai/MBA-7B Dataset automatically created during the evaluation run of model leveldevai/MBA-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T09:07:51.198061 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
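The flattened card text above refers to a loading example ("To load the details from a run, you can for instance do the following:") that this plain-text rendering drops; the record's metadata preserves the snippet, which reads:

```python
from datasets import load_dataset

# Loading snippet carried in this record's dataset_summary metadata.
data = load_dataset(
    "open-llm-leaderboard/details_leveldevai__MBA-7B",
    "harness_winogrande_5",
    split="train",
)
```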
[ "# Dataset Card for Evaluation run of leveldevai/MBA-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MBA-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T09:07:51.198061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of leveldevai/MBA-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/MBA-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T09:07:51.198061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
54f9b4419fe7f2dae4b2b59ac2299b708022c567
# Dataset Card for Evaluation run of namanpundir/theus_concepttagger <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [namanpundir/theus_concepttagger](https://huggingface.co/namanpundir/theus_concepttagger) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_namanpundir__theus_concepttagger", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T09:19:07.563026](https://huggingface.co/datasets/open-llm-leaderboard/details_namanpundir__theus_concepttagger/blob/main/results_2024-01-19T09-19-07.563026.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23155074551271232, "acc_stderr": 0.029928741303991906, "acc_norm": 0.2318198554858782, "acc_norm_stderr": 0.030717432810767893, "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.4824892409622544, "mc2_stderr": 0.01656017052953912 }, "harness|arc:challenge|25": { "acc": 0.20733788395904437, "acc_stderr": 0.011846905782971356, "acc_norm": 0.24573378839590443, "acc_norm_stderr": 0.012581033453730111 }, "harness|hellaswag|10": { "acc": 0.25761800438159727, "acc_stderr": 0.004364287353415452, "acc_norm": 0.2550288787094204, "acc_norm_stderr": 0.004349866376068979 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, 
"acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, 
"acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.4824892409622544, "mc2_stderr": 0.01656017052953912 }, "harness|winogrande|5": { "acc": 0.48303078137332284, "acc_stderr": 0.014044390401612978 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_namanpundir__theus_concepttagger
[ "region:us" ]
2024-01-19T09:20:36+00:00
{"pretty_name": "Evaluation run of namanpundir/theus_concepttagger", "dataset_summary": "Dataset automatically created during the evaluation run of model [namanpundir/theus_concepttagger](https://huggingface.co/namanpundir/theus_concepttagger) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namanpundir__theus_concepttagger\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T09:19:07.563026](https://huggingface.co/datasets/open-llm-leaderboard/details_namanpundir__theus_concepttagger/blob/main/results_2024-01-19T09-19-07.563026.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23155074551271232,\n \"acc_stderr\": 0.029928741303991906,\n \"acc_norm\": 0.2318198554858782,\n \"acc_norm_stderr\": 0.030717432810767893,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4824892409622544,\n \"mc2_stderr\": 0.01656017052953912\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20733788395904437,\n \"acc_stderr\": 0.011846905782971356,\n \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730111\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25761800438159727,\n \"acc_stderr\": 0.004364287353415452,\n \"acc_norm\": 0.2550288787094204,\n \"acc_norm_stderr\": 0.004349866376068979\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4824892409622544,\n \"mc2_stderr\": 0.01656017052953912\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48303078137332284,\n \"acc_stderr\": 0.014044390401612978\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/namanpundir/theus_concepttagger", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|arc:challenge|25_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|gsm8k|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hellaswag|10_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T09_19_07.563026", "path": ["**/details_harness|winogrande|5_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T09-19-07.563026.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T09_19_07.563026", "path": ["results_2024-01-19T09-19-07.563026.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T09-19-07.563026.parquet"]}]}]}
2024-01-19T09:21:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of namanpundir/theus_concepttagger Dataset automatically created during the evaluation run of model namanpundir/theus_concepttagger on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T09:19:07.563026 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
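The loading snippet referenced above ("you can for instance do the following") is preserved in this record's dataset summary; reproduced here for readability, it pulls the per-example details of one sub-task from the repository:

```python
from datasets import load_dataset

# Load the detailed predictions for the 5-shot Winogrande sub-task;
# the "train" split always points at the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_namanpundir__theus_concepttagger",
    "harness_winogrande_5",
    split="train",
)
```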
[ "# Dataset Card for Evaluation run of namanpundir/theus_concepttagger\n\n\n\nDataset automatically created during the evaluation run of model namanpundir/theus_concepttagger on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T09:19:07.563026(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of namanpundir/theus_concepttagger\n\n\n\nDataset automatically created during the evaluation run of model namanpundir/theus_concepttagger on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T09:19:07.563026(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
bad60a51580440866adb470bc68a309dfa72a403
## Source dataset created from https://huggingface.co/datasets/huggan/wikiart ## Task Find two images that have the same genre ## Prompt: ``` The most common method for classifying art paintings is by genre (or theme). The term “genre” refers to the type of image that serves as the subject of a painting. The genre of a painting is classified as a cityscape, landscape, nude painting, portrait, religious painting, sketch and study, or still life. Given the four images of art paintings, can you tell which two of them have the same genre? Select between the following choices. (A) ... (B) ... (C) ... (D) ... ``` --- license: apache-2.0 dataset_info: features: - name: idx dtype: int32 - name: image1 dtype: image - name: image2 dtype: image - name: image3 dtype: image - name: image4 dtype: image - name: choices sequence: string splits: - name: test num_bytes: 486017599.0 num_examples: 300 download_size: 480853119 dataset_size: 486017599.0 configs: - config_name: default data_files: - split: test path: data/test-* ---
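Below is a minimal sketch (not part of the original card) of how the test split could be loaded and the multiple-choice prompt assembled for one record; the column names (`idx`, `image1` to `image4`, `choices`) follow the `dataset_info` block above, while the exact prompt layout is an assumption.

```python
from datasets import load_dataset

# Illustrative sketch: field names follow the dataset_info block; the prompt
# layout below is an assumption rather than the card's own code.
ds = load_dataset("PerceptionEval/ArtGenreTest", split="test")

example = ds[0]
letters = ["A", "B", "C", "D"]
options = "\n".join(f"({l}) {c}" for l, c in zip(letters, example["choices"]))
prompt = (
    "Given the four images of art paintings, can you tell which two of them "
    "have the same genre? Select between the following choices.\n" + options
)
images = [example[f"image{i}"] for i in range(1, 5)]  # the four paintings for this question
print(prompt)
```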
PerceptionEval/ArtGenreTest
[ "region:us" ]
2024-01-19T09:52:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "idx", "dtype": "int32"}, {"name": "image1", "dtype": "image"}, {"name": "image2", "dtype": "image"}, {"name": "image3", "dtype": "image"}, {"name": "image4", "dtype": "image"}, {"name": "choices", "sequence": "string"}], "splits": [{"name": "val", "num_bytes": 240272359.0, "num_examples": 150}, {"name": "test", "num_bytes": 245745240.0, "num_examples": 150}], "download_size": 480433760, "dataset_size": 486017599.0}}
2024-01-29T10:40:59+00:00
[]
[]
TAGS #region-us
## Source dataset created from URL ## Task Find two images that have the same genre ## Prompt: --- license: apache-2.0 dataset_info: features: - name: idx dtype: int32 - name: image1 dtype: image - name: image2 dtype: image - name: image3 dtype: image - name: image4 dtype: image - name: choices sequence: string splits: - name: test num_bytes: 486017599.0 num_examples: 300 download_size: 480853119 dataset_size: 486017599.0 configs: - config_name: default data_files: - split: test path: data/test-* ---
[ "## Source\ndataset created from URL", "## Task\nFind two images that have the same genre", "## Prompt: \n\n\n---\nlicense: apache-2.0\ndataset_info:\n features:\n - name: idx\n dtype: int32\n - name: image1\n dtype: image\n - name: image2\n dtype: image\n - name: image3\n dtype: image\n - name: image4\n dtype: image\n - name: choices\n sequence: string\n splits:\n - name: test\n num_bytes: 486017599.0\n num_examples: 300\n download_size: 480853119\n dataset_size: 486017599.0\nconfigs:\n- config_name: default\n data_files:\n - split: test\n path: data/test-*\n---" ]
[ "TAGS\n#region-us \n", "## Source\ndataset created from URL", "## Task\nFind two images that have the same genre", "## Prompt: \n\n\n---\nlicense: apache-2.0\ndataset_info:\n features:\n - name: idx\n dtype: int32\n - name: image1\n dtype: image\n - name: image2\n dtype: image\n - name: image3\n dtype: image\n - name: image4\n dtype: image\n - name: choices\n sequence: string\n splits:\n - name: test\n num_bytes: 486017599.0\n num_examples: 300\n download_size: 480853119\n dataset_size: 486017599.0\nconfigs:\n- config_name: default\n data_files:\n - split: test\n path: data/test-*\n---" ]
71a2e011a42823051a2b4eb303a3366bdbe048d3
# Ko-mrtydi This dataset represents a conversion of the Korean (Ko) section from the [Mr.TyDI dataset](https://github.com/castorini/mr.tydi) into the [BeIR](https://github.com/beir-cellar/beir) format, making it compatible for use with [mteb](https://github.com/embeddings-benchmark/mteb).
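A minimal loading sketch (not part of the original card) is given below; the configuration names (`corpus`, `queries`, and the default qrels config) and their column names are taken from this dataset's metadata, so treat it as an illustration rather than official usage.

```python
from datasets import load_dataset

# Illustrative sketch: configs and columns follow the dataset metadata.
corpus = load_dataset("taeminlee/Ko-mrtydi", "corpus", split="corpus")      # _id, title, text
queries = load_dataset("taeminlee/Ko-mrtydi", "queries", split="queries")   # _id, text
qrels = load_dataset("taeminlee/Ko-mrtydi", "default", split="test")        # query-id, corpus-id, score

print(len(corpus), len(queries), len(qrels))
print(queries[0])
```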
taeminlee/Ko-mrtydi
[ "task_categories:text-retrieval", "task_ids:document-retrieval", "multilinguality:monolingual", "size_categories:1K<n<10K", "source_datasets:mrtydi", "language:ko", "text-retrieval", "region:us" ]
2024-01-19T10:08:45+00:00
{"language": ["ko"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": ["mrtydi"], "task_categories": ["text-retrieval"], "task_ids": ["document-retrieval"], "config_names": ["default", "queries", "corpus"], "tags": ["text-retrieval"], "dataset_info": [{"config_name": "default", "features": [{"name": "query-id", "dtype": "string"}, {"name": "corpus-id", "dtype": "string"}, {"name": "score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 34828, "num_examples": 1317}, {"name": "dev", "num_bytes": 8121, "num_examples": 307}, {"name": "test", "num_bytes": 13482, "num_examples": 492}]}, {"config_name": "corpus", "features": [{"name": "_id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "corpus", "num_bytes": 609925549, "num_examples": 1496126}]}, {"config_name": "queries", "features": [{"name": "_id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "queries", "num_bytes": 135944, "num_examples": 2019}]}], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "qrels/train.jsonl"}, {"split": "dev", "path": "qrels/dev.jsonl"}, {"split": "test", "path": "qrels/test.jsonl"}]}, {"config_name": "corpus", "data_files": [{"split": "corpus", "path": "corpus.jsonl"}]}, {"config_name": "queries", "data_files": [{"split": "queries", "path": "queries.jsonl"}]}]}
2024-01-19T10:16:40+00:00
[]
[ "ko" ]
TAGS #task_categories-text-retrieval #task_ids-document-retrieval #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-mrtydi #language-Korean #text-retrieval #region-us
# Ko-mrtydi This dataset represents a conversion of the Korean (Ko) section from the Mr.TyDI dataset into the BeIR format, making it compatible for use with mteb.
[ "# Ko-mrtydi\n\nThis dataset represents a conversion of the Korean (Ko) section from the Mr.TyDI dataset into the BeIR format, making it compatible for use with mteb." ]
[ "TAGS\n#task_categories-text-retrieval #task_ids-document-retrieval #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-mrtydi #language-Korean #text-retrieval #region-us \n", "# Ko-mrtydi\n\nThis dataset represents a conversion of the Korean (Ko) section from the Mr.TyDI dataset into the BeIR format, making it compatible for use with mteb." ]
ad172eee024097b13f5f20c0134dc0202521cf97
# LitScan high quality subset This is a subset of the ePMC articles located by LitScan that mention the RNA ID in their title. We're calling this high quality because the hope is that if the RNA is mentioned in the title, then the paper is _definitely_ about that RNA and doesn't just mention it in passing. There are approx 50k articles in this subset, which isn't loads.
afg1/litscan-epmc-subset-hq
[ "task_categories:text-generation", "language:en", "region:us" ]
2024-01-19T10:24:35+00:00
{"language": ["en"], "task_categories": ["text-generation"]}
2024-01-19T10:31:23+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #language-English #region-us
# LitScan high quality subset This is a subset of the ePMC articles located by LitScan that mention the RNA ID in their title. We're calling this high quality because the hope is that if the RNA is mentioned in the title, then the paper is _definitely_ about that RNA and doesn't just mention it in passing. There are approx 50k articles in this subset, which isn't loads.
[ "# LitScan high quality subset\n\nThis is a subset of the ePMC articles located by LitScan that mention the RNA ID in their title.\n\nWe're calling this high quality because the hope is that if the RNA is mentioned in the title, \nthen the paper is _definitely_ about that RNA and doesn't just mention it in passing.\n\nThere are approx 50k articles in this subset, which isn't loads." ]
[ "TAGS\n#task_categories-text-generation #language-English #region-us \n", "# LitScan high quality subset\n\nThis is a subset of the ePMC articles located by LitScan that mention the RNA ID in their title.\n\nWe're calling this high quality because the hope is that if the RNA is mentioned in the title, \nthen the paper is _definitely_ about that RNA and doesn't just mention it in passing.\n\nThere are approx 50k articles in this subset, which isn't loads." ]
c1edb810482d27aad153bce6e988646eb043ce62
<p align="right"> <a href="https://github.com/argilla-io/distilabel"> <img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/> </a> </p> # distilabel Orca Pairs for DPO The dataset is a "distilabeled" version of the widely used dataset: [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs). The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved. Continuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with [distilabel](https://github.com/argilla-io/distilabel). This was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. Additionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details. ## Using this dataset This dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the "standard" chosen, rejected format with additional information for further filtering and experimentation. The main changes are: 1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns `original_*` for reproducibility purposes. 2. 4K pairs have been identified as `tie`: equally bad or good. 3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example) 4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want. 5. We have added a column to indicate if the input is part of gsm8k train set. In our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that: ```python from datasets import load_dataset # Instead of this: # dataset = load_dataset("Intel/orca_dpo_pairs", split="train") # use this: dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train") dataset = dataset.filter( lambda r: r["status"] != "tie" and r["chosen_score"] >= 8 and not r["in_gsm8k_train"] ) ``` This results in `5,922` instead of `12,859` samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset. > We'd love to hear about your experiments! If you want to try this out, consider joining our [Slack community](https://join.slack.com/t/rubrixworkspace/shared_invite/zt-whigkyjn-a3IUJLD7gDbTZ0rKlvcJ5g) and let's build some open datasets and models together. ## Reproducing the dataset In this section, we outline the steps to reproduce this dataset. 
### Rate original dataset pairs Build a preference dataset with distilabel using the original dataset: ```python import random from distilabel.llm import OpenAILLM from distilabel.tasks import JudgeLMTask from distilabel.pipeline import Pipeline from datasets import load_dataset # Shuffle 'chosen' and 'rejected' to avoid positional bias and keep track of the order def shuffle_and_track(chosen, rejected): pair = [chosen, rejected] random.shuffle(pair) order = ["chosen" if x == chosen else "rejected" for x in pair] return {"generations": pair, "order": order} dataset = load_dataset("Intel/orca_dpo_pairs", split="train") # This shuffles the pairs to mitigate positional bias dataset = dataset.map(lambda x: shuffle_and_track(x["chosen"], x["rejected"])) # We use our JudgeLM implementation to rate the original pairs labeler = OpenAILLM( task=JudgeLMTask(), model="gpt-4-1106-preview", num_threads=16, max_new_tokens=512, ) dataset = dataset.rename_columns({"question": "input"}) distipipe = Pipeline( labeller=labeler ) # This computes ratings and natural language critiques for each pair ds = distipipe.generate(dataset=dataset, num_generations=2) ``` If you want to further filter and curate the dataset, you can push the dataset to [Argilla](https://github.com/argilla-io/argilla) as follows: ```python rg_dataset = ds.to_argilla() rg_dataset.push_to_argilla(name="your_dataset_name", workspace="your_workspace_name") ``` You get a nice UI with a lot of pre-computed metadata to explore and curate the dataset: ![image/png](https://cdn-uploads.huggingface.co/production/uploads/60420dccc15e823a685f2b03/IoK4nFObadhJpkVmWALZP.png) The resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed? ![image/png](https://cdn-uploads.huggingface.co/production/uploads/60420dccc15e823a685f2b03/-V8wY1DYzrtwM9LbGrBXq.png) The above chart shows the following: * ~4,000 pairs were given the same rating (a tie). * ~7,000 pairs were correct according to our AI judge (`unchanged`). * and ~2,000 times the rejected response was preferred (`swapped`). Now the next question is: can we build better models with this new knowledge? The answer is the "distilabeled Hermes" model, check it out! ### Post-processing to add useful information Swap rejected and chosen, and add chosen scores and status: ```python import numpy as np def add_status(r): status = "unchanged" highest_rated_idx = np.argmax(r['rating']) # Compare to the index of the chosen response if r['rating']== None or r['rating'][0] == r['rating'][1]: status = "tie" elif r['order'][highest_rated_idx] != 'chosen': status = "swapped" return {"status": status} def swap(r): chosen = r["chosen"] rejected = r["rejected"] if r['rating'] is not None: chosen_score = r['rating'][np.argmax(r['rating'])] else: chosen_score = None if r['status'] == "swapped": chosen = r["rejected"] rejected = r["chosen"] return { "chosen": chosen, "rejected": rejected, "original_chosen": r["chosen"], "original_rejected": r["rejected"], "chosen_score": chosen_score } updated = ds.map(add_status).map(swap) ``` ### gsm8k "decontamination" The basic approach for finding duplicated examples. We didn't find any from the test sets.
We experimented with lower thresholds but below 0.8 they introduced false positives: ```python import pandas as pd import nltk from sklearn.feature_extraction.text import TfidfVectorizer from sklearn.metrics.pairwise import cosine_similarity from datasets import load_dataset nltk.download('punkt') # Load the datasets source_dataset = load_dataset("gsm8k", "main", split="train") source_dataset_socratic = load_dataset("gsm8k", "socratic", split="train") #target_dataset = load_dataset("Intel/orca_dpo_pairs", split="train") target_dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train") # Extract the 'question' column from each dataset source_questions = source_dataset['question'] source_questions_socratic = source_dataset_socratic['question'] target_questions = target_dataset['input'] # Function to preprocess the text def preprocess(text): return nltk.word_tokenize(text.lower()) # Preprocess the questions source_questions_processed = [preprocess(q) for q in source_questions] source_questions.extend([preprocess(q) for q in source_questions_socratic]) target_questions_processed = [preprocess(q) for q in target_questions] # Vectorize the questions vectorizer = TfidfVectorizer() source_vec = vectorizer.fit_transform([' '.join(q) for q in source_questions_processed]) target_vec = vectorizer.transform([' '.join(q) for q in target_questions_processed]) # Calculate cosine similarity similarity_matrix = cosine_similarity(source_vec, target_vec) # Determine matches based on a threshold: # checked manually and below 0.8 there are only false positives threshold = 0.8 matching_pairs = [] for i, row in enumerate(similarity_matrix): for j, similarity in enumerate(row): if similarity >= threshold: matching_pairs.append((source_questions[i], target_questions[j], similarity)) # Create a DataFrame from the matching pairs df = pd.DataFrame(matching_pairs, columns=['Source Question', 'Target Question', 'Similarity Score']) # Create a set of matching target questions matching_target_questions = list(df['Target Question']) # Add a column to the target dataset indicating whether each question is matched target_dataset = target_dataset.map(lambda example: {"in_gsm8k_train": example['input'] in matching_target_questions}) ``` Result: ``` False 12780 True 79 Name: in_gsm8k_train ```
duxx/distilabel-intel-orca-dpo-pairs-tr
[ "language:tr", "license:apache-2.0", "rlaif", "dpo", "rlhf", "distilabel", "synthetic", "region:us" ]
2024-01-19T10:32:50+00:00
{"language": ["tr"], "license": "apache-2.0", "tags": ["rlaif", "dpo", "rlhf", "distilabel", "synthetic"]}
2024-02-05T18:59:36+00:00
[]
[ "tr" ]
TAGS #language-Turkish #license-apache-2.0 #rlaif #dpo #rlhf #distilabel #synthetic #region-us
<p align="right"> <a href="URL <img src="URL alt="Built with Distilabel" width="200" height="32"/> </a> </p> # distilabel Orca Pairs for DPO The dataset is a "distilabeled" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved. Continuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. This was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. Additionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details. ## Using this dataset This dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the "standard" chosen, rejected format with additional information for further filtering and experimentation. The main changes are: 1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns 'original_*' for reproducibility purposes. 2. 4K pairs have been identified as 'tie': equally bad or good. 3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example) 4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want. 5. We have added a column to indicate if the input is part of gsm8k train set. In our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that: This results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset. > We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together. ## Reproducing the dataset In this section, we outline the steps to reproduce this dataset. ### Rate original dataset pairs Build a preference dataset with distilabel using the original dataset: If you want to further filter and curate the dataset, you can push the dataset to Argilla as follows: You get a nice UI with a lot of pre-computed metadata to explore and curate the dataset: !image/png The resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed? !image/png The above chart shows the following: * ~4,000 pairs were given the same rating (a tie). * ~7,000 pairs were correct according to our AI judge ('unchanged'). * and ~2,000 times the rejected response was preferred ('swapped'). Now the next question is: can we build better models with this new knowledge? The answer is the "distilabeled Hermes" model, check it out! 
### Post-processing to add useful information Swap rejected and chosen, and add chosen scores and status: ### gsm8k "decontamination" The basic approach for finding duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds but below 0.8 they introduced false positives: Result:
[ "# distilabel Orca Pairs for DPO\n\nThe dataset is a \"distilabeled\" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.\n\nContinuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. \n\nThis was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. \n\nAdditionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.", "## Using this dataset\nThis dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the \"standard\" chosen, rejected format with additional information for further filtering and experimentation. \n\nThe main changes are:\n\n1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns 'original_*' for reproducibility purposes.\n2. 4K pairs have been identified as 'tie': equally bad or good.\n3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)\n4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.\n5. We have added a column to indicate if the input is part of gsm8k train set.\n\nIn our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:\n\n\nThis results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.\n\n> We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together.", "## Reproducing the dataset\nIn this section, we outline the steps to reproduce this dataset.", "### Rate original dataset pairs\n\nBuild a preference dataset with distilabel using the original dataset:\n\n\n\nIf you want to further filter and curate the dataset, you can push the dataset to Argilla as follows:\n\n\nYou get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:\n\n!image/png\n\nThe resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?\n\n!image/png\n\nThe above chart shows the following: \n\n* ~4,000 pairs were given the same rating (a tie).\n* ~7,000 pairs were correct according to our AI judge ('unchanged').\n* and ~2,000 times the rejected response was preferred ('swapped').\n\n\n\nNow the next question is: can we build better models with this new knowledge? 
The answer is the \"distilabeled Hermes\" model, check it out!", "### Post-processing to add useful information\n\n\nSwap rejected and chosen, and add chosen scores and status:", "### gsm8k \"decontamination\"\nThe basic approach for finding duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds but below 0.8 they introduced false positives:\n\nResult:" ]
[ "TAGS\n#language-Turkish #license-apache-2.0 #rlaif #dpo #rlhf #distilabel #synthetic #region-us \n", "# distilabel Orca Pairs for DPO\n\nThe dataset is a \"distilabeled\" version of the widely used dataset: Intel/orca_dpo_pairs. The original dataset has been used by 100s of open-source practitioners and models. We knew from fixing UltraFeedback (and before that, Alpacas and Dollys) that this dataset could be highly improved.\n\nContinuing with our mission to build the best alignment datasets for open-source LLMs and the community, we spent a few hours improving it with distilabel. \n\nThis was our main intuition: the original dataset just assumes gpt4/3.5-turbo are always the best response. We know from UltraFeedback that's not always the case. Moreover, DPO fine-tuning benefits from the diversity of preference pairs. \n\nAdditionally, we have added a new column indicating whether the question in the dataset is part of the train set of gsm8k (there were no examples from the test set). See the reproduction section for more details.", "## Using this dataset\nThis dataset is useful for preference tuning and we recommend using it instead of the original. It's already prepared in the \"standard\" chosen, rejected format with additional information for further filtering and experimentation. \n\nThe main changes are:\n\n1. ~2K pairs have been swapped: rejected become the chosen response. We have kept the original chosen and rejected on two new columns 'original_*' for reproducibility purposes.\n2. 4K pairs have been identified as 'tie': equally bad or good.\n3. Chosen scores have been added: you can now filter out based on a threshold (see our distilabeled Hermes 2.5 model for an example)\n4. We have kept the ratings and rationales generated with gpt-4-turbo and distilabel so you can prepare the data differently if you want.\n5. We have added a column to indicate if the input is part of gsm8k train set.\n\nIn our experiments, we have got very good results by reducing the size of the dataset by more than 50%. Here's an example of how to achieve that:\n\n\nThis results in '5,922' instead of '12,859' samples (54% reduction) and leads to better performance than the same model tuned with 100% of the samples in the original dataset.\n\n> We'd love to hear about your experiments! If you want to try this out, consider joining our Slack community and let's build some open datasets and models together.", "## Reproducing the dataset\nIn this section, we outline the steps to reproduce this dataset.", "### Rate original dataset pairs\n\nBuild a preference dataset with distilabel using the original dataset:\n\n\n\nIf you want to further filter and curate the dataset, you can push the dataset to Argilla as follows:\n\n\nYou get a nice UI with a lot of pre-computed metadata to explore and curate the dataset:\n\n!image/png\n\nThe resulting dataset is now much more useful: we know which response is preferred (by gpt-4-turbo), which ones have low scores, and we even have natural language explanations. But what did we find? Was our intuition confirmed?\n\n!image/png\n\nThe above chart shows the following: \n\n* ~4,000 pairs were given the same rating (a tie).\n* ~7,000 pairs were correct according to our AI judge ('unchanged').\n* and ~2,000 times the rejected response was preferred ('swapped').\n\n\n\nNow the next question is: can we build better models with this new knowledge? 
The answer is the \"distilabeled Hermes\" model, check it out!", "### Post-processing to add useful information\n\n\nSwap rejected and chosen, and add chosen scores and status:", "### gsm8k \"decontamination\"\nThe basic approach for finding duplicated examples. We didn't find any from the test sets. We experimented with lower thresholds but below 0.8 they introduced false positives:\n\nResult:" ]
6b7d31fd0b4568caef409a799a8b558fa4f1e070
# Italian version of the Arc Challenge dataset (ARC-c) The dataset has been automatically translated using [Argos Translate](https://github.com/argosopentech/argos-translate) v. 1.9.1 ### Citation Information ``` @misc{basile2023llamantino, title={LLaMAntino: LLaMA 2 Models for Effective Text Generation in Italian Language}, author={Pierpaolo Basile and Elio Musacchio and Marco Polignano and Lucia Siciliani and Giuseppe Fiameni and Giovanni Semeraro}, year={2023}, eprint={2312.09993}, archivePrefix={arXiv}, primaryClass={cs.CL} } @article{Clark2018ThinkYH, title={Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge}, author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord}, journal={ArXiv}, year={2018}, volume={abs/1803.05457} } ``` # Dataset Description The ARC dataset consists of **7,787 science exam questions** drawn from a variety of sources, including science questions provided under license by a research partner affiliated with AI2. These are text-only, English language exam questions that span several grade levels as indicated in the files. Each question has a **multiple choice structure** (typically 4 answer options). The questions are sorted into a Challenge Set of 2,590 “hard” questions (those that both a retrieval and a co-occurrence method fail to answer correctly) and an Easy Set of 5,197 questions. Official website: [https://allenai.org/data/arc](https://allenai.org/data/arc)
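As a usage note (not part of the original card), a rough loading sketch follows; it makes no assumption about the column names beyond what `load_dataset` reports.

```python
from datasets import load_dataset

# Rough sketch: load the Italian ARC-Challenge data and inspect one record.
ds = load_dataset("swap-uniba/arc_challenge_ita")
print(ds)                      # available splits and features
first_split = next(iter(ds))   # whichever split the repo exposes first
print(ds[first_split][0])      # one translated multiple-choice question
```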
swap-uniba/arc_challenge_ita
[ "task_categories:question-answering", "task_categories:text-generation", "size_categories:1K<n<10K", "language:it", "llm", "evaluation", "llamantino", "italian", "arxiv:2312.09993", "region:us" ]
2024-01-19T10:35:30+00:00
{"language": ["it"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "Arc-c dataset Italian Version", "tags": ["llm", "evaluation", "llamantino", "italian"]}
2024-01-19T10:39:33+00:00
[ "2312.09993" ]
[ "it" ]
TAGS #task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-Italian #llm #evaluation #llamantino #italian #arxiv-2312.09993 #region-us
# Italian version of the Arc Challenge dataset (ARC-c) The dataset has been automatically translated using Argos Translate v. 1.9.1 # Dataset Description The ARC dataset consists of 7,787 science exam questions drawn from a variety of sources, including science questions provided under license by a research partner affiliated with AI2. These are text-only, English language exam questions that span several grade levels as indicated in the files. Each question has a multiple choice structure (typically 4 answer options). The questions are sorted into a Challenge Set of 2,590 “hard” questions (those that both a retrieval and a co-occurrence method fail to answer correctly) and an Easy Set of 5,197 questions. Official website: URL
[ "# Italian version of the Arc Challenge dataset (ARC-c)\nThe dataset has been automatically translate by using Argos Translate v. 1.9.1", "# Dataset Description\nThe ARC dataset consists of 7,787 science exam questions drawn from a variety of sources, including science questions provided under license by a research partner affiliated with AI2. These are text-only, English language exam questions that span several grade levels as indicated in the files. Each question has a\nmultiple choice structure (typically 4 answer options). \n\nThe questions are sorted into a Challenge Set of 2,590 “hard” questions (those that both a retrieval and a co-occurrence method fail to answer correctly) and an Easy Set of 5,197 questions.\n\nOfficial website: URL" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-Italian #llm #evaluation #llamantino #italian #arxiv-2312.09993 #region-us \n", "# Italian version of the Arc Challenge dataset (ARC-c)\nThe dataset has been automatically translate by using Argos Translate v. 1.9.1", "# Dataset Description\nThe ARC dataset consists of 7,787 science exam questions drawn from a variety of sources, including science questions provided under license by a research partner affiliated with AI2. These are text-only, English language exam questions that span several grade levels as indicated in the files. Each question has a\nmultiple choice structure (typically 4 answer options). \n\nThe questions are sorted into a Challenge Set of 2,590 “hard” questions (those that both a retrieval and a co-occurrence method fail to answer correctly) and an Easy Set of 5,197 questions.\n\nOfficial website: URL" ]
9fafd55a4db1b03ca6cd4f671d1d2813aa947676
# Dataset Card for "autotrain-data-snips-gpt2-with_merge" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
experial/autotrain-data-snips-gpt2-with_merge
[ "region:us" ]
2024-01-19T10:35:36+00:00
{"dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24485, "num_examples": 150}, {"name": "validation", "num_bytes": 24485, "num_examples": 150}], "download_size": 21400, "dataset_size": 48970}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]}
2024-01-19T10:35:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "autotrain-data-snips-gpt2-with_merge" More Information needed
[ "# Dataset Card for \"autotrain-data-snips-gpt2-with_merge\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"autotrain-data-snips-gpt2-with_merge\"\n\nMore Information needed" ]
2acb165ae93a3f3466af0290c1edee5770872b52
# Italian Version of the MMLU DATASET Based on the version released by: [**FreedomIntelligence/MMLU_Italian**](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Italian) Includes minor fixes. # Citations This version: ``` @misc{basile2023llamantino, title={LLaMAntino: LLaMA 2 Models for Effective Text Generation in Italian Language}, author={Pierpaolo Basile and Elio Musacchio and Marco Polignano and Lucia Siciliani and Giuseppe Fiameni and Giovanni Semeraro}, year={2023}, eprint={2312.09993}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` Original Dataset: ``` @article{hendryckstest2021, title={Measuring Massive Multitask Language Understanding}, author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt}, journal={Proceedings of the International Conference on Learning Representations (ICLR)}, year={2021} } @article{hendrycks2021ethics, title={Aligning AI With Shared Human Values}, author={Dan Hendrycks and Collin Burns and Steven Basart and Andrew Critch and Jerry Li and Dawn Song and Jacob Steinhardt}, journal={Proceedings of the International Conference on Learning Representations (ICLR)}, year={2021} } ``` # Original Dataset Card for MMLU ## Dataset Description - **Repository**: https://github.com/hendrycks/test - **Paper**: https://arxiv.org/abs/2009.03300 ### Dataset Summary [Measuring Massive Multitask Language Understanding](https://arxiv.org/pdf/2009.03300) by [Dan Hendrycks](https://people.eecs.berkeley.edu/~hendrycks/), [Collin Burns](http://collinpburns.com), [Steven Basart](https://stevenbas.art), Andy Zou, Mantas Mazeika, [Dawn Song](https://people.eecs.berkeley.edu/~dawnsong/), and [Jacob Steinhardt](https://www.stat.berkeley.edu/~jsteinhardt/) (ICLR 2021). This is a massive multitask test consisting of multiple-choice questions from various branches of knowledge. The test spans subjects in the humanities, social sciences, hard sciences, and other areas that are important for some people to learn. This covers 57 tasks including elementary mathematics, US history, computer science, law, and more. To attain high accuracy on this test, models must possess extensive world knowledge and problem solving ability. 
A complete list of tasks: ['abstract_algebra', 'anatomy', 'astronomy', 'business_ethics', 'clinical_knowledge', 'college_biology', 'college_chemistry', 'college_computer_science', 'college_mathematics', 'college_medicine', 'college_physics', 'computer_security', 'conceptual_physics', 'econometrics', 'electrical_engineering', 'elementary_mathematics', 'formal_logic', 'global_facts', 'high_school_biology', 'high_school_chemistry', 'high_school_computer_science', 'high_school_european_history', 'high_school_geography', 'high_school_government_and_politics', 'high_school_macroeconomics', 'high_school_mathematics', 'high_school_microeconomics', 'high_school_physics', 'high_school_psychology', 'high_school_statistics', 'high_school_us_history', 'high_school_world_history', 'human_aging', 'human_sexuality', 'international_law', 'jurisprudence', 'logical_fallacies', 'machine_learning', 'management', 'marketing', 'medical_genetics', 'miscellaneous', 'moral_disputes', 'moral_scenarios', 'nutrition', 'philosophy', 'prehistory', 'professional_accounting', 'professional_law', 'professional_medicine', 'professional_psychology', 'public_relations', 'security_studies', 'sociology', 'us_foreign_policy', 'virology', 'world_religions'] ### Languages English ## Dataset Structure ### Data Instances An example from anatomy subtask looks as follows: ``` { "question": "What is the embryological origin of the hyoid bone?", "choices": ["The first pharyngeal arch", "The first and second pharyngeal arches", "The second pharyngeal arch", "The second and third pharyngeal arches"], "answer": "D" } ``` ### Data Fields - `question`: a string feature - `choices`: a list of 4 string features - `answer`: a ClassLabel feature ### Data Splits - `auxiliary_train`: auxiliary multiple-choice training questions from ARC, MC_TEST, OBQA, RACE, etc. - `dev`: 5 examples per subtask, meant for few-shot setting - `test`: there are at least 100 examples per subtask | | auxiliary_train | dev | val | test | | ----- | :------: | :-----: | :-----: | :-----: | | TOTAL | 99842 | 285 | 1531 | 14042 ### Licensing Information [MIT License](https://github.com/hendrycks/test/blob/master/LICENSE) ``` ### Contributions Thanks to [@andyzoujm](https://github.com/andyzoujm) for adding this dataset.
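For orientation, here is an illustrative sketch (not from the original card) that loads one of the 57 subtask configurations of this Italian version and formats a single question; the configuration names, splits and fields (`question`, `choices`, `answer`) follow the dataset metadata, while the prompt wording is only an example.

```python
from datasets import load_dataset

# Illustrative sketch: config names, splits and fields follow the dataset
# metadata; the prompt wording is only an example.
ds = load_dataset("swap-uniba/mmlu_ita", "anatomy")

letters = ["A", "B", "C", "D"]
ex = ds["test"][0]
options = "\n".join(f"{l}. {c}" for l, c in zip(letters, ex["choices"]))
print(f"{ex['question']}\n{options}\nAnswer: {letters[ex['answer']]}")
```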
swap-uniba/mmlu_ita
[ "task_categories:question-answering", "task_ids:multiple-choice-qa", "annotations_creators:no-annotation", "language_creators:expert-generated", "multilinguality:monolingual", "size_categories:10K<n<100K", "source_datasets:original", "language:en", "license:mit", "arxiv:2312.09993", "arxiv:2009.03300", "region:us" ]
2024-01-19T10:42:53+00:00
{"annotations_creators": ["no-annotation"], "language_creators": ["expert-generated"], "language": ["en"], "license": ["mit"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"], "task_categories": ["question-answering"], "task_ids": ["multiple-choice-qa"], "paperswithcode_id": "mmlu", "pretty_name": "Measuring Massive Multitask Language Understanding", "language_bcp47": ["en-US"], "dataset_info": [{"config_name": "abstract_algebra", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 19328, "num_examples": 100}, {"name": "validation", "num_bytes": 2024, "num_examples": 11}, {"name": "dev", "num_bytes": 830, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160623559}, {"config_name": "anatomy", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 33121, "num_examples": 135}, {"name": "validation", "num_bytes": 3140, "num_examples": 14}, {"name": "dev", "num_bytes": 967, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160638605}, {"config_name": "astronomy", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 46771, "num_examples": 152}, {"name": "validation", "num_bytes": 5027, "num_examples": 16}, {"name": "dev", "num_bytes": 2076, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160655251}, {"config_name": "business_ethics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 33252, "num_examples": 100}, {"name": "validation", "num_bytes": 3038, "num_examples": 11}, {"name": "dev", "num_bytes": 2190, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160639857}, {"config_name": "clinical_knowledge", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 62754, "num_examples": 265}, {"name": "validation", "num_bytes": 6664, "num_examples": 29}, {"name": "dev", "num_bytes": 1210, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160672005}, {"config_name": "college_biology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", 
"num_bytes": 48797, "num_examples": 144}, {"name": "validation", "num_bytes": 4819, "num_examples": 16}, {"name": "dev", "num_bytes": 1532, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160656525}, {"config_name": "college_chemistry", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 24708, "num_examples": 100}, {"name": "validation", "num_bytes": 2328, "num_examples": 8}, {"name": "dev", "num_bytes": 1331, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160629744}, {"config_name": "college_computer_science", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 42641, "num_examples": 100}, {"name": "validation", "num_bytes": 4663, "num_examples": 11}, {"name": "dev", "num_bytes": 2765, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160651446}, {"config_name": "college_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 24711, "num_examples": 100}, {"name": "validation", "num_bytes": 2668, "num_examples": 11}, {"name": "dev", "num_bytes": 1493, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160630249}, {"config_name": "college_medicine", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 82397, "num_examples": 173}, {"name": "validation", "num_bytes": 7909, "num_examples": 22}, {"name": "dev", "num_bytes": 1670, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160693353}, {"config_name": "college_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 30181, "num_examples": 102}, {"name": "validation", "num_bytes": 3490, "num_examples": 11}, {"name": "dev", "num_bytes": 1412, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160636460}, {"config_name": "computer_security", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 27124, "num_examples": 100}, {"name": "validation", "num_bytes": 4549, "num_examples": 11}, {"name": "dev", "num_bytes": 1101, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160634151}, 
{"config_name": "conceptual_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 40709, "num_examples": 235}, {"name": "validation", "num_bytes": 4474, "num_examples": 26}, {"name": "dev", "num_bytes": 934, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160647494}, {"config_name": "econometrics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 46547, "num_examples": 114}, {"name": "validation", "num_bytes": 4967, "num_examples": 12}, {"name": "dev", "num_bytes": 1644, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160654535}, {"config_name": "electrical_engineering", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 25142, "num_examples": 145}, {"name": "validation", "num_bytes": 2903, "num_examples": 16}, {"name": "dev", "num_bytes": 972, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160630394}, {"config_name": "elementary_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 70108, "num_examples": 378}, {"name": "validation", "num_bytes": 8988, "num_examples": 41}, {"name": "dev", "num_bytes": 1440, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160681913}, {"config_name": "formal_logic", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 49785, "num_examples": 126}, {"name": "validation", "num_bytes": 6252, "num_examples": 14}, {"name": "dev", "num_bytes": 1757, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160659171}, {"config_name": "global_facts", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 18403, "num_examples": 100}, {"name": "validation", "num_bytes": 1865, "num_examples": 10}, {"name": "dev", "num_bytes": 1229, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160622874}, {"config_name": "high_school_biology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": 
"D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 109732, "num_examples": 310}, {"name": "validation", "num_bytes": 11022, "num_examples": 32}, {"name": "dev", "num_bytes": 1673, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160723804}, {"config_name": "high_school_chemistry", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 58464, "num_examples": 203}, {"name": "validation", "num_bytes": 7092, "num_examples": 22}, {"name": "dev", "num_bytes": 1220, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160668153}, {"config_name": "high_school_computer_science", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 44476, "num_examples": 100}, {"name": "validation", "num_bytes": 3343, "num_examples": 9}, {"name": "dev", "num_bytes": 2918, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160652114}, {"config_name": "high_school_european_history", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 270300, "num_examples": 165}, {"name": "validation", "num_bytes": 29632, "num_examples": 18}, {"name": "dev", "num_bytes": 11564, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160912873}, {"config_name": "high_school_geography", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 42034, "num_examples": 198}, {"name": "validation", "num_bytes": 4332, "num_examples": 22}, {"name": "dev", "num_bytes": 1403, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160649146}, {"config_name": "high_school_government_and_politics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 66074, "num_examples": 193}, {"name": "validation", "num_bytes": 7063, "num_examples": 21}, {"name": "dev", "num_bytes": 1779, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160676293}, {"config_name": "high_school_macroeconomics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 117687, "num_examples": 390}, 
{"name": "validation", "num_bytes": 13020, "num_examples": 43}, {"name": "dev", "num_bytes": 1328, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160733412}, {"config_name": "high_school_mathematics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 54854, "num_examples": 270}, {"name": "validation", "num_bytes": 5765, "num_examples": 29}, {"name": "dev", "num_bytes": 1297, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160663293}, {"config_name": "high_school_microeconomics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 75703, "num_examples": 238}, {"name": "validation", "num_bytes": 7553, "num_examples": 26}, {"name": "dev", "num_bytes": 1298, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160685931}, {"config_name": "high_school_physics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 59538, "num_examples": 151}, {"name": "validation", "num_bytes": 6771, "num_examples": 17}, {"name": "dev", "num_bytes": 1489, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160669175}, {"config_name": "high_school_psychology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 159407, "num_examples": 545}, {"name": "validation", "num_bytes": 17269, "num_examples": 60}, {"name": "dev", "num_bytes": 1905, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160779958}, {"config_name": "high_school_statistics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 110702, "num_examples": 216}, {"name": "validation", "num_bytes": 9997, "num_examples": 23}, {"name": "dev", "num_bytes": 2528, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160724604}, {"config_name": "high_school_us_history", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 296734, "num_examples": 204}, {"name": "validation", "num_bytes": 31706, "num_examples": 22}, {"name": "dev", "num_bytes": 8864, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160938681}, 
{"config_name": "high_school_world_history", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 378617, "num_examples": 237}, {"name": "validation", "num_bytes": 45501, "num_examples": 26}, {"name": "dev", "num_bytes": 4882, "num_examples": 5}], "download_size": 166184960, "dataset_size": 161030377}, {"config_name": "human_aging", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 46098, "num_examples": 223}, {"name": "validation", "num_bytes": 4707, "num_examples": 23}, {"name": "dev", "num_bytes": 1008, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160653190}, {"config_name": "human_sexuality", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 32110, "num_examples": 131}, {"name": "validation", "num_bytes": 2421, "num_examples": 12}, {"name": "dev", "num_bytes": 1077, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160636985}, {"config_name": "international_law", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 53531, "num_examples": 121}, {"name": "validation", "num_bytes": 6473, "num_examples": 13}, {"name": "dev", "num_bytes": 2418, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160663799}, {"config_name": "jurisprudence", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 33986, "num_examples": 108}, {"name": "validation", "num_bytes": 3729, "num_examples": 11}, {"name": "dev", "num_bytes": 1303, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160640395}, {"config_name": "logical_fallacies", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 50117, "num_examples": 163}, {"name": "validation", "num_bytes": 5103, "num_examples": 18}, {"name": "dev", "num_bytes": 1573, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160658170}, {"config_name": "machine_learning", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": 
"D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 33880, "num_examples": 112}, {"name": "validation", "num_bytes": 3232, "num_examples": 11}, {"name": "dev", "num_bytes": 2323, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160640812}, {"config_name": "management", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 20002, "num_examples": 103}, {"name": "validation", "num_bytes": 1820, "num_examples": 11}, {"name": "dev", "num_bytes": 898, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160624097}, {"config_name": "marketing", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 63025, "num_examples": 234}, {"name": "validation", "num_bytes": 7394, "num_examples": 25}, {"name": "dev", "num_bytes": 1481, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160673277}, {"config_name": "medical_genetics", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 20864, "num_examples": 100}, {"name": "validation", "num_bytes": 3005, "num_examples": 11}, {"name": "dev", "num_bytes": 1089, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160626335}, {"config_name": "miscellaneous", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 147704, "num_examples": 783}, {"name": "validation", "num_bytes": 14330, "num_examples": 86}, {"name": "dev", "num_bytes": 699, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160764110}, {"config_name": "moral_disputes", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 107818, "num_examples": 346}, {"name": "validation", "num_bytes": 12420, "num_examples": 38}, {"name": "dev", "num_bytes": 1755, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160723370}, {"config_name": "moral_scenarios", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 374026, "num_examples": 895}, {"name": "validation", "num_bytes": 42338, "num_examples": 100}, {"name": "dev", 
"num_bytes": 2058, "num_examples": 5}], "download_size": 166184960, "dataset_size": 161019799}, {"config_name": "nutrition", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 92410, "num_examples": 306}, {"name": "validation", "num_bytes": 8436, "num_examples": 33}, {"name": "dev", "num_bytes": 2085, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160704308}, {"config_name": "philosophy", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 80073, "num_examples": 311}, {"name": "validation", "num_bytes": 9184, "num_examples": 34}, {"name": "dev", "num_bytes": 988, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160691622}, {"config_name": "prehistory", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 89594, "num_examples": 324}, {"name": "validation", "num_bytes": 10285, "num_examples": 35}, {"name": "dev", "num_bytes": 1878, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160703134}, {"config_name": "professional_accounting", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 124550, "num_examples": 282}, {"name": "validation", "num_bytes": 14372, "num_examples": 31}, {"name": "dev", "num_bytes": 2148, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160742447}, {"config_name": "professional_law", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 1891762, "num_examples": 1534}, {"name": "validation", "num_bytes": 203519, "num_examples": 170}, {"name": "dev", "num_bytes": 6610, "num_examples": 5}], "download_size": 166184960, "dataset_size": 162703268}, {"config_name": "professional_medicine", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 217561, "num_examples": 272}, {"name": "validation", "num_bytes": 23847, "num_examples": 31}, {"name": "dev", "num_bytes": 3807, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160846592}, {"config_name": "professional_psychology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": 
"string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 225899, "num_examples": 612}, {"name": "validation", "num_bytes": 29101, "num_examples": 69}, {"name": "dev", "num_bytes": 2267, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160858644}, {"config_name": "public_relations", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 28760, "num_examples": 110}, {"name": "validation", "num_bytes": 4566, "num_examples": 12}, {"name": "dev", "num_bytes": 1496, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160636199}, {"config_name": "security_studies", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 204844, "num_examples": 245}, {"name": "validation", "num_bytes": 22637, "num_examples": 27}, {"name": "dev", "num_bytes": 5335, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160834193}, {"config_name": "sociology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 66243, "num_examples": 201}, {"name": "validation", "num_bytes": 7184, "num_examples": 22}, {"name": "dev", "num_bytes": 1613, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160676417}, {"config_name": "us_foreign_policy", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 28443, "num_examples": 100}, {"name": "validation", "num_bytes": 3264, "num_examples": 11}, {"name": "dev", "num_bytes": 1611, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160634695}, {"config_name": "virology", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 38759, "num_examples": 166}, {"name": "validation", "num_bytes": 5463, "num_examples": 18}, {"name": "dev", "num_bytes": 1096, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160646695}, {"config_name": "world_religions", "features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "auxiliary_train", "num_bytes": 160601377, "num_examples": 99842}, {"name": "test", "num_bytes": 25274, 
"num_examples": 171}, {"name": "validation", "num_bytes": 2765, "num_examples": 19}, {"name": "dev", "num_bytes": 670, "num_examples": 5}], "download_size": 166184960, "dataset_size": 160630086}]}
2024-01-19T10:53:12+00:00
[ "2312.09993", "2009.03300" ]
[ "en" ]
TAGS #task_categories-question-answering #task_ids-multiple-choice-qa #annotations_creators-no-annotation #language_creators-expert-generated #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-English #license-mit #arxiv-2312.09993 #arxiv-2009.03300 #region-us
Italian Version of the MMLU DATASET =================================== Based on the version released by: FreedomIntelligence/MMLU\_Italian Includes minor fixes. s This version: Original Dataset: Original Dataset Card for MMLU ============================== Dataset Description ------------------- * Repository: URL * Paper: URL ### Dataset Summary Measuring Massive Multitask Language Understanding by Dan Hendrycks, Collin Burns, Steven Basart, Andy Zou, Mantas Mazeika, Dawn Song, and Jacob Steinhardt (ICLR 2021). This is a massive multitask test consisting of multiple-choice questions from various branches of knowledge. The test spans subjects in the humanities, social sciences, hard sciences, and other areas that are important for some people to learn. This covers 57 tasks including elementary mathematics, US history, computer science, law, and more. To attain high accuracy on this test, models must possess extensive world knowledge and problem solving ability. A complete list of tasks: ['abstract\_algebra', 'anatomy', 'astronomy', 'business\_ethics', 'clinical\_knowledge', 'college\_biology', 'college\_chemistry', 'college\_computer\_science', 'college\_mathematics', 'college\_medicine', 'college\_physics', 'computer\_security', 'conceptual\_physics', 'econometrics', 'electrical\_engineering', 'elementary\_mathematics', 'formal\_logic', 'global\_facts', 'high\_school\_biology', 'high\_school\_chemistry', 'high\_school\_computer\_science', 'high\_school\_european\_history', 'high\_school\_geography', 'high\_school\_government\_and\_politics', 'high\_school\_macroeconomics', 'high\_school\_mathematics', 'high\_school\_microeconomics', 'high\_school\_physics', 'high\_school\_psychology', 'high\_school\_statistics', 'high\_school\_us\_history', 'high\_school\_world\_history', 'human\_aging', 'human\_sexuality', 'international\_law', 'jurisprudence', 'logical\_fallacies', 'machine\_learning', 'management', 'marketing', 'medical\_genetics', 'miscellaneous', 'moral\_disputes', 'moral\_scenarios', 'nutrition', 'philosophy', 'prehistory', 'professional\_accounting', 'professional\_law', 'professional\_medicine', 'professional\_psychology', 'public\_relations', 'security\_studies', 'sociology', 'us\_foreign\_policy', 'virology', 'world\_religions'] ### Languages English Dataset Structure ----------------- ### Data Instances An example from anatomy subtask looks as follows: ### Data Fields * 'question': a string feature * 'choices': a list of 4 string features * 'answer': a ClassLabel feature ### Data Splits * 'auxiliary\_train': auxiliary multiple-choice training questions from ARC, MC\_TEST, OBQA, RACE, etc. * 'dev': 5 examples per subtask, meant for few-shot setting * 'test': there are at least 100 examples per subtask ### Licensing Information MIT License ''' ### Contributions Thanks to @andyzoujm for adding this dataset.
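To make the "Data Fields" description above concrete, here is a minimal sketch of how a single item with this schema (a 'question' string, a list of four 'choices', and an 'answer' ClassLabel over A–D) can be rendered as a multiple-choice prompt. The example item and the prompt wording are illustrative assumptions, not values taken from the dataset.

```python
from datasets import ClassLabel

# Hypothetical item following the card's schema; the values are made up for illustration.
item = {
    "question": "Quale osso si trova nel braccio?",
    "choices": ["Femore", "Omero", "Tibia", "Perone"],
    "answer": 1,  # ClassLabel index into ["A", "B", "C", "D"]
}

answer_labels = ClassLabel(names=["A", "B", "C", "D"])

def format_mcq(example: dict) -> str:
    """Render one MMLU-style item as a lettered multiple-choice prompt."""
    lines = [example["question"]]
    for letter, choice in zip("ABCD", example["choices"]):
        lines.append(f"{letter}. {choice}")
    lines.append("Risposta:")
    return "\n".join(lines)

print(format_mcq(item))
print("Gold answer:", answer_labels.int2str(item["answer"]))  # -> "B"
```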
[ "### Dataset Summary\n\n\nMeasuring Massive Multitask Language Understanding by Dan Hendrycks, Collin Burns, Steven Basart, Andy Zou, Mantas Mazeika, Dawn Song, and Jacob Steinhardt (ICLR 2021).\n\n\nThis is a massive multitask test consisting of multiple-choice questions from various branches of knowledge. The test spans subjects in the humanities, social sciences, hard sciences, and other areas that are important for some people to learn. This covers 57 tasks including elementary mathematics, US history, computer science, law, and more. To attain high accuracy on this test, models must possess extensive world knowledge and problem solving ability.\nA complete list of tasks: ['abstract\\_algebra', 'anatomy', 'astronomy', 'business\\_ethics', 'clinical\\_knowledge', 'college\\_biology', 'college\\_chemistry', 'college\\_computer\\_science', 'college\\_mathematics', 'college\\_medicine', 'college\\_physics', 'computer\\_security', 'conceptual\\_physics', 'econometrics', 'electrical\\_engineering', 'elementary\\_mathematics', 'formal\\_logic', 'global\\_facts', 'high\\_school\\_biology', 'high\\_school\\_chemistry', 'high\\_school\\_computer\\_science', 'high\\_school\\_european\\_history', 'high\\_school\\_geography', 'high\\_school\\_government\\_and\\_politics', 'high\\_school\\_macroeconomics', 'high\\_school\\_mathematics', 'high\\_school\\_microeconomics', 'high\\_school\\_physics', 'high\\_school\\_psychology', 'high\\_school\\_statistics', 'high\\_school\\_us\\_history', 'high\\_school\\_world\\_history', 'human\\_aging', 'human\\_sexuality', 'international\\_law', 'jurisprudence', 'logical\\_fallacies', 'machine\\_learning', 'management', 'marketing', 'medical\\_genetics', 'miscellaneous', 'moral\\_disputes', 'moral\\_scenarios', 'nutrition', 'philosophy', 'prehistory', 'professional\\_accounting', 'professional\\_law', 'professional\\_medicine', 'professional\\_psychology', 'public\\_relations', 'security\\_studies', 'sociology', 'us\\_foreign\\_policy', 'virology', 'world\\_religions']", "### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nAn example from anatomy subtask looks as follows:", "### Data Fields\n\n\n* 'question': a string feature\n* 'choices': a list of 4 string features\n* 'answer': a ClassLabel feature", "### Data Splits\n\n\n* 'auxiliary\\_train': auxiliary multiple-choice training questions from ARC, MC\\_TEST, OBQA, RACE, etc.\n* 'dev': 5 examples per subtask, meant for few-shot setting\n* 'test': there are at least 100 examples per subtask", "### Licensing Information\n\n\nMIT License\n\n\n'''", "### Contributions\n\n\nThanks to @andyzoujm for adding this dataset." ]
[ "TAGS\n#task_categories-question-answering #task_ids-multiple-choice-qa #annotations_creators-no-annotation #language_creators-expert-generated #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-English #license-mit #arxiv-2312.09993 #arxiv-2009.03300 #region-us \n", "### Dataset Summary\n\n\nMeasuring Massive Multitask Language Understanding by Dan Hendrycks, Collin Burns, Steven Basart, Andy Zou, Mantas Mazeika, Dawn Song, and Jacob Steinhardt (ICLR 2021).\n\n\nThis is a massive multitask test consisting of multiple-choice questions from various branches of knowledge. The test spans subjects in the humanities, social sciences, hard sciences, and other areas that are important for some people to learn. This covers 57 tasks including elementary mathematics, US history, computer science, law, and more. To attain high accuracy on this test, models must possess extensive world knowledge and problem solving ability.\nA complete list of tasks: ['abstract\\_algebra', 'anatomy', 'astronomy', 'business\\_ethics', 'clinical\\_knowledge', 'college\\_biology', 'college\\_chemistry', 'college\\_computer\\_science', 'college\\_mathematics', 'college\\_medicine', 'college\\_physics', 'computer\\_security', 'conceptual\\_physics', 'econometrics', 'electrical\\_engineering', 'elementary\\_mathematics', 'formal\\_logic', 'global\\_facts', 'high\\_school\\_biology', 'high\\_school\\_chemistry', 'high\\_school\\_computer\\_science', 'high\\_school\\_european\\_history', 'high\\_school\\_geography', 'high\\_school\\_government\\_and\\_politics', 'high\\_school\\_macroeconomics', 'high\\_school\\_mathematics', 'high\\_school\\_microeconomics', 'high\\_school\\_physics', 'high\\_school\\_psychology', 'high\\_school\\_statistics', 'high\\_school\\_us\\_history', 'high\\_school\\_world\\_history', 'human\\_aging', 'human\\_sexuality', 'international\\_law', 'jurisprudence', 'logical\\_fallacies', 'machine\\_learning', 'management', 'marketing', 'medical\\_genetics', 'miscellaneous', 'moral\\_disputes', 'moral\\_scenarios', 'nutrition', 'philosophy', 'prehistory', 'professional\\_accounting', 'professional\\_law', 'professional\\_medicine', 'professional\\_psychology', 'public\\_relations', 'security\\_studies', 'sociology', 'us\\_foreign\\_policy', 'virology', 'world\\_religions']", "### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nAn example from anatomy subtask looks as follows:", "### Data Fields\n\n\n* 'question': a string feature\n* 'choices': a list of 4 string features\n* 'answer': a ClassLabel feature", "### Data Splits\n\n\n* 'auxiliary\\_train': auxiliary multiple-choice training questions from ARC, MC\\_TEST, OBQA, RACE, etc.\n* 'dev': 5 examples per subtask, meant for few-shot setting\n* 'test': there are at least 100 examples per subtask", "### Licensing Information\n\n\nMIT License\n\n\n'''", "### Contributions\n\n\nThanks to @andyzoujm for adding this dataset." ]
72fb507d4758e06ff77fa48c015c2a7d2f5370f2
# SCP Foundation Texts **Overview:** This dataset is a collection of information scraped from the SCP Foundation website, a fictional research organization known for its collaborative web-based creative project. Specifically, this dataset focuses on rated SCP titles in Russian, offering insights into the content and community ratings associated with various SCP objects. **Licensing Rules:** All derivative projects based on this dataset are distributed under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA 3.0). This license mandates that any new works inherit the same license and include proper attribution to the [SCP Foundation website](https://scpfoundation.net/) and the original authors. **SCP Foundation:** The SCP Foundation, an acronym for Secure, Contain, Protect, is a fictional research organization. It is the focus of a collaborative web-based creative project, known for its extensive collection of articles describing the containment procedures for over 8000 anomalous objects (SCP objects). The project also features numerous artistic narratives within the SCP Foundation universe. **Dataset Content:** - **Scope:** Rated SCP titles in Russian - **Insights:** Community ratings associated with various SCP objects **License Summary:** - **License Type:** Creative Commons Attribution-ShareAlike 3.0 Unported (CC-BY-SA 3.0) - **License Details:** [CC-BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/) - **Source Attribution:** Ensure proper attribution with links to the [SCP Foundation website](https://scpfoundation.net/) and acknowledgment of original authors. ### Collaborators - [Hugging Face - Kellkubrick](https://huggingface.co/kellkubrick) --- ### Citation ``` @MISC{igoktech/scp_ru, author = {Nikolas Ivanov, Igor Kuzmin}, title = {SCP Foundation Rated Titles (Russian)}, url = {https://huggingface.co/datasets/igoktech/scp_ru}, year = 2024 } ```
igorktech/scp_ru
[ "task_categories:text-generation", "task_categories:text-classification", "size_categories:1K<n<10K", "language:ru", "license:cc-by-sa-3.0", "not-for-all-audiences", "region:us" ]
2024-01-19T10:49:49+00:00
{"language": ["ru"], "license": "cc-by-sa-3.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "text-classification"], "pretty_name": "SCP", "tags": ["not-for-all-audiences"]}
2024-01-19T11:11:35+00:00
[]
[ "ru" ]
TAGS #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-Russian #license-cc-by-sa-3.0 #not-for-all-audiences #region-us
# SCP Foundation Texts Overview: This dataset is a collection of information scraped from the SCP Foundation website, a fictional research organization known for its collaborative web-based creative project. Specifically, this dataset focuses on rated SCP titles in Russian, offering insights into the content and community ratings associated with various SCP objects. Licensing Rules: All derivative projects based on this dataset are distributed under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA 3.0). This license mandates that any new works inherit the same license and include proper attribution to the SCP Foundation website and the original authors. SCP Foundation: The SCP Foundation, an acronym for Secure, Contain, Protect, is a fictional research organization. It is the focus of a collaborative web-based creative project, known for its extensive collection of articles describing the containment procedures for over 8000 anomalous objects (SCP objects). The project also features numerous artistic narratives within the SCP Foundation universe. Dataset Content: - Scope: Rated SCP titles in Russian - Insights: Community ratings associated with various SCP objects License Summary: - License Type: Creative Commons Attribution-ShareAlike 3.0 Unported (CC-BY-SA 3.0) - License Details: CC-BY-SA 3.0 - Source Attribution: Ensure proper attribution with links to the SCP Foundation website and acknowledgment of original authors. ### Collaborators - Hugging Face - Kellkubrick ---
[ "# SCP Foundation Texts\n\nOverview:\n\nThis dataset is a collection of information scraped from the SCP Foundation website, a fictional research organization known for its collaborative web-based creative project. Specifically, this dataset focuses on rated SCP titles in Russian, offering insights into the content and community ratings associated with various SCP objects.\n\nLicensing Rules:\n\nAll derivative projects based on this dataset are distributed under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA 3.0). This license mandates that any new works inherit the same license and include proper attribution to the SCP Foundation website and the original authors.\n\nSCP Foundation:\n\nThe SCP Foundation, an acronym for Secure, Contain, Protect, is a fictional research organization. It is the focus of a collaborative web-based creative project, known for its extensive collection of articles describing the containment procedures for over 8000 anomalous objects (SCP objects). The project also features numerous artistic narratives within the SCP Foundation universe.\n\nDataset Content:\n\n- Scope: Rated SCP titles in Russian\n- Insights: Community ratings associated with various SCP objects\n\nLicense Summary:\n\n- License Type: Creative Commons Attribution-ShareAlike 3.0 Unported (CC-BY-SA 3.0)\n- License Details: CC-BY-SA 3.0\n- Source Attribution: Ensure proper attribution with links to the SCP Foundation website and acknowledgment of original authors.", "### Collaborators\n\n- Hugging Face - Kellkubrick\n\n---" ]
[ "TAGS\n#task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-Russian #license-cc-by-sa-3.0 #not-for-all-audiences #region-us \n", "# SCP Foundation Texts\n\nOverview:\n\nThis dataset is a collection of information scraped from the SCP Foundation website, a fictional research organization known for its collaborative web-based creative project. Specifically, this dataset focuses on rated SCP titles in Russian, offering insights into the content and community ratings associated with various SCP objects.\n\nLicensing Rules:\n\nAll derivative projects based on this dataset are distributed under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA 3.0). This license mandates that any new works inherit the same license and include proper attribution to the SCP Foundation website and the original authors.\n\nSCP Foundation:\n\nThe SCP Foundation, an acronym for Secure, Contain, Protect, is a fictional research organization. It is the focus of a collaborative web-based creative project, known for its extensive collection of articles describing the containment procedures for over 8000 anomalous objects (SCP objects). The project also features numerous artistic narratives within the SCP Foundation universe.\n\nDataset Content:\n\n- Scope: Rated SCP titles in Russian\n- Insights: Community ratings associated with various SCP objects\n\nLicense Summary:\n\n- License Type: Creative Commons Attribution-ShareAlike 3.0 Unported (CC-BY-SA 3.0)\n- License Details: CC-BY-SA 3.0\n- Source Attribution: Ensure proper attribution with links to the SCP Foundation website and acknowledgment of original authors.", "### Collaborators\n\n- Hugging Face - Kellkubrick\n\n---" ]
dcc5c361a4e248c889b98d8b99c68a34875baa19
# Italian version of the BBH Dataset Dataset based on the Italian translation provided by: - **Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas** - [Teasing LLMs adapted to Italian](https://github.com/LeonardRanaldi/italian-instruct-eval/tree/main) # Citations ``` @article{suzgun2022challenging, title={Challenging BIG-Bench Tasks and Whether Chain-of-Thought Can Solve Them}, author={Suzgun, Mirac and Scales, Nathan and Sch{\"a}rli, Nathanael and Gehrmann, Sebastian and Tay, Yi and Chung, Hyung Won and Chowdhery, Aakanksha and Le, Quoc V and Chi, Ed H and Zhou, Denny and Wei, Jason}, journal={arXiv preprint arXiv:2210.09261}, year={2022} } @inproceedings{RanaldiPRZF23, author = {Leonardo Ranaldi and Giulia Pucci and Elena Sofia Ruzzetti and Fabio Massimo Zanzotto and Andr{\'{e}} Freitas}, title = {Teasing LLMs Adapted to Italian}, booktitle = {Proceedings of the 9th Italian Conference on Computational Linguistics, Venice, Italy, November 30 - December 2, 2023}, series = {{CEUR} Workshop Proceedings}, volume = {3596}, publisher = {CEUR-WS.org}, year = {2023}, url = {https://ceur-ws.org/Vol-3596/short18.pdf}, timestamp = {Tue, 02 Jan 2024 17:44:44 +0100}, } @misc{basile2023llamantino, title={LLaMAntino: LLaMA 2 Models for Effective Text Generation in Italian Language}, author={Pierpaolo Basile and Elio Musacchio and Marco Polignano and Lucia Siciliani and Giuseppe Fiameni and Giovanni Semeraro}, year={2023}, eprint={2312.09993}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` # DESCRIPTION BBH focuses on a suite of 23 challenging BIG-Bench tasks which we call BIG-Bench Hard (BBH). These are the tasks for which prior language model evaluations did not outperform the average human-rater. We find that applying chain-of-thought (CoT) prompting to BBH tasks enables PaLM to surpass the average human-rater performance on 10 of the 23 tasks, and Codex (code-davinci-002) to surpass the average human-rater performance on 17 of the 23 tasks. Since many tasks in BBH require multi-step reasoning, few-shot prompting without CoT, as done in the BIG-Bench evaluations (Srivastava et al., 2022), substantially underestimates the best performance and capabilities of language models, which is better captured via CoT prompting. As further analysis, we explore the interaction between CoT and model scale on BBH, finding that CoT enables emergent task performance on several BBH tasks with otherwise flat scaling curves. # HOMEPAGE URL: [https://github.com/suzgunmirac/BIG-Bench-Hard](https://github.com/suzgunmirac/BIG-Bench-Hard)
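As a usage sketch, one task can be loaded as follows. The repo id ("swap-uniba/bbh_ita"), the configuration name, the "test" split, and the 'input'/'target' fields are taken from the dataset metadata in this record; double-check them against the Hub before relying on this.

```python
from datasets import load_dataset

# Sketch: load one BBH-ita task; config names follow the metadata above
# (e.g. "boolean_expressions", each with a single "test" split).
bbh_ita = load_dataset("swap-uniba/bbh_ita", "boolean_expressions", split="test")

example = bbh_ita[0]
print(example["input"])   # the task prompt (translated into Italian)
print(example["target"])  # the expected answer
```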
swap-uniba/bbh_ita
[ "task_categories:question-answering", "task_categories:text-generation", "language:it", "llms", "italian", "llamantino", "arxiv:2312.09993", "region:us" ]
2024-01-19T11:05:15+00:00
{"language": ["it"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "BBH dataset Italian ", "dataset_info": [{"config_name": "boolean_expressions", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 11790, "num_examples": 250}], "download_size": 17172, "dataset_size": 11790}, {"config_name": "causal_judgement", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 198021, "num_examples": 187}], "download_size": 202943, "dataset_size": 198021}, {"config_name": "date_understanding", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 54666, "num_examples": 250}], "download_size": 61760, "dataset_size": 54666}, {"config_name": "disambiguation_qa", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 78620, "num_examples": 250}], "download_size": 85255, "dataset_size": 78620}, {"config_name": "dyck_languages", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 38432, "num_examples": 250}], "download_size": 43814, "dataset_size": 38432}, {"config_name": "formal_fallacies", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 138224, "num_examples": 250}], "download_size": 145562, "dataset_size": 138224}, {"config_name": "geometric_shapes", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 68560, "num_examples": 250}], "download_size": 77242, "dataset_size": 68560}, {"config_name": "hyperbaton", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 38574, "num_examples": 250}], "download_size": 44706, "dataset_size": 38574}, {"config_name": "logical_deduction_five_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 148595, "num_examples": 250}], "download_size": 155477, "dataset_size": 148595}, {"config_name": "logical_deduction_seven_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 191022, "num_examples": 250}], "download_size": 198404, "dataset_size": 191022}, {"config_name": "logical_deduction_three_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 105831, "num_examples": 250}], "download_size": 112213, "dataset_size": 105831}, {"config_name": "movie_recommendation", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 50985, "num_examples": 250}], "download_size": 57684, "dataset_size": 50985}, {"config_name": "multistep_arithmetic_two", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 12943, "num_examples": 250}], "download_size": 18325, "dataset_size": 12943}, {"config_name": "navigate", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", 
"num_bytes": 49031, "num_examples": 250}], "download_size": 55163, "dataset_size": 49031}, {"config_name": "object_counting", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 30508, "num_examples": 250}], "download_size": 35890, "dataset_size": 30508}, {"config_name": "penguins_in_a_table", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 70062, "num_examples": 146}], "download_size": 74516, "dataset_size": 70062}, {"config_name": "reasoning_about_colored_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 89579, "num_examples": 250}], "download_size": 98694, "dataset_size": 89579}, {"config_name": "ruin_names", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 46537, "num_examples": 250}], "download_size": 53178, "dataset_size": 46537}, {"config_name": "salient_translation_error_detection", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 277110, "num_examples": 250}], "download_size": 286443, "dataset_size": 277110}, {"config_name": "snarks", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 38223, "num_examples": 178}], "download_size": 42646, "dataset_size": 38223}, {"config_name": "sports_understanding", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 22723, "num_examples": 250}], "download_size": 28617, "dataset_size": 22723}, {"config_name": "temporal_sequences", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 139546, "num_examples": 250}], "download_size": 148176, "dataset_size": 139546}, {"config_name": "tracking_shuffled_objects_five_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 162590, "num_examples": 250}], "download_size": 169722, "dataset_size": 162590}, {"config_name": "tracking_shuffled_objects_seven_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 207274, "num_examples": 250}], "download_size": 214906, "dataset_size": 207274}, {"config_name": "tracking_shuffled_objects_three_objects", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 122104, "num_examples": 250}], "download_size": 128736, "dataset_size": 122104}, {"config_name": "web_of_lies", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 47582, "num_examples": 250}], "download_size": 52964, "dataset_size": 47582}, {"config_name": "word_sorting", "features": [{"name": "input", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 60918, "num_examples": 250}], "download_size": 66300, "dataset_size": 60918}], "tags": ["llms", "italian", "llamantino"]}
2024-01-19T11:16:48+00:00
[ "2312.09993" ]
[ "it" ]
TAGS #task_categories-question-answering #task_categories-text-generation #language-Italian #llms #italian #llamantino #arxiv-2312.09993 #region-us
# Italian version of the BBH Dataset Dataset based on the Italian translation provided by: - Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas - Teasing LLMs adapted to Italian # DESCRIPTION BBH focuses on a suite of 23 challenging BIG-Bench tasks which we call BIG-Bench Hard (BBH). These are the tasks for which prior language model evaluations did not outperform the average human-rater. We find that applying chain-of-thought (CoT) prompting to BBH tasks enables PaLM to surpass the average human-rater performance on 10 of the 23 tasks, and Codex (code-davinci-002) to surpass the average human-rater performance on 17 of the 23 tasks. Since many tasks in BBH require multi-step reasoning, few-shot prompting without CoT, as done in the BIG-Bench evaluations (Srivastava et al., 2022), substantially underestimates the best performance and capabilities of language models, which is better captured via CoT prompting. As further analysis, we explore the interaction between CoT and model scale on BBH, finding that CoT enables emergent task performance on several BBH tasks with otherwise flat scaling curves. # HOMEPAGE URL: URL
[ "# Italian version of the BHH Dataset\nDataset based on the Italian translation provided by:\n\n - Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas - Teasing LLMs adapted to Italian\n \n s", "# DESCRIPTION \nBBH focuses on a suite of 23 challenging BIG-Bench tasks which we call BIG-Bench Hard (BBH). These are the task for which prior language model evaluations did not outperform the average human-rater. We find that applying chain-of-thought (CoT) prompting to BBH tasks enables PaLM to surpass the average humanrater performance on 10 of the 23 tasks, and Codex (code-davinci-002) to surpass the average human-rater performance on 17 of the 23 tasks. Since many tasks in BBH require multi-step reasoning, few-shot prompting without CoT, as done in the BIG-Bench evaluations (Srivastava et al., 2022), substantially underestimates the best performance and capabilities of language models, which is better captured via CoT prompting. As further analysis, we explore the interaction between CoT and model scale on BBH, finding that CoT enables emergent task performance on several BBH tasks with otherwise flat scaling curves.", "# HOMEPAGE \nURL: URL" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #language-Italian #llms #italian #llamantino #arxiv-2312.09993 #region-us \n", "# Italian version of the BHH Dataset\nDataset based on the Italian translation provided by:\n\n - Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas - Teasing LLMs adapted to Italian\n \n s", "# DESCRIPTION \nBBH focuses on a suite of 23 challenging BIG-Bench tasks which we call BIG-Bench Hard (BBH). These are the task for which prior language model evaluations did not outperform the average human-rater. We find that applying chain-of-thought (CoT) prompting to BBH tasks enables PaLM to surpass the average humanrater performance on 10 of the 23 tasks, and Codex (code-davinci-002) to surpass the average human-rater performance on 17 of the 23 tasks. Since many tasks in BBH require multi-step reasoning, few-shot prompting without CoT, as done in the BIG-Bench evaluations (Srivastava et al., 2022), substantially underestimates the best performance and capabilities of language models, which is better captured via CoT prompting. As further analysis, we explore the interaction between CoT and model scale on BBH, finding that CoT enables emergent task performance on several BBH tasks with otherwise flat scaling curves.", "# HOMEPAGE \nURL: URL" ]
47bb45d53168b3ed846f38f1190c30e89bc63a50
Dataset split into train and test (original dataset -> "b-mc2/sql-create-context")
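A brief sketch of how such a roughly 90/10 split could be reproduced from the original dataset; the exact test fraction and seed used for this repository are not documented here, so both are assumptions.

```python
from datasets import load_dataset

# Sketch: recreate a train/test split from the original b-mc2/sql-create-context data.
# The 0.1 test fraction (roughly matching the ~70.7k/7.9k sizes listed in the metadata)
# and the seed are assumptions, not the documented procedure.
original = load_dataset("b-mc2/sql-create-context", split="train")
splits = original.train_test_split(test_size=0.1, seed=42)
print(splits["train"].num_rows, splits["test"].num_rows)
```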
Federic/Dataset-SQL
[ "region:us" ]
2024-01-19T11:23:22+00:00
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15623876.121307762, "num_examples": 70719}, {"name": "test", "num_bytes": 1736059.8786922381, "num_examples": 7858}], "download_size": 8588232, "dataset_size": 17359936.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-26T09:58:42+00:00
[]
[]
TAGS #region-us
Dataset split into train and test (original dataset -> "b-mc2/sql-create-context")
[]
[ "TAGS\n#region-us \n" ]
3ca963a203be2d47ed4d9acf78ecf5ee95343091
Dataset with a 'text' column that contains the prompt in "meta-llama" format. The prompt is built from the 'context', 'question' and 'answer' columns.
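As an illustration of what a "meta-llama"-style prompt built from these three columns might look like, here is a minimal sketch. The exact template used to produce the 'text' column is not documented in this card, so the [INST] wrapper, the instruction wording, and the sample row below are assumptions; inspect the 'text' column for the real format.

```python
# Sketch of a Llama-2-style prompt assembled from 'context', 'question' and 'answer'.
# The template wording is an assumption, not the dataset's documented format.
def build_prompt(context: str, question: str, answer: str) -> str:
    return (
        "<s>[INST] Given the following database schema:\n"
        f"{context}\n"
        f"Write the SQL query that answers: {question} [/INST] "
        f"{answer}</s>"
    )

# Illustrative row in the style of b-mc2/sql-create-context.
row = {
    "context": "CREATE TABLE head (age INTEGER)",
    "question": "How many heads of the departments are older than 56?",
    "answer": "SELECT COUNT(*) FROM head WHERE age > 56",
}
print(build_prompt(row["context"], row["question"], row["answer"]))
```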
Federic/Dataset-SQL-prompt
[ "region:us" ]
2024-01-19T11:23:35+00:00
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 47922450, "num_examples": 70719}, {"name": "test", "num_bytes": 5341594, "num_examples": 7858}], "download_size": 17737371, "dataset_size": 53264044}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-26T09:58:48+00:00
[]
[]
TAGS #region-us
Dataset with a 'text' column that contains the prompt in "meta-llama" format. The prompt is built from the 'context', 'question' and 'answer' columns.
[]
[ "TAGS\n#region-us \n" ]
eea17ab7109b5e837f9511795fd839546cd66718
# Dataset Card for Evaluation run of abacusai/MM-Orc-Vic-bagel-34b-c1000 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [abacusai/MM-Orc-Vic-bagel-34b-c1000](https://huggingface.co/abacusai/MM-Orc-Vic-bagel-34b-c1000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abacusai__MM-Orc-Vic-bagel-34b-c1000", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T11:39:51.323919](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MM-Orc-Vic-bagel-34b-c1000/blob/main/results_2024-01-19T11-39-51.323919.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7572348478258766, "acc_stderr": 0.028456971103197146, "acc_norm": 0.7607060395875296, "acc_norm_stderr": 0.029006282698865283, "mc1": 0.45532435740514077, "mc1_stderr": 0.01743349010253877, "mc2": 0.6057208212416958, "mc2_stderr": 0.014801567997845542 }, "harness|arc:challenge|25": { "acc": 0.636518771331058, "acc_stderr": 0.014056207319068283, "acc_norm": 0.6732081911262798, "acc_norm_stderr": 0.013706665975587331 }, "harness|hellaswag|10": { "acc": 0.6358295160326628, "acc_stderr": 0.004802133511654242, "acc_norm": 0.8351921927902808, "acc_norm_stderr": 0.0037024876621269487 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7407407407407407, "acc_stderr": 0.03785714465066653, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.03785714465066653 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8881578947368421, "acc_stderr": 0.02564834125169361, "acc_norm": 0.8881578947368421, "acc_norm_stderr": 0.02564834125169361 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8037735849056604, "acc_stderr": 0.024442388131100813, "acc_norm": 0.8037735849056604, "acc_norm_stderr": 0.024442388131100813 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9097222222222222, "acc_stderr": 0.023964965777906935, "acc_norm": 0.9097222222222222, "acc_norm_stderr": 0.023964965777906935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, 
"acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7341040462427746, "acc_stderr": 0.033687629322594316, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.033687629322594316 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5882352941176471, "acc_stderr": 0.048971049527263666, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7659574468085106, "acc_stderr": 0.02767845257821238, "acc_norm": 0.7659574468085106, "acc_norm_stderr": 0.02767845257821238 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5701754385964912, "acc_stderr": 0.04657047260594964, "acc_norm": 0.5701754385964912, "acc_norm_stderr": 0.04657047260594964 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7241379310344828, "acc_stderr": 0.03724563619774631, "acc_norm": 0.7241379310344828, "acc_norm_stderr": 0.03724563619774631 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7301587301587301, "acc_stderr": 0.022860838309232072, "acc_norm": 0.7301587301587301, "acc_norm_stderr": 0.022860838309232072 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5317460317460317, "acc_stderr": 0.04463112720677173, "acc_norm": 0.5317460317460317, "acc_norm_stderr": 0.04463112720677173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8935483870967742, "acc_stderr": 0.01754510295165663, "acc_norm": 0.8935483870967742, "acc_norm_stderr": 0.01754510295165663 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6896551724137931, "acc_stderr": 0.03255086769970104, "acc_norm": 0.6896551724137931, "acc_norm_stderr": 0.03255086769970104 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8484848484848485, "acc_stderr": 0.027998073798781657, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.027998073798781657 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9343434343434344, "acc_stderr": 0.01764652667723333, "acc_norm": 0.9343434343434344, "acc_norm_stderr": 0.01764652667723333 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9637305699481865, "acc_stderr": 0.013492659751295131, "acc_norm": 0.9637305699481865, "acc_norm_stderr": 0.013492659751295131 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8102564102564103, "acc_stderr": 0.019880165406588803, "acc_norm": 0.8102564102564103, "acc_norm_stderr": 0.019880165406588803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.43703703703703706, "acc_stderr": 0.030242862397654002, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.030242862397654002 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8571428571428571, "acc_stderr": 0.02273020811930655, "acc_norm": 0.8571428571428571, "acc_norm_stderr": 0.02273020811930655 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.4900662251655629, "acc_stderr": 0.04081677107248436, "acc_norm": 0.4900662251655629, "acc_norm_stderr": 0.04081677107248436 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9137614678899083, "acc_stderr": 0.012035597300116245, "acc_norm": 0.9137614678899083, "acc_norm_stderr": 0.012035597300116245 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6527777777777778, "acc_stderr": 0.032468872436376486, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9068627450980392, "acc_stderr": 0.020397853969427, "acc_norm": 0.9068627450980392, "acc_norm_stderr": 0.020397853969427 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.01926932302564027, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.01926932302564027 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.026936111912802273, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.026936111912802273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8842975206611571, "acc_stderr": 0.029199802455622793, "acc_norm": 0.8842975206611571, "acc_norm_stderr": 0.029199802455622793 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8888888888888888, "acc_stderr": 0.030381596756651655, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.030381596756651655 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.026845765054553855, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.026845765054553855 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.883495145631068, "acc_stderr": 0.03176683948640406, "acc_norm": 0.883495145631068, "acc_norm_stderr": 0.03176683948640406 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9316239316239316, "acc_stderr": 0.016534627684311357, "acc_norm": 0.9316239316239316, "acc_norm_stderr": 0.016534627684311357 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.89, "acc_stderr": 0.03144660377352202, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352202 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.909323116219668, "acc_stderr": 0.010268429662528543, "acc_norm": 0.909323116219668, "acc_norm_stderr": 0.010268429662528543 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8092485549132948, "acc_stderr": 0.02115267696657528, "acc_norm": 0.8092485549132948, "acc_norm_stderr": 0.02115267696657528 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.8033519553072626, "acc_stderr": 0.013293183027454653, "acc_norm": 0.8033519553072626, "acc_norm_stderr": 0.013293183027454653 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8496732026143791, "acc_stderr": 0.02046417512433263, "acc_norm": 0.8496732026143791, "acc_norm_stderr": 0.02046417512433263 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.797427652733119, "acc_stderr": 0.02282731749105969, "acc_norm": 0.797427652733119, "acc_norm_stderr": 0.02282731749105969 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8611111111111112, "acc_stderr": 0.019242526226544543, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.019242526226544543 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.6312056737588653, "acc_stderr": 0.028782227561347257, "acc_norm": 0.6312056737588653, "acc_norm_stderr": 0.028782227561347257 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5925684485006519, "acc_stderr": 0.012549473714212223, "acc_norm": 0.5925684485006519, "acc_norm_stderr": 0.012549473714212223 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8198529411764706, "acc_stderr": 0.02334516361654484, "acc_norm": 0.8198529411764706, "acc_norm_stderr": 0.02334516361654484 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8104575163398693, "acc_stderr": 0.01585615218998026, "acc_norm": 0.8104575163398693, "acc_norm_stderr": 0.01585615218998026 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8326530612244898, "acc_stderr": 0.02389714476891452, "acc_norm": 0.8326530612244898, "acc_norm_stderr": 0.02389714476891452 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.02876234912646613, "acc_norm": 0.91, "acc_norm_stderr": 0.02876234912646613 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598052, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8713450292397661, "acc_stderr": 0.025679342723276908, "acc_norm": 0.8713450292397661, "acc_norm_stderr": 0.025679342723276908 }, "harness|truthfulqa:mc|0": { "mc1": 0.45532435740514077, "mc1_stderr": 0.01743349010253877, "mc2": 0.6057208212416958, "mc2_stderr": 0.014801567997845542 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918756 }, "harness|gsm8k|5": { "acc": 0.7225170583775588, "acc_stderr": 0.012333447581047546 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_abacusai__MM-Orc-Vic-bagel-34b-c1000
[ "region:us" ]
2024-01-19T11:42:08+00:00
{"pretty_name": "Evaluation run of abacusai/MM-Orc-Vic-bagel-34b-c1000", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/MM-Orc-Vic-bagel-34b-c1000](https://huggingface.co/abacusai/MM-Orc-Vic-bagel-34b-c1000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__MM-Orc-Vic-bagel-34b-c1000\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T11:39:51.323919](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MM-Orc-Vic-bagel-34b-c1000/blob/main/results_2024-01-19T11-39-51.323919.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7572348478258766,\n \"acc_stderr\": 0.028456971103197146,\n \"acc_norm\": 0.7607060395875296,\n \"acc_norm_stderr\": 0.029006282698865283,\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6057208212416958,\n \"mc2_stderr\": 0.014801567997845542\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.636518771331058,\n \"acc_stderr\": 0.014056207319068283,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587331\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6358295160326628,\n \"acc_stderr\": 0.004802133511654242,\n \"acc_norm\": 0.8351921927902808,\n \"acc_norm_stderr\": 0.0037024876621269487\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100813,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100813\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.02767845257821238,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.02767845257821238\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774631,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774631\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7301587301587301,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.7301587301587301,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6896551724137931,\n \"acc_stderr\": 0.03255086769970104,\n \"acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03255086769970104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295131,\n \"acc_norm\": 0.9637305699481865,\n 
\"acc_norm_stderr\": 0.013492659751295131\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588803,\n \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930655,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564027,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564027\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.016534627684311357,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.016534627684311357\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n \"acc_stderr\": 0.010268429662528543,\n \"acc_norm\": 0.909323116219668,\n \"acc_norm_stderr\": 0.010268429662528543\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8033519553072626,\n \"acc_stderr\": 0.013293183027454653,\n \"acc_norm\": 0.8033519553072626,\n \"acc_norm_stderr\": 0.013293183027454653\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n \"acc_stderr\": 0.02282731749105969,\n \"acc_norm\": 0.797427652733119,\n \"acc_norm_stderr\": 0.02282731749105969\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544543,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544543\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6312056737588653,\n \"acc_stderr\": 0.028782227561347257,\n \"acc_norm\": 0.6312056737588653,\n \"acc_norm_stderr\": 0.028782227561347257\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5925684485006519,\n \"acc_stderr\": 0.012549473714212223,\n \"acc_norm\": 0.5925684485006519,\n \"acc_norm_stderr\": 0.012549473714212223\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8104575163398693,\n \"acc_stderr\": 0.01585615218998026,\n \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.01585615218998026\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6057208212416958,\n \"mc2_stderr\": 0.014801567997845542\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918756\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7225170583775588,\n \"acc_stderr\": 0.012333447581047546\n }\n}\n```", "repo_url": "https://huggingface.co/abacusai/MM-Orc-Vic-bagel-34b-c1000", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|arc:challenge|25_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|gsm8k|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hellaswag|10_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T11-39-51.323919.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T11-39-51.323919.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T11-39-51.323919.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T11-39-51.323919.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T11-39-51.323919.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["**/details_harness|winogrande|5_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T11-39-51.323919.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T11_39_51.323919", "path": ["results_2024-01-19T11-39-51.323919.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T11-39-51.323919.parquet"]}]}]}
2024-01-19T11:42:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abacusai/MM-Orc-Vic-bagel-34b-c1000 Dataset automatically created during the evaluation run of model abacusai/MM-Orc-Vic-bagel-34b-c1000 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T11:39:51.323919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of abacusai/MM-Orc-Vic-bagel-34b-c1000\n\n\n\nDataset automatically created during the evaluation run of model abacusai/MM-Orc-Vic-bagel-34b-c1000 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T11:39:51.323919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abacusai/MM-Orc-Vic-bagel-34b-c1000\n\n\n\nDataset automatically created during the evaluation run of model abacusai/MM-Orc-Vic-bagel-34b-c1000 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T11:39:51.323919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2be11cba462f5f67d8a28f07cdff58c8bab646f4
<img src="front_urbanSyn.png" width=100% /> # UrbanSyn Dataset UrbanSyn is an open synthetic dataset featuring photorealistic driving scenes. It contains ground-truth annotations for semantic segmentation, scene depth, panoptic instance segmentation, and 2-D bounding boxes. Website [https://urbansyn.org](https://urbansyn.org) ## Overview UrbanSyn is a diverse, compact, and photorealistic dataset that provides more than 7.5k synthetic annotated images. It was born to address the synth-to-real domain gap, contributing to unprecedented synthetic-only baselines used by domain adaptation (DA) methods. **- Reduce the synth-to-real domain gap** UrbanSyn dataset helps to reduce the domain gap by contributing to unprecedented synthetic-only baselines used by domain adaptation (DA) methods. **- Ground-truth annotations** UrbanSyn comes with photorealistic color images, per-pixel semantic segmentation, depth, instance panoptic segmentation, and 2-D bounding boxes. **- Open for research and commercial purposes** UrbanSyn may be used for research and commercial purposes. It is released publicly under the Creative Commons Attribution-Commercial-ShareAlike 4.0 license. **- High-degree of photorealism** UrbanSyn features highly realistic and curated driving scenarios leveraging procedurally-generated content and high-quality curated assets. To achieve UrbanSyn photorealism we leverage industry-standard unbiased path-tracing and AI-based denoising techniques. ## White Paper **[[Arxiv]](https://arxiv.org/abs/2312.12176)** When using or referring to the UrbanSyn dataset in your research, please cite our white paper: ```BibTeX @misc{gomez2023one, title={All for One, and One for All: UrbanSyn Dataset, the third Musketeer of Synthetic Driving Scenes}, author={Jose L. Gómez and Manuel Silva and Antonio Seoane and Agnès Borrás and Mario Noriega and Germán Ros and Jose A. Iglesias-Guitian and Antonio M. López}, year={2023}, eprint={2312.12176}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ## Terms of Use The UrbanSyn Dataset is provided by the Computer Vision Center (UAB) and CITIC (University of A Coruña). UrbanSyn may be used for research and commercial purposes, and it is subject to the Creative Commons Attribution-Commercial-ShareAlike 4.0. A summary of the CC-BY-SA 4.0 licensing terms can be found **[[here]](https://creativecommons.org/licenses/by-sa/4.0/deed.en)**. Due to constraints from our asset providers for UrbanSyn, we prohibit the use of generative AI technologies for reverse engineering any assets or creating content for stock media platforms based on the UrbanSyn dataset. While we strive to generate precise data, all information is presented 'as is' without any express or implied warranties. We explicitly disclaim all representations and warranties regarding the validity, scope, accuracy, completeness, safety, or utility of the licensed content, including any implied warranties of merchantability, fitness for a particular purpose, or otherwise. ## Acknowledgements Funded by Grant agreement PID2020-115734RB-C21 "SSL-ADA" and Grant agreement PID2020-115734RB-C22 "PGAS-ADA" <img src="MICINN_Gob_AEI_1.jpg" width="300" /> ## For more information about our team members and how to contact us, visit our website [https://urbansyn.org](https://urbansyn.org) ## Folder structure and content - ```rgb```: contains RGB images with a resolution of 2048x1024 in PNG format. 
- ```ss and ss_colour```: contains the pixel-level semantic segmentation labels in grayscale (value = class ID) and colour (value = class RGB), respectively, in PNG format. We follow the 19 training classes defined in Cityscapes:

| name                 | trainId | color           |
|----------------------|---------|-----------------|
| 'road'               | 0       | (128, 64,128)   |
| 'sidewalk'           | 1       | (244, 35,232)   |
| 'building'           | 2       | ( 70, 70, 70)   |
| 'wall'               | 3       | (102,102,156)   |
| 'fence'              | 4       | (190,153,153)   |
| 'pole'               | 5       | (153,153,153)   |
| 'traffic light'      | 6       | (250,170, 30)   |
| 'traffic sign'       | 7       | (220,220, 0)    |
| 'vegetation'         | 8       | (107,142, 35)   |
| 'terrain'            | 9       | (152,251,152)   |
| 'sky'                | 10      | ( 70,130,180)   |
| 'person'             | 11      | (220, 20, 60)   |
| 'rider'              | 12      | (255, 0, 0)     |
| 'car'                | 13      | ( 0, 0,142)     |
| 'truck'              | 14      | ( 0, 0, 70)     |
| 'bus'                | 15      | ( 0, 60,100)    |
| 'train'              | 16      | ( 0, 80,100)    |
| 'motorcycle'         | 17      | ( 0, 0,230)     |
| 'bicycle'            | 18      | (119, 11, 32)   |
| 'unlabeled'          | 19      | ( 0, 0, 0)      |

- ```panoptic```: contains the instance segmentation of the dynamic objects of the image in PNG format. Each instance is encoded in the RGB channels, where RG corresponds to the instance number and B to the class ID (see the decoding sketch below). Dynamic objects are Person, Rider, Car, Truck, Bus, Train, Motorcycle and Bicycle.

- ```bbox2D```: contains the 2D bounding boxes and instance information for all the dynamic objects in the image within 110 meters of the camera and larger than 150 pixels. We provide the annotations in a JSON file with the following structure:
  - bbox: provides the bounding box, determined by the top-left corner (xMin, yMin) and the bottom-right corner (xMax, yMax).
  - color: corresponds to the colour of the instance in the panoptic instance segmentation map inside the panoptic folder.
  - label: defines the class name.
  - occlusion_percentage: provides the occlusion percentage of the object, where 0 is not occluded and 100 is fully occluded.

- ```depth```: contains the depth map of the image in EXR format.

## Download locally with huggingface_hub library

- [Install the huggingface_hub library](https://huggingface.co/docs/huggingface_hub/installation)
- You can download the dataset in Python as follows:

```
from huggingface_hub import snapshot_download

snapshot_download(repo_id="UrbanSyn/UrbanSyn", repo_type="dataset")
```

- More information about how to download and additional options can be found [here](https://huggingface.co/docs/huggingface_hub/guides/download)
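The ground-truth encodings described above are straightforward to read back. Below is a minimal sketch for decoding the semantic and panoptic maps with NumPy and Pillow; the file names are hypothetical, and treating R as the high byte of the instance number is an assumption, since the card only states that the R and G channels together encode it. Reading the EXR depth maps additionally requires an EXR-capable reader such as OpenEXR or imageio with an EXR plugin.

```python
# Minimal decoding sketch for UrbanSyn ground truth (file names are hypothetical).
import numpy as np
from PIL import Image

# Semantic segmentation: grayscale PNG, pixel value = trainId (0-18, 19 = unlabeled).
sem = np.array(Image.open("ss/sample_0001.png"))
car_mask = sem == 13  # 'car' has trainId 13

# Panoptic PNG: B channel = class ID; instance number packed into R and G.
# Assumption: R is the high byte -- the card only says "RG" encodes the instance number.
pan = np.array(Image.open("panoptic/sample_0001.png").convert("RGB"))
class_ids = pan[..., 2].astype(np.int32)
instance_ids = pan[..., 0].astype(np.int32) * 256 + pan[..., 1]

# Each (class ID, instance number) pair identifies one dynamic object, e.g. all cars:
car_instances = np.unique(instance_ids[class_ids == 13])
print(len(car_instances), "car instances in this frame")
```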
UrbanSyn/UrbanSyn
[ "task_categories:object-detection", "task_categories:image-segmentation", "task_categories:depth-estimation", "size_categories:1K<n<10K", "language:en", "license:cc-by-sa-4.0", "Urban Scenario", "Autonomous Driving", "Synthethic data", "arxiv:2312.12176", "region:us" ]
2024-01-19T11:45:22+00:00
{"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["object-detection", "image-segmentation", "depth-estimation"], "pretty_name": "UrbanSyn", "tags": ["Urban Scenario", "Autonomous Driving", "Synthethic data"]}
2024-02-05T07:31:58+00:00
[ "2312.12176" ]
[ "en" ]
TAGS #task_categories-object-detection #task_categories-image-segmentation #task_categories-depth-estimation #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #Urban Scenario #Autonomous Driving #Synthethic data #arxiv-2312.12176 #region-us
![](front_urbanSyn.png) UrbanSyn Dataset ================ UrbanSyn is an open synthetic dataset featuring photorealistic driving scenes. It contains ground-truth annotations for semantic segmentation, scene depth, panoptic instance segmentation, and 2-D bounding boxes. Website URL Overview -------- UrbanSyn is a diverse, compact, and photorealistic dataset that provides more than 7.5k synthetic annotated images. It was born to address the synth-to-real domain gap, contributing to unprecedented synthetic-only baselines used by domain adaptation (DA) methods. * Reduce the synth-to-real domain gap UrbanSyn dataset helps to reduce the domain gap by contributing to unprecedented synthetic-only baselines used by domain adaptation (DA) methods. * Ground-truth annotations UrbanSyn comes with photorealistic color images, per-pixel semantic segmentation, depth, instance panoptic segmentation, and 2-D bounding boxes. * Open for research and commercial purposes UrbanSyn may be used for research and commercial purposes. It is released publicly under the Creative Commons Attribution-Commercial-ShareAlike 4.0 license. * High-degree of photorealism UrbanSyn features highly realistic and curated driving scenarios leveraging procedurally-generated content and high-quality curated assets. To achieve UrbanSyn photorealism we leverage industry-standard unbiased path-tracing and AI-based denoising techniques. White Paper ----------- [[Arxiv]](URL When using or referring to the UrbanSyn dataset in your research, please cite our white paper: Terms of Use ------------ The UrbanSyn Dataset is provided by the Computer Vision Center (UAB) and CITIC (University of A Coruña). UrbanSyn may be used for research and commercial purposes, and it is subject to the Creative Commons Attribution-Commercial-ShareAlike 4.0. A summary of the CC-BY-SA 4.0 licensing terms can be found [[here]](URL Due to constraints from our asset providers for UrbanSyn, we prohibit the use of generative AI technologies for reverse engineering any assets or creating content for stock media platforms based on the UrbanSyn dataset. While we strive to generate precise data, all information is presented 'as is' without any express or implied warranties. We explicitly disclaim all representations and warranties regarding the validity, scope, accuracy, completeness, safety, or utility of the licensed content, including any implied warranties of merchantability, fitness for a particular purpose, or otherwise. Acknowledgements ---------------- Funded by Grant agreement PID2020-115734RB-C21 "SSL-ADA" and Grant agreement PID2020-115734RB-C22 "PGAS-ADA" ![](MICINN_Gob_AEI_1.jpg) For more information about our team members and how to contact us, visit our website URL ---------------------------------------------------------------------------------------- Folder structure and content ---------------------------- * : contains RGB images with a resolution of 2048x1024 in PNG format. * : contains the pixel-level semantic segmentation labels in grayscale (value = Class ID) and colour (value = Class RGB) respectively in PNG format. 
We follow the 19 training classes defined on Cityscapes: name: 'road', trainId: 0, color: (128, 64,128) name: 'sidewalk', trainId: 1, color: (244, 35,232) name: 'building', trainId: 2, color: ( 70, 70, 70) name: 'wall', trainId: 3, color: (102,102,156) name: 'fence', trainId: 4, color: (190,153,153) name: 'pole', trainId: 5, color: (153,153,153) name: 'traffic light', trainId: 6, color: (250,170, 30) name: 'traffic sign', trainId: 7, color: (220,220, 0) name: 'vegetation', trainId: 8, color: (107,142, 35) name: 'terrain', trainId: 9, color: (152,251,152) name: 'sky', trainId: 10, color: ( 70,130,180) name: 'person', trainId: 11, color: (220, 20, 60) name: 'rider', trainId: 12, color: (255, 0, 0) name: 'car', trainId: 13, color: ( 0, 0,142) name: 'truck', trainId: 14, color: ( 0, 0, 70) name: 'bus', trainId: 15, color: ( 0, 60,100) name: 'train', trainId: 16, color: ( 0, 80,100) name: 'motorcycle', trainId: 17, color: ( 0, 0,230) name: 'bicycle', trainId: 18, color: (119, 11, 32) name: 'unlabeled', trainId: 19, color: ( 0, 0, 0) * : contains the instance segmentation of the dynamic objects of the image in PNG format. Each instance is codified using the RGB channels, where RG corresponds to the instance number and B to the class ID. Dynamic objects are Person, Rider, Car, Truck, Bus, Train, Motorcycle and Bicycle. * : contains the 2D bounding boxes and Instances information for all the dynamic objects in the image up to 110 meters of distance from the camera and bigger than 150 pixels. We provide the annotations in a json file with the next structure: + bbox: provides the bounding box size determined by the top left corner (xMin, yMin) and Bottom right corner (xMax, YMax). + color: corresponds to the colour of the instance in the panoptic instance segmentation map inside panoptic folder. + label: defines the class name + occlusion\_percentage: provides the occlusion percentatge of the object. Being 0 not occluded and 100 fully occluded. * : contains the depth map of the image in EXR format. Download locally with huggingface\_hub library ---------------------------------------------- * Install huggingface\_hub library * You can download the dataset on Python this way: * More information about how to download and additional options can be found here
[]
[ "TAGS\n#task_categories-object-detection #task_categories-image-segmentation #task_categories-depth-estimation #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #Urban Scenario #Autonomous Driving #Synthethic data #arxiv-2312.12176 #region-us \n" ]
88f577186f9dc9b47ccdcf0812215cbc4d7849a0
# Dataset Card for "bus_few4_05x" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FanChen0116/bus_few4_05x
[ "region:us" ]
2024-01-19T11:47:05+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-from_location", "2": "B-from_location", "3": "B-leaving_date", "4": "I-leaving_date", "5": "I-to_location", "6": "B-to_location"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 6172, "num_examples": 35}, {"name": "validation", "num_bytes": 6900, "num_examples": 35}, {"name": "test", "num_bytes": 70618, "num_examples": 377}], "download_size": 0, "dataset_size": 83690}}
2024-01-19T12:50:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "bus_few4_05x" More Information needed
[ "# Dataset Card for \"bus_few4_05x\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"bus_few4_05x\"\n\nMore Information needed" ]
16f56e7f511235c68917d0789fa4424f6c63bcf6
# Dataset Card for "bus_few4_05x_empty" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FanChen0116/bus_few4_05x_empty
[ "region:us" ]
2024-01-19T11:47:59+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-from_location", "2": "B-from_location", "3": "B-leaving_date", "4": "I-leaving_date", "5": "I-to_location", "6": "B-to_location"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 5491, "num_examples": 35}, {"name": "validation", "num_bytes": 6128, "num_examples": 35}, {"name": "test", "num_bytes": 70618, "num_examples": 377}], "download_size": 0, "dataset_size": 82237}}
2024-01-19T12:50:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for "bus_few4_05x_empty" More Information needed
[ "# Dataset Card for \"bus_few4_05x_empty\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"bus_few4_05x_empty\"\n\nMore Information needed" ]
7b4ea4209d0ad6be34cb872ff1186873ead3ca28
# Dataset Card for "bus_few4_05x_pvi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FanChen0116/bus_few4_05x_pvi
[ "region:us" ]
2024-01-19T11:48:18+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-from_location", "2": "B-from_location", "3": "B-leaving_date", "4": "I-leaving_date", "5": "I-to_location", "6": "B-to_location"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 6172, "num_examples": 35}, {"name": "validation", "num_bytes": 6900, "num_examples": 35}, {"name": "test", "num_bytes": 70618, "num_examples": 377}], "download_size": 0, "dataset_size": 83690}}
2024-01-19T12:58:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for "bus_few4_05x_pvi" More Information needed
[ "# Dataset Card for \"bus_few4_05x_pvi\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"bus_few4_05x_pvi\"\n\nMore Information needed" ]
86d748cfee97aa690a0c2b079b6b7c4728bff6b9
# MoeSpeech [**日本語はこちら**](#moespeech日本語版readme-japanese-version) **このデータセットは、著作権法第三十条の四の情報解析(機械学習等)の目的でのみ使用が許可されています。それ以外の用途での使用は[ライセンス](https://huggingface.co/spaces/litagin/moe-speech-license)により禁止されています。** This dataset is only permitted for use under Article 30-4 of the Copyright Law of Japan for data analysis (such as machine learning) purposes. Any use for purposes other than those specified is prohibited by the [license](https://huggingface.co/spaces/litagin/moe-speech-license). ### Changelog Updates planned regularly: additions of characters, audio filtering, new audio files for existing characters, etc. - 2024-02-01: Added [Moe Speech Similarity Map](https://huggingface.co/spaces/litagin/moe-speech-similarity-map) - 2024-01-31: Version 0.4.1: Remove some more inappropriate audios (621 hours) - 2024-01-31: Version 0.4 - Manual removing and adding: (50 / 473) [**finished_uuids.txt**](finished_uuids.txt) - By Using [this model](https://github.com/litagin02/processed-speech-detector) and manual check, many (1.5k) inappropriate audios (processed to sound as if it's coming through a phone or from behind a wall) are removed. - 473 characters, 395k files, 622 hours, 184GB - 2024-01-27: Version 0.3 - Start manually removing audio and add new audio files for randomly chosen uuids, see [**finished_uuids.txt**](finished_uuids.txt) for the list of characters that have been processed so far (24 / 473). - Converted all wav files to 44.1kHz 16bit mono, so the file size is now much smaller. - 473 characters, 395k files, 623 hours, **184GB** - Add [samples](https://huggingface.co/datasets/litagin/moe-speech/tree/main/samples) folder, which contains one file per character. - 2024-01-25: **UPDATE LICENSE** (see [LICENSE](LICENSE.md)), changed to **prohibit redistribution** of the dataset. - 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB) - 2024-01-22: Version 0.1, Initial version (449 characters, 363k files, 581 hours) ## Dataset Description - **Point of Contact:** [litagin]([email protected]) ### Dataset Summary - A high-quality dataset of character acting speech audio by Japanese professional voice actors, recorded in a studio, free of noise and background music (both male and female characters). - Each audio file is a monaural 44.1kHz 16bit WAV file of 2-15 seconds. - The dataset is organized into folders for each character. Each character is anonymized and has a random 8-character alphanumeric identifier generated by `uuid.uuid4().hex[:8]`. - Currently, it includes a total of **473** characters, approximately **395k** audio files, and a total of about **623 hours** and **184GB** of audio. - The audio has been mechanically filtered for quality, suitable for tasks such as TTS. - See [Moe Speech Similarity Map](https://huggingface.co/spaces/litagin/moe-speech-similarity-map) for a visualization of the similarity between uuids. ![stats](stats.png) See [info.csv](info.csv) for more details. ### Supported Tasks and Leaderboards - May be useful for research and development of voice-related tasks such as voice conversion and character voice synthesis with rich emotions, especially focused on Japanese moe culture. - By performing transcriptions through voice recognition, the content of the dialogues might be usable for language models and similar applications. ### Languages The language of the audio in the dataset is predominantly Japanese, with the possibility of a very small number of English phrases. 
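The per-character statistics summarized above live in `info.csv`. Below is a minimal sketch of recomputing the aggregate figures with pandas, using the column names documented in the Dataset Structure section that follows, together with the identifier scheme mentioned in the summary; treat it as illustrative only.

```python
# Sketch: anonymized identifiers and aggregate statistics from info.csv
# (column names as documented in the Dataset Structure section below).
import uuid
import pandas as pd

char_id = uuid.uuid4().hex[:8]  # how character identifiers are described as being generated

info = pd.read_csv("info.csv")
print(len(info), "characters")
print(int(info["num_files"].sum()), "audio files")
print(round(info["total_duration_min"].sum() / 60, 1), "hours of audio in total")
print(round(info["f0_mean"].mean(), 1), "Hz average of per-character mean F0")
```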
## Dataset Structure ``` ├── info.csv |── samples/ | ├── {uuid1}_000.wav | ├── {uuid2}_0000.wav | ... └── data/ ├── {uuid1}/ │ └── wav/ │ ├── {uuid1}_000.wav │ ├── {uuid1}_001.wav │ ├── {uuid1}_002.wav │ ... ├── {uuid2}/ │ └── wav/ │ ├── {uuid2}_0000.wav │ ├── {uuid2}_0001.wav │ ├── {uuid2}_0002.wav │ ... ... ``` [info.csv](info.csv): ``` name,num_files,total_duration_min,f0_mean {uuid1},516,45.21,211.2 ... ``` - `name`: Character identifier (a random 8-character alphanumeric string) - `num_files`: Number of audio files - `total_duration_min`: Total duration of audio files (in minutes) - `f0_mean`: Average fundamental frequency (Hz) of the audio files, representing the pitch of the voice - The order is in ascending `name` ## Download Using the huggingface-cli can be convenient. Create a token from the [Hugging Face settings page](https://huggingface.co/settings/tokens) and then log in using the following commands: ```bash pip install -U "huggingface_hub[cli]" huggingface-cli login ``` To download samples (one file per character): ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False --include "samples/*" ``` To download data for a specific identifier `{uuid}`: ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False --include "data/{uuid}/*" ``` To download all data (be mindful of the storage requirements): ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False ``` For more details, refer to the [Hugging Face CLI documentation](https://huggingface.co/docs/huggingface_hub/guides/cli). Alternatively, you can use git-lfs in combination with git clone to download everything. Due to the large size of the dataset, downloading it all at once with the huggingface-cli method often fails. The following commands should work if you have already logged in using huggingface-cli: ```bash git lfs install git clone https://huggingface.co/datasets/litagin/moe-speech ``` ## Dataset Creation ### Curation Rationale In recent years, text-to-speech (TTS) technology has advanced, allowing for the synthesis of speech with controlled, rich emotions. However, most existing Japanese TTS corpora consist of sentences unsuitable for character dialogues, read with deliberately suppressed emotions due to compatibility issues with existing TTS technologies. As a result, there are few emotion speech corpora for dialogues available to the general machine learning user. In this context, the role of a corpus containing Japanese character acting dialogues is significant, and this dataset has been developed to promote research and development in emotion TTS and voice conversion. ### Source Data Recordings from PC games that were purchased through legal means and are personally owned. #### Initial Data Collection and Normalization The audio files were filtered as follows: Filter audios with the following conditions. 1. Mono only (convert to mono if stereo but left channel == right channel) 2. Duration: 2-15 seconds 3. [NISQA](https://github.com/gabrielmittag/NISQA) mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker) 4. Get speech ratio (speech duration / total duration) using [Silero VAD](https://github.com/snakers4/silero-vad), and require >=0.5. 5. 
The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes. ### Annotations - Fundamental frequencies were obtained using the dio function in pyworld, and their averages were calculated. ### Personal and Sensitive Information Due to the nature of the dataset's source, it may contain dialogues with the following characteristics: - Lines with sexual content. - Sexual sounds (though many of these should have been excluded during the filtering process). - (There is interest in creating a dataset focused solely on sexual sounds, but mechanical creation is challenging, and collaborators are sought.) To prevent misuse for enjoyment purposes, the following measures have been taken: - Game names and character names are concealed, no categorization by game is done in folder organization, and random alphanumeric strings are used as character identifiers. - The order of audio files in each character folder is randomized to prevent identification of the sequence of dialogues. ## Considerations for Using the Data ### Discussion of Biases Due to its nature, the dataset may exhibit certain biases, such as: - A tendency for a larger volume of data for female characters. ### Other Known Limitations - The same voice actor may play multiple characters, or the same character may appear across multiple games. In such cases, they are assigned different identifiers. - Although the audio has been filtered for quality, not all files have been individually checked. Therefore, it's possible that some files may include: - Audio processed to sound like it has an echo. - Audio processed to sound as if it's coming through a phone or from behind a wall. **TODO**: Exclude such audio through manual effort or some other means. ## Additional Information ### Licensing Information Please refer to [LICENSE](LICENSE.md) for details. It is essential to read and understand the LICENSE before using this dataset to ensure compliance with its terms. ### Disclaimer - The providers of this dataset are not responsible for any troubles or damages arising from the use of this dataset. - Users must comply with the laws of their country or region when using this dataset. The legal basis for publishing this dataset is as follows: [Copyright Law of Japan (Law No. 
48 of May 6, 1970) Article 30-4](https://www.japaneselawtranslation.go.jp/ja/laws/view/4207#je_ch2sc3sb5at4): (Quotation starts) Article 30-4: It is permissible to exploit a work, in any way and to the extent considered necessary, in any of the following cases, or in any other case in which it is not a person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work; provided, however, that this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation: (i) if it is done for use in testing to develop or put into practical use technology that is connected with the recording of sounds or visuals of a work or other such exploitation; (ii) if it is done for use in data analysis (meaning the extraction, comparison, classification, or other statistical analysis of the constituent language, sounds, images, or other elemental data from a large number of works or a large volume of other such data; the same applies in Article 47-5, paragraph (1), item (ii)); (iii) if it is exploited in the course of computer data processing or otherwise exploited in a way that does not involve what is expressed in the work being perceived by the human senses (for works of computer programming, such exploitation excludes the execution of the work on a computer), beyond as set forth in the preceding two items. (End of quotation) - This dataset is considered to fall under the second category mentioned above. - The dataset is structured to meet the condition of "person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work", as specified in [LICENSE](LICENSE.md), and users are prohibited from using it for enjoyment purposes. - Regarding "this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation", this dataset conceals the source game, voice actor names, and character names, and the order of the audio files is randomized. Additionally, identifiers with the same source are not disclosed, making it impossible to use this dataset for the original purpose of the work (enjoying the game's scenario with voice and images), and thus it is believed that the publication of this dataset does not unfairly harm the interests of the copyright holder. - Even if a user tries to ignore the license and use it for enjoyment, it is practically impossible to recreate the original game's scenario as there are over 400 speakers with numerous audio files for each, making it difficult to identify speakers with the same source. - As stated in [LICENSE](LICENSE.md), any use that "unfairly harms the interests of the copyright holder", as well as actions that hinder the considerations mentioned above (providing information to third parties about the voice actors or original works, or redistributing in a way that makes these associations identifiable), are prohibited. 
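The Annotations section above states that the `f0_mean` column in `info.csv` was obtained with pyworld's `dio` and then averaged. A minimal sketch of that computation follows; averaging only voiced frames (non-zero F0) is an assumption, as the card does not specify how unvoiced frames were handled, and the file path is hypothetical.

```python
# Sketch: per-file mean F0 with pyworld's dio (voiced-frame averaging is an assumption).
import numpy as np
import pyworld
import soundfile as sf

audio, sr = sf.read("data/0a1b2c3d/wav/0a1b2c3d_000.wav")  # hypothetical path
f0, _ = pyworld.dio(audio.astype(np.float64), sr)
voiced = f0[f0 > 0]
print(float(voiced.mean()) if voiced.size else 0.0, "Hz")
```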
# MoeSpeech日本語版README [Japanese Version] ### Changelog 随時更新予定:キャラクターの追加、音声のフィルタリング、既存キャラクターの音声ファイルの新たな追加等 - 2024-02-01: [Moe Speech Similarity Map](https://huggingface.co/spaces/litagin/moe-speech-similarity-map)を追加 - 2024-01-31: Version 0.4(.1) - 手作業による除外と追加: (50 / 473) [**finished_uuids.txt**](finished_uuids.txt) - [このモデル](https://github.com/litagin02/processed-speech-detector)と手作業チェックにより、多数(1.5k)の不適切な音声(電話越し・壁越しであるかのような加工がされた音声等)を除外 - 473キャラクター、395kファイル、622時間、184GB - 2024-01-27: Version 0.3 - ランダムに選んだ識別子について、手作業による音声の除外と音声ファイルの追加を開始、これまでに処理したキャラクターのリストは[**finished_uuids.txt**](finished_uuids.txt)を参照(24 / 473) - 全てのwavファイルを44.1kHz 16bit monoに変換したため、ファイルサイズが大幅に小さくなった - 473キャラクター、395kファイル、623時間、**184GB** - [samples](https://huggingface.co/datasets/litagin/moe-speech/tree/main/samples) フォルダを追加、1キャラクター1ファイル - 2024-01-25: **ライセンスのアップデート** (see [LICENSE](LICENSE.md))、**再配布を禁止**に変更 - 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB) - 2024-01-22: Version 0.1, (449 characters, 363k files, 581 hours, 343GB) ## Dataset Description ### Dataset Summary - 日本人プロ声優による高音質(スタジオ録音)でノイズ・BGM等無しのキャラクター演技セリフ発話音声データセット(男性・女性キャラクター両方含む) - 1音声は2-15秒のモノラル44.1kHz 16bit wavファイル - キャラクターごとにフォルダ分けされている(各キャラクターは匿名化され、`uuid.uuid4().hex[:8]`によるランダムな8文字の英数字による識別子を持つ) - 現在は合計**473**キャラクター、約**39万**の音声ファイル、合計約**623時間**、**184GB**の音声が含まれる - TTS等のタスクに使える質になるよう、機械的に音声の質によりフィルタリング済み - 各uuid間の類似度の地図は [Moe Speech Similarity Map](https://huggingface.co/spaces/litagin/moe-speech-similarity-map) で見ることができます ![stats](stats.png) 詳細は[info.csv](info.csv)を参照。 ### Supported Tasks and Leaderboards - 日本の萌え文化に特化した音声変換や感情豊かなキャラクター音声合成等の音声関連タスクの研究や開発に使えるかもしれない。 - 音声認識により書き起こしを行うことで、セリフ内容を言語モデル等に利用できるかもしれない。 ### Languages データセット内の音声の言語は(極めて少数の英文セリフ等の可能性を除き)日本語のみ。 ## Dataset Structure ``` ├── info.csv |── samples/ | ├── {uuid1}_000.wav | ├── {uuid2}_0000.wav | ... └── data/ ├── {uuid1}/ │ └── wav/ │ ├── {uuid1}_000.wav │ ├── {uuid1}_001.wav │ ├── {uuid1}_002.wav │ ... ├── {uuid2}/ │ └── wav/ │ ├── {uuid2}_0000.wav │ ├── {uuid2}_0001.wav │ ├── {uuid2}_0002.wav │ ... ... ``` [info.csv](info.csv): ``` name,num_files,total_duration_min,f0_mean {uuid1},516,45.21,211.2 ... 
``` - `name`: キャラクター識別子(8文字のランダムな英数字) - `num_files`: 音声ファイル数 - `total_duration_min`: 音声ファイルの合計時間(分) - `f0_mean`: 音声ファイルの平均基本周波数(Hz)の平均、つまり声の高さ - 並び順は`name`の昇順 ## Download huggingface-cliを使うと便利です。 [Huggung Faceの設定ページ](https://huggingface.co/settings/tokens)からトークンを作り、以下でログインします。 ```bash pip install -U "huggingface_hub[cli]" huggingface-cli login ``` サンプル(1キャラクター1ファイル)をダウンロードする場合。 ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False --include "samples/*" ``` 識別名`{uuid}`のデータのみをダウンロードする場合。 ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False --include "data/{uuid}/*" ``` 全てのデータをダウンロードする場合(容量に注意してください)。 ```bash huggingface-cli download litagin/moe-speech --repo-type dataset --local-dir path/to/download/ --local-dir-use-symlinks False ``` 詳細は[Hugging Face CLIのドキュメント](https://huggingface.co/docs/huggingface_hub/guides/cli)を参照。 または、git-lfsとgit cloneを組み合わせて全てをダウンロードすることもできます。データセットのサイズが大きいため、huggingface-cliを使って一度に全てをダウンロードしようとするとしばしば失敗します。以下のコマンドは、すでにhuggingface-cliを使ってログインしている場合には機能するはずです: ```bash git lfs install git clone https://huggingface.co/datasets/litagin/moe-speech ``` ## Dataset Creation ### Curation Rationale 近年、感情音声合成の技術が発展して、豊かな感情を制御しながら音声合成することが可能になってきている。しかしこれまでの日本語音声合成コーパスは既存のTTS技術との相性の問題で、キャラクターのセリフとしては似合わない文章を、意図的に感情を抑えて読み上げるコーパスがほとんどであり、一般の機械学習ユーザーが利用できるセリフ感情音声コーパスは数少ない。このような状況で、日本語キャラクター演技を行っているセリフコーパスが果たす役割は大きいと考えられ、その感情音声合成や音声変換の研究発展の促進を目的として作成した。 ### Source Data 正規の手段で購入して個人的に所持しているPCゲームから録音したもの #### Initial Data Collection and Normalization 以下のように音声ファイルをフィルタリングした。 Filter audios with the following conditions. 1. Mono only (convert to mono if stereo but left channel == right channel) 2. Duration: 2-15 seconds 3. [NISQA](https://github.com/gabrielmittag/NISQA) mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker) 4. Get speech ratio (speech duration / total duration) using [Silero VAD](https://github.com/snakers4/silero-vad), and require >=0.5. 5. The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes. ### Annotations - 基本周波数はpyworldのdioで取得し、その平均を取った。 ### Personal and Sensitive Information データセットの元の都合上、以下のようなセリフが含まれている可能性がある。 - 性的な内容のセリフ - 性的の音声(ただしこれはフィルタリングの過程で多くが除外されているはず) - (性的な音声のみによるデータセットも作りたいが作成を機械的に行うのが困難そうなため分からず、協力者求む) 享受目的での利用を防ぐため、以下のような手段を取っている。 - ゲーム名やキャラクター名を伏せ、ゲームによるフォルダ分け類別はせず、またキャラクター識別子としてランダムな英数字の名前を使用 - 各キャラクターフォルダ内の音声ファイルの並び順はランダムにし、セリフの順番を特定できないようにする ## Considerations for Using the Data ### Discussion of Biases 性質上、以下のようなバイアスがある可能性がある。 - 女性キャラクターの方がデータ量が多い傾向がある ### Other Known Limitations - 同一声優が複数のキャラクターを演じている場合や、同一キャラクターでも複数ゲームにまたがっている場合があり、その場合は別の識別子を持つ。 - 音声品質によりフィルタリングしているが、全てをチェックしたわけではないので、以下のような音声がたまに入っている可能性がある。 - エコーが入ったような加工がされた音声 - 電話越し・壁越しであるかのような加工がされた音声 **TODO**: このような音声を手作業か何らかの手段により除外する。 ## Additional Information ### Licensing Information [LICENSE](LICENSE.md)を参照。このデータセットを利用する場合は、必ずLICENSEを読んで利用条件を確認すること。 English: Please refer to [LICENSE](LICENSE.md). If you use this dataset, be sure to read the LICENSE and check the usage conditions. 
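Below is a minimal sketch of the duration and speech-ratio checks from the filter list above, using Silero VAD via torch.hub. The NISQA quality threshold and the per-character minimums (>= 100 files, >= 15 minutes) are left out, and resampling to 16 kHz for VAD is an assumption.

```python
# Sketch of the duration and speech-ratio filters listed above
# (NISQA scoring and per-character minimums are omitted; 16 kHz for VAD is an assumption).
import torch

model, utils = torch.hub.load("snakers4/silero-vad", "silero_vad")
get_speech_timestamps, _, read_audio, _, _ = utils

def keep(path: str, sr: int = 16000) -> bool:
    wav = read_audio(path, sampling_rate=sr)
    duration = len(wav) / sr
    if not 2.0 <= duration <= 15.0:      # condition 2: duration 2-15 seconds
        return False
    speech = get_speech_timestamps(wav, model, sampling_rate=sr)
    speech_sec = sum(t["end"] - t["start"] for t in speech) / sr
    return speech_sec / duration >= 0.5  # condition 4: speech ratio >= 0.5
```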
### Disclaimer - このデータセットの利用によって発生したいかなるトラブルや損害に対しても、データセットの提供者は責任を負わない。 - このデータセットの利用に際して、自身の国または地域の法律に従うこと。 このデータセットを公開している根拠は、以下の通り。 [著作権法(昭和45年5月6日法律第48号)第三十条の四](https://elaws.e-gov.go.jp/document?lawid=345AC0000000048#Mp-At_30_4): (以下引用) 著作物は、次に掲げる場合その他の当該著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合には、その必要と認められる限度において、いずれの方法によるかを問わず、利用することができる。ただし、当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合は、この限りでない。  一 著作物の録音、録画その他の利用に係る技術の開発又は実用化のための試験の用に供する場合  二 情報解析(多数の著作物その他の大量の情報から、当該情報を構成する言語、音、影像その他の要素に係る情報を抽出し、比較、分類その他の解析を行うことをいう。第四十七条の五第一項第二号において同じ。)の用に供する場合  三 前二号に掲げる場合のほか、著作物の表現についての人の知覚による認識を伴うことなく当該著作物を電子計算機による情報処理の過程における利用その他の利用(プログラムの著作物にあつては、当該著作物の電子計算機における実行を除く。)に供する場合 (引用終わり) - このデータセットは、上記の第二号に該当すると考えられる。 - 「著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合」という条件を満たすように配慮したデータセットの構造となっており、[LICENSE](LICENSE.md)にある通り、利用者は享受目的の利用を禁止されている。 - 「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合」については、本データセットでは参照元や声優名やキャラクター名を伏せている上に音声の順番もシャッフルされており、また同一参照元を持つ識別子も公開していないことから、当該著作物(ゲーム)の使用用途(シナリオを音声と絵をあわせて楽しむ)で利用することは不可能であり、このデータセットの公開によって著作権者の利益を不当に害することはないと考えられる。 - もし仮に利用者がライセンスを無視し享受利用しようとしたとしても、話者数が400人以上で多数のため同じソースを持つ話者識別名の特定は困難であり、また音声ファイルも1話者につき100ファイル以上であるので、元のゲームのシナリオを再現することは事実上不可能である。 - [LICENSE](LICENSE.md)にある通り、「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる」ような利用方法は禁じられており、また以上の配慮を妨げるような行為(声優や元著作物との対応についての第三者への情報提供やそれらが分かるような形での再配布)は禁止されている。
litagin/moe-speech
[ "task_categories:text-to-speech", "task_categories:audio-to-audio", "task_categories:audio-classification", "task_ids:speaker-identification", "multilinguality:monolingual", "size_categories:100K<n<1M", "language:ja", "license:other", "speech", "audio", "japanese", "anime", "voice", "not-for-all-audiences", "region:us" ]
2024-01-19T12:20:19+00:00
{"language": ["ja"], "license": ["other"], "multilinguality": ["monolingual"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-speech", "audio-to-audio", "audio-classification"], "task_ids": ["speaker-identification"], "pretty_name": "MoeSpeech", "license_link": "LICENSE.md", "tags": ["speech", "audio", "japanese", "anime", "voice", "not-for-all-audiences"], "extra_gated_prompt": "You agree to the terms of the [LICENSE](https://huggingface.co/spaces/litagin/moe-speech-license) when using this dataset. Please read it carefully.\n\u3053\u306e\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8\u3092\u4f7f\u7528\u3059\u308b\u969b\u306b\u306f\u3001[LICENSE](https://huggingface.co/spaces/litagin/moe-speech-license)\u306e\u6761\u4ef6\u306b\u540c\u610f\u3057\u305f\u3053\u3068\u306b\u306a\u308a\u307e\u3059\u3002\u5fc5\u305a\u304a\u8aad\u307f\u304f\u3060\u3055\u3044\u3002 ", "extra_gated_fields": {"Please provide a detailed description of the specific tasks or applications for which you intend to use this dataset, DO NOT use general statements such as 'for research', 'learning' or 'personal use\" (\u3053\u306e\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8\u3092\u4f7f\u7528\u3059\u308b\u5177\u4f53\u7684\u306a\u30bf\u30b9\u30af\u3084\u7528\u9014\u3092\u3001\u66d6\u6627\u306a\u8868\u73fe\u3067\u306f\u306a\u304f\uff08\u4f8b\u3048\u3070\u300c\u7814\u7a76\u300d\u300c\u6a5f\u68b0\u5b66\u7fd2\u300d\u300c\u79c1\u7684\u5229\u7528\u300d\u306fNG\uff09\u3001\u8a73\u7d30\u306b\u8a18\u8ff0\u3057\u3066\u304f\u3060\u3055\u3044)": "text", "I have read and agree to the terms of the LICENSE (\u5229\u7528\u898f\u7d04\u3092\u8aad\u3093\u3067\u540c\u610f\u3057\u307e\u3057\u305f)": "checkbox"}, "viewer": false}
2024-02-01T05:53:54+00:00
[]
[ "ja" ]
TAGS #task_categories-text-to-speech #task_categories-audio-to-audio #task_categories-audio-classification #task_ids-speaker-identification #multilinguality-monolingual #size_categories-100K<n<1M #language-Japanese #license-other #speech #audio #japanese #anime #voice #not-for-all-audiences #region-us
# MoeSpeech 日本語はこちら このデータセットは、著作権法第三十条の四の情報解析(機械学習等)の目的でのみ使用が許可されています。それ以外の用途での使用はライセンスにより禁止されています。 This dataset is only permitted for use under Article 30-4 of the Copyright Law of Japan for data analysis (such as machine learning) purposes. Any use for purposes other than those specified is prohibited by the license. ### Changelog Updates planned regularly: additions of characters, audio filtering, new audio files for existing characters, etc. - 2024-02-01: Added Moe Speech Similarity Map - 2024-01-31: Version 0.4.1: Remove some more inappropriate audios (621 hours) - 2024-01-31: Version 0.4 - Manual removing and adding: (50 / 473) finished_uuids.txt - By Using this model and manual check, many (1.5k) inappropriate audios (processed to sound as if it's coming through a phone or from behind a wall) are removed. - 473 characters, 395k files, 622 hours, 184GB - 2024-01-27: Version 0.3 - Start manually removing audio and add new audio files for randomly chosen uuids, see finished_uuids.txt for the list of characters that have been processed so far (24 / 473). - Converted all wav files to 44.1kHz 16bit mono, so the file size is now much smaller. - 473 characters, 395k files, 623 hours, 184GB - Add samples folder, which contains one file per character. - 2024-01-25: UPDATE LICENSE (see LICENSE), changed to prohibit redistribution of the dataset. - 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB) - 2024-01-22: Version 0.1, Initial version (449 characters, 363k files, 581 hours) ## Dataset Description - Point of Contact: litagin ### Dataset Summary - A high-quality dataset of character acting speech audio by Japanese professional voice actors, recorded in a studio, free of noise and background music (both male and female characters). - Each audio file is a monaural 44.1kHz 16bit WAV file of 2-15 seconds. - The dataset is organized into folders for each character. Each character is anonymized and has a random 8-character alphanumeric identifier generated by 'uuid.uuid4().hex[:8]'. - Currently, it includes a total of 473 characters, approximately 395k audio files, and a total of about 623 hours and 184GB of audio. - The audio has been mechanically filtered for quality, suitable for tasks such as TTS. - See Moe Speech Similarity Map for a visualization of the similarity between uuids. !stats See URL for more details. ### Supported Tasks and Leaderboards - May be useful for research and development of voice-related tasks such as voice conversion and character voice synthesis with rich emotions, especially focused on Japanese moe culture. - By performing transcriptions through voice recognition, the content of the dialogues might be usable for language models and similar applications. ### Languages The language of the audio in the dataset is predominantly Japanese, with the possibility of a very small number of English phrases. ## Dataset Structure URL: - 'name': Character identifier (a random 8-character alphanumeric string) - 'num_files': Number of audio files - 'total_duration_min': Total duration of audio files (in minutes) - 'f0_mean': Average fundamental frequency (Hz) of the audio files, representing the pitch of the voice - The order is in ascending 'name' ## Download Using the huggingface-cli can be convenient. 
Create a token from the Hugging Face settings page and then log in using the following commands: To download samples (one file per character): To download data for a specific identifier '{uuid}': To download all data (be mindful of the storage requirements): For more details, refer to the Hugging Face CLI documentation. Alternatively, you can use git-lfs in combination with git clone to download everything. Due to the large size of the dataset, downloading it all at once with the huggingface-cli method often fails. The following commands should work if you have already logged in using huggingface-cli: ## Dataset Creation ### Curation Rationale In recent years, text-to-speech (TTS) technology has advanced, allowing for the synthesis of speech with controlled, rich emotions. However, most existing Japanese TTS corpora consist of sentences unsuitable for character dialogues, read with deliberately suppressed emotions due to compatibility issues with existing TTS technologies. As a result, there are few emotion speech corpora for dialogues available to the general machine learning user. In this context, the role of a corpus containing Japanese character acting dialogues is significant, and this dataset has been developed to promote research and development in emotion TTS and voice conversion. ### Source Data Recordings from PC games that were purchased through legal means and are personally owned. #### Initial Data Collection and Normalization The audio files were filtered as follows: Filter audios with the following conditions. 1. Mono only (convert to mono if stereo but left channel == right channel) 2. Duration: 2-15 seconds 3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker) 4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5. 5. The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes. ### Annotations - Fundamental frequencies were obtained using the dio function in pyworld, and their averages were calculated. ### Personal and Sensitive Information Due to the nature of the dataset's source, it may contain dialogues with the following characteristics: - Lines with sexual content. - Sexual sounds (though many of these should have been excluded during the filtering process). - (There is interest in creating a dataset focused solely on sexual sounds, but mechanical creation is challenging, and collaborators are sought.) To prevent misuse for enjoyment purposes, the following measures have been taken: - Game names and character names are concealed, no categorization by game is done in folder organization, and random alphanumeric strings are used as character identifiers. - The order of audio files in each character folder is randomized to prevent identification of the sequence of dialogues. ## Considerations for Using the Data ### Discussion of Biases Due to its nature, the dataset may exhibit certain biases, such as: - A tendency for a larger volume of data for female characters. ### Other Known Limitations - The same voice actor may play multiple characters, or the same character may appear across multiple games. In such cases, they are assigned different identifiers. - Although the audio has been filtered for quality, not all files have been individually checked. Therefore, it's possible that some files may include: - Audio processed to sound like it has an echo. 
- Audio processed to sound as if it's coming through a phone or from behind a wall. TODO: Exclude such audio through manual effort or some other means. ## Additional Information ### Licensing Information Please refer to LICENSE for details. It is essential to read and understand the LICENSE before using this dataset to ensure compliance with its terms. ### Disclaimer - The providers of this dataset are not responsible for any troubles or damages arising from the use of this dataset. - Users must comply with the laws of their country or region when using this dataset. The legal basis for publishing this dataset is as follows: Copyright Law of Japan (Law No. 48 of May 6, 1970) Article 30-4: (Quotation starts) Article 30-4: It is permissible to exploit a work, in any way and to the extent considered necessary, in any of the following cases, or in any other case in which it is not a person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work; provided, however, that this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation: (i) if it is done for use in testing to develop or put into practical use technology that is connected with the recording of sounds or visuals of a work or other such exploitation; (ii) if it is done for use in data analysis (meaning the extraction, comparison, classification, or other statistical analysis of the constituent language, sounds, images, or other elemental data from a large number of works or a large volume of other such data; the same applies in Article 47-5, paragraph (1), item (ii)); (iii) if it is exploited in the course of computer data processing or otherwise exploited in a way that does not involve what is expressed in the work being perceived by the human senses (for works of computer programming, such exploitation excludes the execution of the work on a computer), beyond as set forth in the preceding two items. (End of quotation) - This dataset is considered to fall under the second category mentioned above. - The dataset is structured to meet the condition of "person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work", as specified in LICENSE, and users are prohibited from using it for enjoyment purposes. - Regarding "this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation", this dataset conceals the source game, voice actor names, and character names, and the order of the audio files is randomized. Additionally, identifiers with the same source are not disclosed, making it impossible to use this dataset for the original purpose of the work (enjoying the game's scenario with voice and images), and thus it is believed that the publication of this dataset does not unfairly harm the interests of the copyright holder. - Even if a user tries to ignore the license and use it for enjoyment, it is practically impossible to recreate the original game's scenario as there are over 400 speakers with numerous audio files for each, making it difficult to identify speakers with the same source. 
- As stated in LICENSE, any use that "unfairly harms the interests of the copyright holder", as well as actions that hinder the considerations mentioned above (providing information to third parties about the voice actors or original works, or redistributing in a way that makes these associations identifiable), are prohibited. # MoeSpeech日本語版README [Japanese Version] ### Changelog 随時更新予定:キャラクターの追加、音声のフィルタリング、既存キャラクターの音声ファイルの新たな追加等 - 2024-02-01: Moe Speech Similarity Mapを追加 - 2024-01-31: Version 0.4(.1) - 手作業による除外と追加: (50 / 473) finished_uuids.txt - このモデルと手作業チェックにより、多数(1.5k)の不適切な音声(電話越し・壁越しであるかのような加工がされた音声等)を除外 - 473キャラクター、395kファイル、622時間、184GB - 2024-01-27: Version 0.3 - ランダムに選んだ識別子について、手作業による音声の除外と音声ファイルの追加を開始、これまでに処理したキャラクターのリストはfinished_uuids.txtを参照(24 / 473) - 全てのwavファイルを44.1kHz 16bit monoに変換したため、ファイルサイズが大幅に小さくなった - 473キャラクター、395kファイル、623時間、184GB - samples フォルダを追加、1キャラクター1ファイル - 2024-01-25: ライセンスのアップデート (see LICENSE)、再配布を禁止に変更 - 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB) - 2024-01-22: Version 0.1, (449 characters, 363k files, 581 hours, 343GB) ## Dataset Description ### Dataset Summary - 日本人プロ声優による高音質(スタジオ録音)でノイズ・BGM等無しのキャラクター演技セリフ発話音声データセット(男性・女性キャラクター両方含む) - 1音声は2-15秒のモノラル44.1kHz 16bit wavファイル - キャラクターごとにフォルダ分けされている(各キャラクターは匿名化され、'uuid.uuid4().hex[:8]'によるランダムな8文字の英数字による識別子を持つ) - 現在は合計473キャラクター、約39万の音声ファイル、合計約623時間、184GBの音声が含まれる - TTS等のタスクに使える質になるよう、機械的に音声の質によりフィルタリング済み - 各uuid間の類似度の地図は Moe Speech Similarity Map で見ることができます !stats 詳細はinfo.csvを参照。 ### Supported Tasks and Leaderboards - 日本の萌え文化に特化した音声変換や感情豊かなキャラクター音声合成等の音声関連タスクの研究や開発に使えるかもしれない。 - 音声認識により書き起こしを行うことで、セリフ内容を言語モデル等に利用できるかもしれない。 ### Languages データセット内の音声の言語は(極めて少数の英文セリフ等の可能性を除き)日本語のみ。 ## Dataset Structure URL: - 'name': キャラクター識別子(8文字のランダムな英数字) - 'num_files': 音声ファイル数 - 'total_duration_min': 音声ファイルの合計時間(分) - 'f0_mean': 音声ファイルの平均基本周波数(Hz)の平均、つまり声の高さ - 並び順は'name'の昇順 ## Download huggingface-cliを使うと便利です。 Huggung Faceの設定ページからトークンを作り、以下でログインします。 サンプル(1キャラクター1ファイル)をダウンロードする場合。 識別名'{uuid}'のデータのみをダウンロードする場合。 全てのデータをダウンロードする場合(容量に注意してください)。 詳細はHugging Face CLIのドキュメントを参照。 または、git-lfsとgit cloneを組み合わせて全てをダウンロードすることもできます。データセットのサイズが大きいため、huggingface-cliを使って一度に全てをダウンロードしようとするとしばしば失敗します。以下のコマンドは、すでにhuggingface-cliを使ってログインしている場合には機能するはずです: ## Dataset Creation ### Curation Rationale 近年、感情音声合成の技術が発展して、豊かな感情を制御しながら音声合成することが可能になってきている。しかしこれまでの日本語音声合成コーパスは既存のTTS技術との相性の問題で、キャラクターのセリフとしては似合わない文章を、意図的に感情を抑えて読み上げるコーパスがほとんどであり、一般の機械学習ユーザーが利用できるセリフ感情音声コーパスは数少ない。このような状況で、日本語キャラクター演技を行っているセリフコーパスが果たす役割は大きいと考えられ、その感情音声合成や音声変換の研究発展の促進を目的として作成した。 ### Source Data 正規の手段で購入して個人的に所持しているPCゲームから録音したもの #### Initial Data Collection and Normalization 以下のように音声ファイルをフィルタリングした。 Filter audios with the following conditions. 1. Mono only (convert to mono if stereo but left channel == right channel) 2. Duration: 2-15 seconds 3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker) 4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5. 5. The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes. 
### Annotations - 基本周波数はpyworldのdioで取得し、その平均を取った。 ### Personal and Sensitive Information データセットの元の都合上、以下のようなセリフが含まれている可能性がある。 - 性的な内容のセリフ - 性的の音声(ただしこれはフィルタリングの過程で多くが除外されているはず) - (性的な音声のみによるデータセットも作りたいが作成を機械的に行うのが困難そうなため分からず、協力者求む) 享受目的での利用を防ぐため、以下のような手段を取っている。 - ゲーム名やキャラクター名を伏せ、ゲームによるフォルダ分け類別はせず、またキャラクター識別子としてランダムな英数字の名前を使用 - 各キャラクターフォルダ内の音声ファイルの並び順はランダムにし、セリフの順番を特定できないようにする ## Considerations for Using the Data ### Discussion of Biases 性質上、以下のようなバイアスがある可能性がある。 - 女性キャラクターの方がデータ量が多い傾向がある ### Other Known Limitations - 同一声優が複数のキャラクターを演じている場合や、同一キャラクターでも複数ゲームにまたがっている場合があり、その場合は別の識別子を持つ。 - 音声品質によりフィルタリングしているが、全てをチェックしたわけではないので、以下のような音声がたまに入っている可能性がある。 - エコーが入ったような加工がされた音声 - 電話越し・壁越しであるかのような加工がされた音声 TODO: このような音声を手作業か何らかの手段により除外する。 ## Additional Information ### Licensing Information LICENSEを参照。このデータセットを利用する場合は、必ずLICENSEを読んで利用条件を確認すること。 English: Please refer to LICENSE. If you use this dataset, be sure to read the LICENSE and check the usage conditions. ### Disclaimer - このデータセットの利用によって発生したいかなるトラブルや損害に対しても、データセットの提供者は責任を負わない。 - このデータセットの利用に際して、自身の国または地域の法律に従うこと。 このデータセットを公開している根拠は、以下の通り。 著作権法(昭和45年5月6日法律第48号)第三十条の四: (以下引用) 著作物は、次に掲げる場合その他の当該著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合には、その必要と認められる限度において、いずれの方法によるかを問わず、利用することができる。ただし、当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合は、この限りでない。  一 著作物の録音、録画その他の利用に係る技術の開発又は実用化のための試験の用に供する場合  二 情報解析(多数の著作物その他の大量の情報から、当該情報を構成する言語、音、影像その他の要素に係る情報を抽出し、比較、分類その他の解析を行うことをいう。第四十七条の五第一項第二号において同じ。)の用に供する場合  三 前二号に掲げる場合のほか、著作物の表現についての人の知覚による認識を伴うことなく当該著作物を電子計算機による情報処理の過程における利用その他の利用(プログラムの著作物にあつては、当該著作物の電子計算機における実行を除く。)に供する場合 (引用終わり) - このデータセットは、上記の第二号に該当すると考えられる。 - 「著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合」という条件を満たすように配慮したデータセットの構造となっており、LICENSEにある通り、利用者は享受目的の利用を禁止されている。 - 「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合」については、本データセットでは参照元や声優名やキャラクター名を伏せている上に音声の順番もシャッフルされており、また同一参照元を持つ識別子も公開していないことから、当該著作物(ゲーム)の使用用途(シナリオを音声と絵をあわせて楽しむ)で利用することは不可能であり、このデータセットの公開によって著作権者の利益を不当に害することはないと考えられる。 - もし仮に利用者がライセンスを無視し享受利用しようとしたとしても、話者数が400人以上で多数のため同じソースを持つ話者識別名の特定は困難であり、また音声ファイルも1話者につき100ファイル以上であるので、元のゲームのシナリオを再現することは事実上不可能である。 - LICENSEにある通り、「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる」ような利用方法は禁じられており、また以上の配慮を妨げるような行為(声優や元著作物との対応についての第三者への情報提供やそれらが分かるような形での再配布)は禁止されている。
[ "# MoeSpeech\n\n日本語はこちら\n\nこのデータセットは、著作権法第三十条の四の情報解析(機械学習等)の目的でのみ使用が許可されています。それ以外の用途での使用はライセンスにより禁止されています。\n\nThis dataset is only permitted for use under Article 30-4 of the Copyright Law of Japan for data analysis (such as machine learning) purposes. Any use for purposes other than those specified is prohibited by the license.", "### Changelog\n\nUpdates planned regularly: additions of characters, audio filtering, new audio files for existing characters, etc.\n\n- 2024-02-01: Added Moe Speech Similarity Map\n- 2024-01-31: Version 0.4.1: Remove some more inappropriate audios (621 hours)\n- 2024-01-31: Version 0.4\n - Manual removing and adding: (50 / 473) finished_uuids.txt\n - By Using this model and manual check, many (1.5k) inappropriate audios (processed to sound as if it's coming through a phone or from behind a wall) are removed.\n - 473 characters, 395k files, 622 hours, 184GB\n- 2024-01-27: Version 0.3\n - Start manually removing audio and add new audio files for randomly chosen uuids, see finished_uuids.txt for the list of characters that have been processed so far (24 / 473).\n - Converted all wav files to 44.1kHz 16bit mono, so the file size is now much smaller.\n - 473 characters, 395k files, 623 hours, 184GB\n - Add samples folder, which contains one file per character.\n- 2024-01-25: UPDATE LICENSE (see LICENSE), changed to prohibit redistribution of the dataset.\n- 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB)\n- 2024-01-22: Version 0.1, Initial version (449 characters, 363k files, 581 hours)", "## Dataset Description\n\n- Point of Contact: litagin", "### Dataset Summary\n\n- A high-quality dataset of character acting speech audio by Japanese professional voice actors, recorded in a studio, free of noise and background music (both male and female characters).\n- Each audio file is a monaural 44.1kHz 16bit WAV file of 2-15 seconds.\n- The dataset is organized into folders for each character. 
Each character is anonymized and has a random 8-character alphanumeric identifier generated by 'uuid.uuid4().hex[:8]'.\n- Currently, it includes a total of 473 characters, approximately 395k audio files, and a total of about 623 hours and 184GB of audio.\n- The audio has been mechanically filtered for quality, suitable for tasks such as TTS.\n- See Moe Speech Similarity Map for a visualization of the similarity between uuids.\n\n!stats\n\nSee URL for more details.", "### Supported Tasks and Leaderboards\n\n- May be useful for research and development of voice-related tasks such as voice conversion and character voice synthesis with rich emotions, especially focused on Japanese moe culture.\n- By performing transcriptions through voice recognition, the content of the dialogues might be usable for language models and similar applications.", "### Languages\n\nThe language of the audio in the dataset is predominantly Japanese, with the possibility of a very small number of English phrases.", "## Dataset Structure\n\n\n\nURL:\n\n\n- 'name': Character identifier (a random 8-character alphanumeric string)\n- 'num_files': Number of audio files\n- 'total_duration_min': Total duration of audio files (in minutes)\n- 'f0_mean': Average fundamental frequency (Hz) of the audio files, representing the pitch of the voice\n- The order is in ascending 'name'", "## Download\n\nUsing the huggingface-cli can be convenient.\nCreate a token from the Hugging Face settings page and then log in using the following commands:\n\n\n\nTo download samples (one file per character):\n\n\nTo download data for a specific identifier '{uuid}':\n\nTo download all data (be mindful of the storage requirements):\n\n\nFor more details, refer to the Hugging Face CLI documentation.\n\nAlternatively, you can use git-lfs in combination with git clone to download everything. Due to the large size of the dataset, downloading it all at once with the huggingface-cli method often fails. The following commands should work if you have already logged in using huggingface-cli:", "## Dataset Creation", "### Curation Rationale\n\nIn recent years, text-to-speech (TTS) technology has advanced, allowing for the synthesis of speech with controlled, rich emotions. However, most existing Japanese TTS corpora consist of sentences unsuitable for character dialogues, read with deliberately suppressed emotions due to compatibility issues with existing TTS technologies. As a result, there are few emotion speech corpora for dialogues available to the general machine learning user. In this context, the role of a corpus containing Japanese character acting dialogues is significant, and this dataset has been developed to promote research and development in emotion TTS and voice conversion.", "### Source Data\n\nRecordings from PC games that were purchased through legal means and are personally owned.", "#### Initial Data Collection and Normalization\n\nThe audio files were filtered as follows:\n\nFilter audios with the following conditions.\n\n1. Mono only (convert to mono if stereo but left channel == right channel)\n2. Duration: 2-15 seconds\n3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker)\n4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5.\n5. 
The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes.", "### Annotations\n\n- Fundamental frequencies were obtained using the dio function in pyworld, and their averages were calculated.", "### Personal and Sensitive Information\n\nDue to the nature of the dataset's source, it may contain dialogues with the following characteristics:\n- Lines with sexual content.\n- Sexual sounds (though many of these should have been excluded during the filtering process).\n- (There is interest in creating a dataset focused solely on sexual sounds, but mechanical creation is challenging, and collaborators are sought.)\n\nTo prevent misuse for enjoyment purposes, the following measures have been taken:\n- Game names and character names are concealed, no categorization by game is done in folder organization, and random alphanumeric strings are used as character identifiers.\n- The order of audio files in each character folder is randomized to prevent identification of the sequence of dialogues.", "## Considerations for Using the Data", "### Discussion of Biases\n\nDue to its nature, the dataset may exhibit certain biases, such as:\n- A tendency for a larger volume of data for female characters.", "### Other Known Limitations\n\n- The same voice actor may play multiple characters, or the same character may appear across multiple games. In such cases, they are assigned different identifiers.\n\n- Although the audio has been filtered for quality, not all files have been individually checked. Therefore, it's possible that some files may include:\n - Audio processed to sound like it has an echo.\n - Audio processed to sound as if it's coming through a phone or from behind a wall.\n\nTODO: Exclude such audio through manual effort or some other means.", "## Additional Information", "### Licensing Information\n\nPlease refer to LICENSE for details. It is essential to read and understand the LICENSE before using this dataset to ensure compliance with its terms.", "### Disclaimer\n\n- The providers of this dataset are not responsible for any troubles or damages arising from the use of this dataset.\n- Users must comply with the laws of their country or region when using this dataset.\n\nThe legal basis for publishing this dataset is as follows:\nCopyright Law of Japan (Law No. 
48 of May 6, 1970) Article 30-4:\n\n(Quotation starts)\n\nArticle 30-4: It is permissible to exploit a work, in any way and to the extent considered necessary, in any of the following cases, or in any other case in which it is not a person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work; provided, however, that this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation:\n\n(i) if it is done for use in testing to develop or put into practical use technology that is connected with the recording of sounds or visuals of a work or other such exploitation;\n\n(ii) if it is done for use in data analysis (meaning the extraction, comparison, classification, or other statistical analysis of the constituent language, sounds, images, or other elemental data from a large number of works or a large volume of other such data; the same applies in Article 47-5, paragraph (1), item (ii));\n\n(iii) if it is exploited in the course of computer data processing or otherwise exploited in a way that does not involve what is expressed in the work being perceived by the human senses (for works of computer programming, such exploitation excludes the execution of the work on a computer), beyond as set forth in the preceding two items.\n\n(End of quotation)\n\n- This dataset is considered to fall under the second category mentioned above.\n- The dataset is structured to meet the condition of \"person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work\", as specified in LICENSE, and users are prohibited from using it for enjoyment purposes.\n- Regarding \"this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation\", this dataset conceals the source game, voice actor names, and character names, and the order of the audio files is randomized. 
Additionally, identifiers with the same source are not disclosed, making it impossible to use this dataset for the original purpose of the work (enjoying the game's scenario with voice and images), and thus it is believed that the publication of this dataset does not unfairly harm the interests of the copyright holder.\n- Even if a user tries to ignore the license and use it for enjoyment, it is practically impossible to recreate the original game's scenario as there are over 400 speakers with numerous audio files for each, making it difficult to identify speakers with the same source.\n- As stated in LICENSE, any use that \"unfairly harms the interests of the copyright holder\", as well as actions that hinder the considerations mentioned above (providing information to third parties about the voice actors or original works, or redistributing in a way that makes these associations identifiable), are prohibited.", "# MoeSpeech日本語版README [Japanese Version]", "### Changelog\n\n随時更新予定:キャラクターの追加、音声のフィルタリング、既存キャラクターの音声ファイルの新たな追加等\n\n- 2024-02-01: Moe Speech Similarity Mapを追加\n- 2024-01-31: Version 0.4(.1)\n - 手作業による除外と追加: (50 / 473) finished_uuids.txt\n - このモデルと手作業チェックにより、多数(1.5k)の不適切な音声(電話越し・壁越しであるかのような加工がされた音声等)を除外\n - 473キャラクター、395kファイル、622時間、184GB\n- 2024-01-27: Version 0.3\n - ランダムに選んだ識別子について、手作業による音声の除外と音声ファイルの追加を開始、これまでに処理したキャラクターのリストはfinished_uuids.txtを参照(24 / 473)\n - 全てのwavファイルを44.1kHz 16bit monoに変換したため、ファイルサイズが大幅に小さくなった\n - 473キャラクター、395kファイル、623時間、184GB\n - samples フォルダを追加、1キャラクター1ファイル\n- 2024-01-25: ライセンスのアップデート (see LICENSE)、再配布を禁止に変更\n- 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB)\n- 2024-01-22: Version 0.1, (449 characters, 363k files, 581 hours, 343GB)", "## Dataset Description", "### Dataset Summary\n\n- 日本人プロ声優による高音質(スタジオ録音)でノイズ・BGM等無しのキャラクター演技セリフ発話音声データセット(男性・女性キャラクター両方含む)\n- 1音声は2-15秒のモノラル44.1kHz 16bit wavファイル\n- キャラクターごとにフォルダ分けされている(各キャラクターは匿名化され、'uuid.uuid4().hex[:8]'によるランダムな8文字の英数字による識別子を持つ)\n- 現在は合計473キャラクター、約39万の音声ファイル、合計約623時間、184GBの音声が含まれる\n- TTS等のタスクに使える質になるよう、機械的に音声の質によりフィルタリング済み\n- 各uuid間の類似度の地図は Moe Speech Similarity Map で見ることができます\n\n!stats\n\n詳細はinfo.csvを参照。", "### Supported Tasks and Leaderboards\n\n- 日本の萌え文化に特化した音声変換や感情豊かなキャラクター音声合成等の音声関連タスクの研究や開発に使えるかもしれない。\n- 音声認識により書き起こしを行うことで、セリフ内容を言語モデル等に利用できるかもしれない。", "### Languages\n\nデータセット内の音声の言語は(極めて少数の英文セリフ等の可能性を除き)日本語のみ。", "## Dataset Structure\n\n\n\nURL:\n\n\n- 'name': キャラクター識別子(8文字のランダムな英数字)\n- 'num_files': 音声ファイル数\n- 'total_duration_min': 音声ファイルの合計時間(分)\n- 'f0_mean': 音声ファイルの平均基本周波数(Hz)の平均、つまり声の高さ\n- 並び順は'name'の昇順", "## Download\n\nhuggingface-cliを使うと便利です。\nHuggung Faceの設定ページからトークンを作り、以下でログインします。\n\n\n\nサンプル(1キャラクター1ファイル)をダウンロードする場合。\n\n\n識別名'{uuid}'のデータのみをダウンロードする場合。\n\n\n全てのデータをダウンロードする場合(容量に注意してください)。\n\n\n詳細はHugging Face CLIのドキュメントを参照。\n\nまたは、git-lfsとgit cloneを組み合わせて全てをダウンロードすることもできます。データセットのサイズが大きいため、huggingface-cliを使って一度に全てをダウンロードしようとするとしばしば失敗します。以下のコマンドは、すでにhuggingface-cliを使ってログインしている場合には機能するはずです:", "## Dataset Creation", "### Curation Rationale\n\n近年、感情音声合成の技術が発展して、豊かな感情を制御しながら音声合成することが可能になってきている。しかしこれまでの日本語音声合成コーパスは既存のTTS技術との相性の問題で、キャラクターのセリフとしては似合わない文章を、意図的に感情を抑えて読み上げるコーパスがほとんどであり、一般の機械学習ユーザーが利用できるセリフ感情音声コーパスは数少ない。このような状況で、日本語キャラクター演技を行っているセリフコーパスが果たす役割は大きいと考えられ、その感情音声合成や音声変換の研究発展の促進を目的として作成した。", "### Source Data\n\n正規の手段で購入して個人的に所持しているPCゲームから録音したもの", "#### Initial Data Collection and Normalization\n\n以下のように音声ファイルをフィルタリングした。\n\nFilter audios with the following conditions.\n\n1. Mono only (convert to mono if stereo but left channel == right channel)\n2. 
Duration: 2-15 seconds\n3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker)\n4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5.\n5. The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes.", "### Annotations\n\n- 基本周波数はpyworldのdioで取得し、その平均を取った。", "### Personal and Sensitive Information\n\nデータセットの元の都合上、以下のようなセリフが含まれている可能性がある。\n- 性的な内容のセリフ\n- 性的の音声(ただしこれはフィルタリングの過程で多くが除外されているはず)\n- (性的な音声のみによるデータセットも作りたいが作成を機械的に行うのが困難そうなため分からず、協力者求む)\n\n享受目的での利用を防ぐため、以下のような手段を取っている。\n- ゲーム名やキャラクター名を伏せ、ゲームによるフォルダ分け類別はせず、またキャラクター識別子としてランダムな英数字の名前を使用\n- 各キャラクターフォルダ内の音声ファイルの並び順はランダムにし、セリフの順番を特定できないようにする", "## Considerations for Using the Data", "### Discussion of Biases\n\n性質上、以下のようなバイアスがある可能性がある。\n- 女性キャラクターの方がデータ量が多い傾向がある", "### Other Known Limitations\n\n- 同一声優が複数のキャラクターを演じている場合や、同一キャラクターでも複数ゲームにまたがっている場合があり、その場合は別の識別子を持つ。\n\n- 音声品質によりフィルタリングしているが、全てをチェックしたわけではないので、以下のような音声がたまに入っている可能性がある。\n - エコーが入ったような加工がされた音声\n - 電話越し・壁越しであるかのような加工がされた音声\n\nTODO: このような音声を手作業か何らかの手段により除外する。", "## Additional Information", "### Licensing Information\n\nLICENSEを参照。このデータセットを利用する場合は、必ずLICENSEを読んで利用条件を確認すること。\n\nEnglish:\n\nPlease refer to LICENSE. If you use this dataset, be sure to read the LICENSE and check the usage conditions.", "### Disclaimer\n\n- このデータセットの利用によって発生したいかなるトラブルや損害に対しても、データセットの提供者は責任を負わない。\n- このデータセットの利用に際して、自身の国または地域の法律に従うこと。\n\n\nこのデータセットを公開している根拠は、以下の通り。\n著作権法(昭和45年5月6日法律第48号)第三十条の四:\n\n(以下引用)\n\n著作物は、次に掲げる場合その他の当該著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合には、その必要と認められる限度において、いずれの方法によるかを問わず、利用することができる。ただし、当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合は、この限りでない。\n\n 一 著作物の録音、録画その他の利用に係る技術の開発又は実用化のための試験の用に供する場合\n\n 二 情報解析(多数の著作物その他の大量の情報から、当該情報を構成する言語、音、影像その他の要素に係る情報を抽出し、比較、分類その他の解析を行うことをいう。第四十七条の五第一項第二号において同じ。)の用に供する場合\n\n 三 前二号に掲げる場合のほか、著作物の表現についての人の知覚による認識を伴うことなく当該著作物を電子計算機による情報処理の過程における利用その他の利用(プログラムの著作物にあつては、当該著作物の電子計算機における実行を除く。)に供する場合\n\n(引用終わり)\n\n- このデータセットは、上記の第二号に該当すると考えられる。\n- 「著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合」という条件を満たすように配慮したデータセットの構造となっており、LICENSEにある通り、利用者は享受目的の利用を禁止されている。\n- 「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合」については、本データセットでは参照元や声優名やキャラクター名を伏せている上に音声の順番もシャッフルされており、また同一参照元を持つ識別子も公開していないことから、当該著作物(ゲーム)の使用用途(シナリオを音声と絵をあわせて楽しむ)で利用することは不可能であり、このデータセットの公開によって著作権者の利益を不当に害することはないと考えられる。\n- もし仮に利用者がライセンスを無視し享受利用しようとしたとしても、話者数が400人以上で多数のため同じソースを持つ話者識別名の特定は困難であり、また音声ファイルも1話者につき100ファイル以上であるので、元のゲームのシナリオを再現することは事実上不可能である。\n- LICENSEにある通り、「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる」ような利用方法は禁じられており、また以上の配慮を妨げるような行為(声優や元著作物との対応についての第三者への情報提供やそれらが分かるような形での再配布)は禁止されている。" ]
[ "TAGS\n#task_categories-text-to-speech #task_categories-audio-to-audio #task_categories-audio-classification #task_ids-speaker-identification #multilinguality-monolingual #size_categories-100K<n<1M #language-Japanese #license-other #speech #audio #japanese #anime #voice #not-for-all-audiences #region-us \n", "# MoeSpeech\n\n日本語はこちら\n\nこのデータセットは、著作権法第三十条の四の情報解析(機械学習等)の目的でのみ使用が許可されています。それ以外の用途での使用はライセンスにより禁止されています。\n\nThis dataset is only permitted for use under Article 30-4 of the Copyright Law of Japan for data analysis (such as machine learning) purposes. Any use for purposes other than those specified is prohibited by the license.", "### Changelog\n\nUpdates planned regularly: additions of characters, audio filtering, new audio files for existing characters, etc.\n\n- 2024-02-01: Added Moe Speech Similarity Map\n- 2024-01-31: Version 0.4.1: Remove some more inappropriate audios (621 hours)\n- 2024-01-31: Version 0.4\n - Manual removing and adding: (50 / 473) finished_uuids.txt\n - By Using this model and manual check, many (1.5k) inappropriate audios (processed to sound as if it's coming through a phone or from behind a wall) are removed.\n - 473 characters, 395k files, 622 hours, 184GB\n- 2024-01-27: Version 0.3\n - Start manually removing audio and add new audio files for randomly chosen uuids, see finished_uuids.txt for the list of characters that have been processed so far (24 / 473).\n - Converted all wav files to 44.1kHz 16bit mono, so the file size is now much smaller.\n - 473 characters, 395k files, 623 hours, 184GB\n - Add samples folder, which contains one file per character.\n- 2024-01-25: UPDATE LICENSE (see LICENSE), changed to prohibit redistribution of the dataset.\n- 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB)\n- 2024-01-22: Version 0.1, Initial version (449 characters, 363k files, 581 hours)", "## Dataset Description\n\n- Point of Contact: litagin", "### Dataset Summary\n\n- A high-quality dataset of character acting speech audio by Japanese professional voice actors, recorded in a studio, free of noise and background music (both male and female characters).\n- Each audio file is a monaural 44.1kHz 16bit WAV file of 2-15 seconds.\n- The dataset is organized into folders for each character. 
Each character is anonymized and has a random 8-character alphanumeric identifier generated by 'uuid.uuid4().hex[:8]'.\n- Currently, it includes a total of 473 characters, approximately 395k audio files, and a total of about 623 hours and 184GB of audio.\n- The audio has been mechanically filtered for quality, suitable for tasks such as TTS.\n- See Moe Speech Similarity Map for a visualization of the similarity between uuids.\n\n!stats\n\nSee URL for more details.", "### Supported Tasks and Leaderboards\n\n- May be useful for research and development of voice-related tasks such as voice conversion and character voice synthesis with rich emotions, especially focused on Japanese moe culture.\n- By performing transcriptions through voice recognition, the content of the dialogues might be usable for language models and similar applications.", "### Languages\n\nThe language of the audio in the dataset is predominantly Japanese, with the possibility of a very small number of English phrases.", "## Dataset Structure\n\n\n\nURL:\n\n\n- 'name': Character identifier (a random 8-character alphanumeric string)\n- 'num_files': Number of audio files\n- 'total_duration_min': Total duration of audio files (in minutes)\n- 'f0_mean': Average fundamental frequency (Hz) of the audio files, representing the pitch of the voice\n- The order is in ascending 'name'", "## Download\n\nUsing the huggingface-cli can be convenient.\nCreate a token from the Hugging Face settings page and then log in using the following commands:\n\n\n\nTo download samples (one file per character):\n\n\nTo download data for a specific identifier '{uuid}':\n\nTo download all data (be mindful of the storage requirements):\n\n\nFor more details, refer to the Hugging Face CLI documentation.\n\nAlternatively, you can use git-lfs in combination with git clone to download everything. Due to the large size of the dataset, downloading it all at once with the huggingface-cli method often fails. The following commands should work if you have already logged in using huggingface-cli:", "## Dataset Creation", "### Curation Rationale\n\nIn recent years, text-to-speech (TTS) technology has advanced, allowing for the synthesis of speech with controlled, rich emotions. However, most existing Japanese TTS corpora consist of sentences unsuitable for character dialogues, read with deliberately suppressed emotions due to compatibility issues with existing TTS technologies. As a result, there are few emotion speech corpora for dialogues available to the general machine learning user. In this context, the role of a corpus containing Japanese character acting dialogues is significant, and this dataset has been developed to promote research and development in emotion TTS and voice conversion.", "### Source Data\n\nRecordings from PC games that were purchased through legal means and are personally owned.", "#### Initial Data Collection and Normalization\n\nThe audio files were filtered as follows:\n\nFilter audios with the following conditions.\n\n1. Mono only (convert to mono if stereo but left channel == right channel)\n2. Duration: 2-15 seconds\n3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker)\n4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5.\n5. 
The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes.", "### Annotations\n\n- Fundamental frequencies were obtained using the dio function in pyworld, and their averages were calculated.", "### Personal and Sensitive Information\n\nDue to the nature of the dataset's source, it may contain dialogues with the following characteristics:\n- Lines with sexual content.\n- Sexual sounds (though many of these should have been excluded during the filtering process).\n- (There is interest in creating a dataset focused solely on sexual sounds, but mechanical creation is challenging, and collaborators are sought.)\n\nTo prevent misuse for enjoyment purposes, the following measures have been taken:\n- Game names and character names are concealed, no categorization by game is done in folder organization, and random alphanumeric strings are used as character identifiers.\n- The order of audio files in each character folder is randomized to prevent identification of the sequence of dialogues.", "## Considerations for Using the Data", "### Discussion of Biases\n\nDue to its nature, the dataset may exhibit certain biases, such as:\n- A tendency for a larger volume of data for female characters.", "### Other Known Limitations\n\n- The same voice actor may play multiple characters, or the same character may appear across multiple games. In such cases, they are assigned different identifiers.\n\n- Although the audio has been filtered for quality, not all files have been individually checked. Therefore, it's possible that some files may include:\n - Audio processed to sound like it has an echo.\n - Audio processed to sound as if it's coming through a phone or from behind a wall.\n\nTODO: Exclude such audio through manual effort or some other means.", "## Additional Information", "### Licensing Information\n\nPlease refer to LICENSE for details. It is essential to read and understand the LICENSE before using this dataset to ensure compliance with its terms.", "### Disclaimer\n\n- The providers of this dataset are not responsible for any troubles or damages arising from the use of this dataset.\n- Users must comply with the laws of their country or region when using this dataset.\n\nThe legal basis for publishing this dataset is as follows:\nCopyright Law of Japan (Law No. 
48 of May 6, 1970) Article 30-4:\n\n(Quotation starts)\n\nArticle 30-4: It is permissible to exploit a work, in any way and to the extent considered necessary, in any of the following cases, or in any other case in which it is not a person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work; provided, however, that this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation:\n\n(i) if it is done for use in testing to develop or put into practical use technology that is connected with the recording of sounds or visuals of a work or other such exploitation;\n\n(ii) if it is done for use in data analysis (meaning the extraction, comparison, classification, or other statistical analysis of the constituent language, sounds, images, or other elemental data from a large number of works or a large volume of other such data; the same applies in Article 47-5, paragraph (1), item (ii));\n\n(iii) if it is exploited in the course of computer data processing or otherwise exploited in a way that does not involve what is expressed in the work being perceived by the human senses (for works of computer programming, such exploitation excludes the execution of the work on a computer), beyond as set forth in the preceding two items.\n\n(End of quotation)\n\n- This dataset is considered to fall under the second category mentioned above.\n- The dataset is structured to meet the condition of \"person's purpose to personally enjoy or cause another person to enjoy the thoughts or sentiments expressed in that work\", as specified in LICENSE, and users are prohibited from using it for enjoyment purposes.\n- Regarding \"this does not apply if the action would unreasonably prejudice the interests of the copyright owner in light of the nature or purpose of the work or the circumstances of its exploitation\", this dataset conceals the source game, voice actor names, and character names, and the order of the audio files is randomized. 
Additionally, identifiers with the same source are not disclosed, making it impossible to use this dataset for the original purpose of the work (enjoying the game's scenario with voice and images), and thus it is believed that the publication of this dataset does not unfairly harm the interests of the copyright holder.\n- Even if a user tries to ignore the license and use it for enjoyment, it is practically impossible to recreate the original game's scenario as there are over 400 speakers with numerous audio files for each, making it difficult to identify speakers with the same source.\n- As stated in LICENSE, any use that \"unfairly harms the interests of the copyright holder\", as well as actions that hinder the considerations mentioned above (providing information to third parties about the voice actors or original works, or redistributing in a way that makes these associations identifiable), are prohibited.", "# MoeSpeech日本語版README [Japanese Version]", "### Changelog\n\n随時更新予定:キャラクターの追加、音声のフィルタリング、既存キャラクターの音声ファイルの新たな追加等\n\n- 2024-02-01: Moe Speech Similarity Mapを追加\n- 2024-01-31: Version 0.4(.1)\n - 手作業による除外と追加: (50 / 473) finished_uuids.txt\n - このモデルと手作業チェックにより、多数(1.5k)の不適切な音声(電話越し・壁越しであるかのような加工がされた音声等)を除外\n - 473キャラクター、395kファイル、622時間、184GB\n- 2024-01-27: Version 0.3\n - ランダムに選んだ識別子について、手作業による音声の除外と音声ファイルの追加を開始、これまでに処理したキャラクターのリストはfinished_uuids.txtを参照(24 / 473)\n - 全てのwavファイルを44.1kHz 16bit monoに変換したため、ファイルサイズが大幅に小さくなった\n - 473キャラクター、395kファイル、623時間、184GB\n - samples フォルダを追加、1キャラクター1ファイル\n- 2024-01-25: ライセンスのアップデート (see LICENSE)、再配布を禁止に変更\n- 2024-01-24: Version 0.2, Added 24 characters (473 characters, 394k files, 622 hours, 368GB)\n- 2024-01-22: Version 0.1, (449 characters, 363k files, 581 hours, 343GB)", "## Dataset Description", "### Dataset Summary\n\n- 日本人プロ声優による高音質(スタジオ録音)でノイズ・BGM等無しのキャラクター演技セリフ発話音声データセット(男性・女性キャラクター両方含む)\n- 1音声は2-15秒のモノラル44.1kHz 16bit wavファイル\n- キャラクターごとにフォルダ分けされている(各キャラクターは匿名化され、'uuid.uuid4().hex[:8]'によるランダムな8文字の英数字による識別子を持つ)\n- 現在は合計473キャラクター、約39万の音声ファイル、合計約623時間、184GBの音声が含まれる\n- TTS等のタスクに使える質になるよう、機械的に音声の質によりフィルタリング済み\n- 各uuid間の類似度の地図は Moe Speech Similarity Map で見ることができます\n\n!stats\n\n詳細はinfo.csvを参照。", "### Supported Tasks and Leaderboards\n\n- 日本の萌え文化に特化した音声変換や感情豊かなキャラクター音声合成等の音声関連タスクの研究や開発に使えるかもしれない。\n- 音声認識により書き起こしを行うことで、セリフ内容を言語モデル等に利用できるかもしれない。", "### Languages\n\nデータセット内の音声の言語は(極めて少数の英文セリフ等の可能性を除き)日本語のみ。", "## Dataset Structure\n\n\n\nURL:\n\n\n- 'name': キャラクター識別子(8文字のランダムな英数字)\n- 'num_files': 音声ファイル数\n- 'total_duration_min': 音声ファイルの合計時間(分)\n- 'f0_mean': 音声ファイルの平均基本周波数(Hz)の平均、つまり声の高さ\n- 並び順は'name'の昇順", "## Download\n\nhuggingface-cliを使うと便利です。\nHuggung Faceの設定ページからトークンを作り、以下でログインします。\n\n\n\nサンプル(1キャラクター1ファイル)をダウンロードする場合。\n\n\n識別名'{uuid}'のデータのみをダウンロードする場合。\n\n\n全てのデータをダウンロードする場合(容量に注意してください)。\n\n\n詳細はHugging Face CLIのドキュメントを参照。\n\nまたは、git-lfsとgit cloneを組み合わせて全てをダウンロードすることもできます。データセットのサイズが大きいため、huggingface-cliを使って一度に全てをダウンロードしようとするとしばしば失敗します。以下のコマンドは、すでにhuggingface-cliを使ってログインしている場合には機能するはずです:", "## Dataset Creation", "### Curation Rationale\n\n近年、感情音声合成の技術が発展して、豊かな感情を制御しながら音声合成することが可能になってきている。しかしこれまでの日本語音声合成コーパスは既存のTTS技術との相性の問題で、キャラクターのセリフとしては似合わない文章を、意図的に感情を抑えて読み上げるコーパスがほとんどであり、一般の機械学習ユーザーが利用できるセリフ感情音声コーパスは数少ない。このような状況で、日本語キャラクター演技を行っているセリフコーパスが果たす役割は大きいと考えられ、その感情音声合成や音声変換の研究発展の促進を目的として作成した。", "### Source Data\n\n正規の手段で購入して個人的に所持しているPCゲームから録音したもの", "#### Initial Data Collection and Normalization\n\n以下のように音声ファイルをフィルタリングした。\n\nFilter audios with the following conditions.\n\n1. Mono only (convert to mono if stereo but left channel == right channel)\n2. 
Duration: 2-15 seconds\n3. NISQA mos quality score: decide threshold from histogram for each character, often Q1 (because mos score largely depends on the speaker)\n4. Get speech ratio (speech duration / total duration) using Silero VAD, and require >=0.5.\n5. The number of resulting audios for each character should be >= 100, and the total duration should be >= 15 minutes.", "### Annotations\n\n- 基本周波数はpyworldのdioで取得し、その平均を取った。", "### Personal and Sensitive Information\n\nデータセットの元の都合上、以下のようなセリフが含まれている可能性がある。\n- 性的な内容のセリフ\n- 性的の音声(ただしこれはフィルタリングの過程で多くが除外されているはず)\n- (性的な音声のみによるデータセットも作りたいが作成を機械的に行うのが困難そうなため分からず、協力者求む)\n\n享受目的での利用を防ぐため、以下のような手段を取っている。\n- ゲーム名やキャラクター名を伏せ、ゲームによるフォルダ分け類別はせず、またキャラクター識別子としてランダムな英数字の名前を使用\n- 各キャラクターフォルダ内の音声ファイルの並び順はランダムにし、セリフの順番を特定できないようにする", "## Considerations for Using the Data", "### Discussion of Biases\n\n性質上、以下のようなバイアスがある可能性がある。\n- 女性キャラクターの方がデータ量が多い傾向がある", "### Other Known Limitations\n\n- 同一声優が複数のキャラクターを演じている場合や、同一キャラクターでも複数ゲームにまたがっている場合があり、その場合は別の識別子を持つ。\n\n- 音声品質によりフィルタリングしているが、全てをチェックしたわけではないので、以下のような音声がたまに入っている可能性がある。\n - エコーが入ったような加工がされた音声\n - 電話越し・壁越しであるかのような加工がされた音声\n\nTODO: このような音声を手作業か何らかの手段により除外する。", "## Additional Information", "### Licensing Information\n\nLICENSEを参照。このデータセットを利用する場合は、必ずLICENSEを読んで利用条件を確認すること。\n\nEnglish:\n\nPlease refer to LICENSE. If you use this dataset, be sure to read the LICENSE and check the usage conditions.", "### Disclaimer\n\n- このデータセットの利用によって発生したいかなるトラブルや損害に対しても、データセットの提供者は責任を負わない。\n- このデータセットの利用に際して、自身の国または地域の法律に従うこと。\n\n\nこのデータセットを公開している根拠は、以下の通り。\n著作権法(昭和45年5月6日法律第48号)第三十条の四:\n\n(以下引用)\n\n著作物は、次に掲げる場合その他の当該著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合には、その必要と認められる限度において、いずれの方法によるかを問わず、利用することができる。ただし、当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合は、この限りでない。\n\n 一 著作物の録音、録画その他の利用に係る技術の開発又は実用化のための試験の用に供する場合\n\n 二 情報解析(多数の著作物その他の大量の情報から、当該情報を構成する言語、音、影像その他の要素に係る情報を抽出し、比較、分類その他の解析を行うことをいう。第四十七条の五第一項第二号において同じ。)の用に供する場合\n\n 三 前二号に掲げる場合のほか、著作物の表現についての人の知覚による認識を伴うことなく当該著作物を電子計算機による情報処理の過程における利用その他の利用(プログラムの著作物にあつては、当該著作物の電子計算機における実行を除く。)に供する場合\n\n(引用終わり)\n\n- このデータセットは、上記の第二号に該当すると考えられる。\n- 「著作物に表現された思想又は感情を自ら享受し又は他人に享受させることを目的としない場合」という条件を満たすように配慮したデータセットの構造となっており、LICENSEにある通り、利用者は享受目的の利用を禁止されている。\n- 「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる場合」については、本データセットでは参照元や声優名やキャラクター名を伏せている上に音声の順番もシャッフルされており、また同一参照元を持つ識別子も公開していないことから、当該著作物(ゲーム)の使用用途(シナリオを音声と絵をあわせて楽しむ)で利用することは不可能であり、このデータセットの公開によって著作権者の利益を不当に害することはないと考えられる。\n- もし仮に利用者がライセンスを無視し享受利用しようとしたとしても、話者数が400人以上で多数のため同じソースを持つ話者識別名の特定は困難であり、また音声ファイルも1話者につき100ファイル以上であるので、元のゲームのシナリオを再現することは事実上不可能である。\n- LICENSEにある通り、「当該著作物の種類及び用途並びに当該利用の態様に照らし著作権者の利益を不当に害することとなる」ような利用方法は禁じられており、また以上の配慮を妨げるような行為(声優や元著作物との対応についての第三者への情報提供やそれらが分かるような形での再配布)は禁止されている。" ]
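The `f0_mean` column of info.csv, described in the Annotations entries above as the average of pyworld's `dio` output, can be approximated with the sketch below. Whether the curators averaged over voiced frames only is not stated; excluding unvoiced frames (where `dio` returns 0) is an assumption here, as is the helper name.

```python
# Illustrative sketch of the f0_mean annotation (average of pyworld dio output).
# Assumption: frames with f0 == 0 (unvoiced) are excluded from the mean.
import numpy as np
import pyworld
import soundfile as sf

def f0_mean(path: str) -> float:
    x, fs = sf.read(path)
    x = x.astype(np.float64)       # pyworld expects float64 input
    f0, _ = pyworld.dio(x, fs)     # frame-wise fundamental frequency in Hz
    voiced = f0[f0 > 0]
    return float(voiced.mean()) if voiced.size else 0.0
```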
76ce75297e5a589e0efbe01c315555995b346abb
# Dataset Card for "esc50_extract_unit" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/esc50_extract_unit
[ "region:us" ]
2024-01-19T12:21:48+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 16081006, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 16081006, "num_examples": 2000}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 24081006, "num_examples": 2000}, {"name": "audiodec_24k_320d", "num_bytes": 51441006, "num_examples": 2000}, {"name": "dac_16k", "num_bytes": 98065006, "num_examples": 2000}, {"name": "dac_24k", "num_bytes": 272177006, "num_examples": 2000}, {"name": "dac_44k", "num_bytes": 83065006, "num_examples": 2000}, {"name": "encodec_24k", "num_bytes": 12097006, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 128817006, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 128817006, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 128305006, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 64305006, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 128305006, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 64305006, "num_examples": 2000}, {"name": "speech_tokenizer_16k", "num_bytes": 32113006, "num_examples": 2000}], "download_size": 180540016, "dataset_size": 1248055090}}
2024-01-19T12:22:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "esc50_extract_unit" More Information needed
[ "# Dataset Card for \"esc50_extract_unit\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"esc50_extract_unit\"\n\nMore Information needed" ]
bd597db7d0370dc1e0143697234ac7c7b83f65af
# Dataset Card for hebrewText Dataset ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contact Information](#contact-information) - [Acknowledgements](#acknowledgements) ## Dataset Description ### Dataset Summary The "Historical Figures and Events Dataset" is a rich compilation of data designed for natural language processing (NLP) tasks, particularly in the realm of historical context understanding and biographical analysis. This dataset is an invaluable resource for developing and evaluating machine learning models in reading comprehension, question answering, and contextual analysis, especially where historical and biographical content is concerned. ### Key Details: - **Purpose**: - The dataset aims to aid in the training of NLP models that can effectively understand and interpret historical contexts and biographical information. - It facilitates tasks like reading comprehension and question answering by providing contextually rich prompts and their corresponding answers. - **Size**: - The dataset is divided into two parts: training and testing sets. - The **training set** is approximately **3.5 MB (3537 KB)**, providing a substantial amount of data for model training. - The **testing set** is around **0.9 MB (916 KB)**, which is ideal for evaluating the performance of the models trained on the aforementioned training set. - **Content**: - The dataset includes diverse entries related to historical figures and events. Each entry is composed of a title (subject), a detailed context, a question (prompt), and a corresponding text answer. - It covers a wide range of historical periods and figures, offering a panoramic view of history through its curated content. - **Format**: - Data is structured to facilitate easy parsing and utilization in various NLP tasks. The format is consistent across the training and testing sets, ensuring uniformity for model development and evaluation. ### Represented Data: - The dataset represents a careful curation of historical information, transformed into a structured format that's particularly useful for AI and machine learning applications in the educational and research domains. - It is designed to challenge and enhance the capabilities of AI models in understanding complex, context-heavy textual information, particularly in the historical and biographical domain. By providing a detailed and contextually rich dataset, this collection serves as a bridge between historical information and advanced NLP applications, fostering the development of AI systems that can comprehend and interact with human history in a meaningful way. 
### Supported Tasks and Leaderboards The "Historical Figures and Events Dataset" is designed to support a variety of natural language processing (NLP) tasks. Its rich content and structured format make it particularly suitable for the following tasks: 1. **Question Answering (QA)**: - **Task Description**: Models are tasked with providing accurate answers to questions based on the context provided. This involves understanding and processing the historical context and extracting or generating relevant answers. - **Use in Leaderboards**: The dataset can be used in competitions and benchmarks focusing on QA accuracy, where models are evaluated based on their ability to correctly answer questions given a context. 2. **Reading Comprehension**: - **Task Description**: This involves models comprehending a passage of text (context) and answering questions related to it. It tests the model's ability to understand, interpret, and recall information from historical narratives and biographies. - **Use in Leaderboards**: Reading comprehension leaderboards may rank models based on their performance in understanding and responding to complex historical texts. 3. **Text Classification**: - **Task Description**: Models can be trained to classify texts based on various historical aspects, such as time periods, geographical locations, or types of events and figures. - **Use in Leaderboards**: Models can be evaluated on their ability to accurately categorize historical texts, with potential leaderboards focusing on classification accuracy. 4. **Named Entity Recognition (NER)**: - **Task Description**: This involves identifying and classifying key historical entities (like names of people, places, events) within the text. - **Use in Leaderboards**: Models can be ranked based on their precision, recall, and F1 score in correctly identifying and classifying named entities. 5. **Sentiment Analysis**: - **Task Description**: Although more challenging in a historical context, models can be trained to analyze the sentiment or tone (positive, negative, neutral) of texts related to historical events or figures. - **Use in Leaderboards**: Sentiment analysis accuracy, especially in distinguishing nuanced tones in historical texts, can be a benchmark for model evaluation. 6. **Language Modeling and Generation**: - **Task Description**: The dataset can be used to train models for generating historically relevant text or completing partial historical narratives. - **Use in Leaderboards**: Effectiveness can be measured in terms of the model's ability to generate coherent, contextually appropriate, and historically accurate text. ### Leaderboard Contributions The dataset's diverse and rich content makes it an excellent candidate for inclusion in NLP leaderboards across these tasks. Its unique focus on historical contexts adds value to the existing benchmarks and competitions, helping advance the state-of-the-art in NLP applications within the domain of history and biographical studies. ### Languages Hebrew ## Dataset Structure ### Data Instances This data instance represents a single entry from the dataset and includes the following components: subject: This field contains the title or subject of the data entry. In this example, it is "אהרון רוזנפלד (איש עסקים)", which translates to "Aharon Rosenfeld (Businessman)". context: This field provides background information or a narrative associated with the subject. 
Here, it briefly discusses the activities of Aharon Rosenfeld's company in the early 20th century, focusing on his business dealings and import of building materials from Belgium. prompt: This is the question or prompt related to the context. In this case, the prompt is asking about Rosenfeld's role in the 1920s, seeking information that can be inferred from the given context. text: The answer to the prompt based on the context. The example response indicates that Rosenfeld was involved in creating connections in Belgium and importing building materials during that period. ### Data Fields - **subject**: "אהרון רוזנפלד (איש עסקים)" - **context**: "בתחילה עסקה חברתו מכל הבא ליד אך עם הגברת הבנייה בשנות ה-20 יצר רוזנפלד קשרים בבלגיה..." - **prompt**: "מה היה תפקידו של אהרון רוזנפלד בשנות ה-20?" - **text**: "ייצר קשרים בבלגיה וייבא חומרי בניין" ### Data Splits 80/20 ## Dataset Creation ### Curation Rationale The dataset, comprising historical figures and events, was created with the intention of supporting natural language processing (NLP) tasks related to understanding and interpreting historical contexts and biographical information. It is designed particularly for training and evaluating machine learning models on tasks like reading comprehension, question answering, and contextual analysis. ### Source Data Scrapping the web. ### Annotations ## Annotations in the Dataset The dataset contains the following types of annotations: 1. **Answer Text (`text`)**: - This annotation includes the exact text that answers the question posed in the `prompt`. - It is a direct or paraphrased extract from the `context`. 2. **Answer Start Index (`answer_start`)**: - This numerical annotation indicates the starting position of the answer text within the `context`. - It helps in locating the answer within the larger body of text. ## Creation of Annotations 1. **Manual Annotation**: - A team of annotators, likely with expertise in historical content and language processing, manually reads the context and formulates relevant questions (`prompts`). - They then identify and extract the precise segment of text that answers each question, along with its starting index in the context. - This process ensures that the annotations are accurate and contextually relevant. 2. **Quality Checks**: - After the initial annotation, a second round of review is conducted by separate annotators to ensure the accuracy and consistency of the annotations. - Discrepancies, if any, are resolved through discussion or reference to authoritative sources. 3. **Automated Tools**: - In some cases, automated tools might be used to assist annotators. - For instance, text processing tools can help in identifying potential answer spans or calculating the `answer_start` indices. - However, manual oversight is essential to ensure the correctness of these automated annotations. 4. **Iterative Refinement**: - The dataset undergoes several iterations of review and refinement. - This iterative process helps in improving the quality of the annotations, making them more reliable for training machine learning models. The annotation process is critical in ensuring the dataset's utility for tasks such as question answering and reading comprehension. Accurate annotations enable the effective training and evaluation of NLP models, particularly in understanding and responding to queries based on historical and biographical contexts. 
## Considerations for Using the Data ### Discussion of Biases ## Discussion of Biases in the Dataset When discussing potential biases in a dataset, especially one that deals with historical figures and events, it's important to consider several factors that could influence the representation and interpretation of data: 1. **Historical Context Bias**: - The dataset focuses on specific historical periods or figures, which may not represent a broader range of historical contexts or perspectives. This can lead to a skewed understanding of history, favoring certain narratives over others. 2. **Cultural and Geographical Bias**: - If the dataset predominantly features figures or events from specific cultures or geographical regions, it might lack diversity. This can result in models that are less accurate or relevant when applied to global or multicultural contexts. 3. **Language Bias**: - Since the dataset is in Hebrew, the language itself might influence the type of content included. Certain nuances, expressions, or historical events might be overrepresented or underrepresented due to linguistic factors. 4. **Annotation Bias**: - The manual process of annotating questions and answers could introduce biases based on the annotators' perspectives, knowledge, or cultural backgrounds. Their interpretations of historical events or figures might affect the dataset's objectivity. 5. **Selection Bias**: - The criteria used for selecting which figures or events to include in the dataset could introduce bias. This might lead to the overrepresentation of certain types of figures (e.g., political leaders, famous personalities) while neglecting others (e.g., lesser-known individuals, marginalized groups). 6. **Temporal Bias**: - Changes in societal norms and historical interpretations over time can lead to temporal bias. How certain events or figures were perceived in the past may differ from current perspectives, affecting the dataset's relevance and accuracy. To mitigate these biases, it's important to: - Ensure diversity in the selection of historical figures and events. - Include multiple perspectives and narratives, especially from underrepresented groups. - Conduct thorough quality checks and revisions of the annotations to minimize subjective biases. - Continuously update and expand the dataset to reflect new research and changing societal understandings of history. ## Additional Information ### Dataset Curators Andrei Ross - Hooking Software. ### Licensing Information When choosing a license for your dataset, it's important to consider how you want others to use it. Here are some common types of licenses you might consider: 1. **MIT License**: - A permissive and open-source license. It allows for almost unrestricted freedom with the dataset, including commercial use, modification, distribution, and private use. - It's a good choice if you want to encourage a wide range of uses, including commercial and academic. 2. **Creative Commons Licenses**: - Creative Commons offers several licenses, ranging from the most permissive (CC BY, which only requires attribution) to more restrictive options (like CC BY-NC-ND, which allows others to download the works but not use them commercially or create derivatives). - These are suitable for datasets where some control over usage is desired. 3. **GNU General Public License (GPL)**: - This is a copyleft license that requires any distributed adaptations or derivatives of the dataset to be available under the same license. 
- It's ideal for ensuring that derivative works remain open and free. 4. **Apache License 2.0**: - Similar to the MIT License but also provides an express grant of patent rights from contributors to users. - It's a good choice for datasets that might be used in patented or patentable technologies. ### Citation Information Ross, A. & Atias, E. (2024). Historical Figures and Events Dataset. Version 1.1. Available at https://huggingface.co/datasets/hooking-dev/hebrewText ### Contact Information [email protected]
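The Annotations discussion above describes two annotation fields, the answer `text` and its `answer_start` index within the `context`. A small sketch of how such an index can be computed or validated is shown below; the English stand-in record is hypothetical (the dataset itself is Hebrew), and a plain `str.find` is only an illustration, not the curators' tooling. As noted above, some answers are paraphrases rather than verbatim spans, in which case no start index can be recovered this way.

```python
# Illustrative computation of an answer_start index for a SQuAD-style record.
# The record below is a hypothetical English stand-in; real entries are in Hebrew.
record = {
    "context": "In the 1920s Rosenfeld built ties in Belgium and imported building materials.",
    "prompt": "What was Rosenfeld's role in the 1920s?",
    "text": "built ties in Belgium and imported building materials",
}

answer_start = record["context"].find(record["text"])
if answer_start == -1:
    print("answer is a paraphrase, not a verbatim span of the context")
else:
    print(f"answer_start = {answer_start}")
```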
hooking-dev/HebrewInstructions
[ "task_categories:text-classification", "multilinguality:monolingual", "size_categories:1K<n<10K", "source_datasets:original", "language:he", "license:apache-2.0", "region:us" ]
2024-01-19T12:33:41+00:00
{"language": ["he"], "license": ["apache-2.0"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["text-classification"]}
2024-01-19T14:53:59+00:00
[]
[ "he" ]
TAGS #task_categories-text-classification #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Hebrew #license-apache-2.0 #region-us
# Dataset Card for hebrewText Dataset ## Table of Contents - Dataset Description - Dataset Summary - Supported Tasks and Leaderboards - Languages - Dataset Structure - Data Instances - Data Fields - Data Splits - Dataset Creation - Curation Rationale - Source Data - Annotations - Personal and Sensitive Information - Considerations for Using the Data - Social Impact of Dataset - Discussion of Biases - Other Known Limitations - Additional Information - Dataset Curators - Licensing Information - Citation Information - Contact Information - Acknowledgements ## Dataset Description ### Dataset Summary The "Historical Figures and Events Dataset" is a rich compilation of data designed for natural language processing (NLP) tasks, particularly in the realm of historical context understanding and biographical analysis. This dataset is an invaluable resource for developing and evaluating machine learning models in reading comprehension, question answering, and contextual analysis, especially where historical and biographical content is concerned. ### Key Details: - Purpose: - The dataset aims to aid in the training of NLP models that can effectively understand and interpret historical contexts and biographical information. - It facilitates tasks like reading comprehension and question answering by providing contextually rich prompts and their corresponding answers. - Size: - The dataset is divided into two parts: training and testing sets. - The training set is approximately 3.5 MB (3537 KB), providing a substantial amount of data for model training. - The testing set is around 0.9 MB (916 KB), which is ideal for evaluating the performance of the models trained on the aforementioned training set. - Content: - The dataset includes diverse entries related to historical figures and events. Each entry is composed of a title (subject), a detailed context, a question (prompt), and a corresponding text answer. - It covers a wide range of historical periods and figures, offering a panoramic view of history through its curated content. - Format: - Data is structured to facilitate easy parsing and utilization in various NLP tasks. The format is consistent across the training and testing sets, ensuring uniformity for model development and evaluation. ### Represented Data: - The dataset represents a careful curation of historical information, transformed into a structured format that's particularly useful for AI and machine learning applications in the educational and research domains. - It is designed to challenge and enhance the capabilities of AI models in understanding complex, context-heavy textual information, particularly in the historical and biographical domain. By providing a detailed and contextually rich dataset, this collection serves as a bridge between historical information and advanced NLP applications, fostering the development of AI systems that can comprehend and interact with human history in a meaningful way. ### Supported Tasks and Leaderboards The "Historical Figures and Events Dataset" is designed to support a variety of natural language processing (NLP) tasks. Its rich content and structured format make it particularly suitable for the following tasks: 1. Question Answering (QA): - Task Description: Models are tasked with providing accurate answers to questions based on the context provided. This involves understanding and processing the historical context and extracting or generating relevant answers. 
- Use in Leaderboards: The dataset can be used in competitions and benchmarks focusing on QA accuracy, where models are evaluated based on their ability to correctly answer questions given a context. 2. Reading Comprehension: - Task Description: This involves models comprehending a passage of text (context) and answering questions related to it. It tests the model's ability to understand, interpret, and recall information from historical narratives and biographies. - Use in Leaderboards: Reading comprehension leaderboards may rank models based on their performance in understanding and responding to complex historical texts. 3. Text Classification: - Task Description: Models can be trained to classify texts based on various historical aspects, such as time periods, geographical locations, or types of events and figures. - Use in Leaderboards: Models can be evaluated on their ability to accurately categorize historical texts, with potential leaderboards focusing on classification accuracy. 4. Named Entity Recognition (NER): - Task Description: This involves identifying and classifying key historical entities (like names of people, places, events) within the text. - Use in Leaderboards: Models can be ranked based on their precision, recall, and F1 score in correctly identifying and classifying named entities. 5. Sentiment Analysis: - Task Description: Although more challenging in a historical context, models can be trained to analyze the sentiment or tone (positive, negative, neutral) of texts related to historical events or figures. - Use in Leaderboards: Sentiment analysis accuracy, especially in distinguishing nuanced tones in historical texts, can be a benchmark for model evaluation. 6. Language Modeling and Generation: - Task Description: The dataset can be used to train models for generating historically relevant text or completing partial historical narratives. - Use in Leaderboards: Effectiveness can be measured in terms of the model's ability to generate coherent, contextually appropriate, and historically accurate text. ### Leaderboard Contributions The dataset's diverse and rich content makes it an excellent candidate for inclusion in NLP leaderboards across these tasks. Its unique focus on historical contexts adds value to the existing benchmarks and competitions, helping advance the state-of-the-art in NLP applications within the domain of history and biographical studies. ### Languages Hebrew ## Dataset Structure ### Data Instances This data instance represents a single entry from the dataset and includes the following components: subject: This field contains the title or subject of the data entry. In this example, it is "אהרון רוזנפלד (איש עסקים)", which translates to "Aharon Rosenfeld (Businessman)". context: This field provides background information or a narrative associated with the subject. Here, it briefly discusses the activities of Aharon Rosenfeld's company in the early 20th century, focusing on his business dealings and import of building materials from Belgium. prompt: This is the question or prompt related to the context. In this case, the prompt is asking about Rosenfeld's role in the 1920s, seeking information that can be inferred from the given context. text: The answer to the prompt based on the context. The example response indicates that Rosenfeld was involved in creating connections in Belgium and importing building materials during that period. 
### Data Fields - subject: "אהרון רוזנפלד (איש עסקים)" - context: "בתחילה עסקה חברתו מכל הבא ליד אך עם הגברת הבנייה בשנות ה-20 יצר רוזנפלד קשרים בבלגיה..." - prompt: "מה היה תפקידו של אהרון רוזנפלד בשנות ה-20?" - text: "ייצר קשרים בבלגיה וייבא חומרי בניין" ### Data Splits 80/20 ## Dataset Creation ### Curation Rationale The dataset, comprising historical figures and events, was created with the intention of supporting natural language processing (NLP) tasks related to understanding and interpreting historical contexts and biographical information. It is designed particularly for training and evaluating machine learning models on tasks like reading comprehension, question answering, and contextual analysis. ### Source Data Scrapping the web. ### Annotations ## Annotations in the Dataset The dataset contains the following types of annotations: 1. Answer Text ('text'): - This annotation includes the exact text that answers the question posed in the 'prompt'. - It is a direct or paraphrased extract from the 'context'. 2. Answer Start Index ('answer_start'): - This numerical annotation indicates the starting position of the answer text within the 'context'. - It helps in locating the answer within the larger body of text. ## Creation of Annotations 1. Manual Annotation: - A team of annotators, likely with expertise in historical content and language processing, manually reads the context and formulates relevant questions ('prompts'). - They then identify and extract the precise segment of text that answers each question, along with its starting index in the context. - This process ensures that the annotations are accurate and contextually relevant. 2. Quality Checks: - After the initial annotation, a second round of review is conducted by separate annotators to ensure the accuracy and consistency of the annotations. - Discrepancies, if any, are resolved through discussion or reference to authoritative sources. 3. Automated Tools: - In some cases, automated tools might be used to assist annotators. - For instance, text processing tools can help in identifying potential answer spans or calculating the 'answer_start' indices. - However, manual oversight is essential to ensure the correctness of these automated annotations. 4. Iterative Refinement: - The dataset undergoes several iterations of review and refinement. - This iterative process helps in improving the quality of the annotations, making them more reliable for training machine learning models. The annotation process is critical in ensuring the dataset's utility for tasks such as question answering and reading comprehension. Accurate annotations enable the effective training and evaluation of NLP models, particularly in understanding and responding to queries based on historical and biographical contexts. ## Considerations for Using the Data ### Discussion of Biases ## Discussion of Biases in the Dataset When discussing potential biases in a dataset, especially one that deals with historical figures and events, it's important to consider several factors that could influence the representation and interpretation of data: 1. Historical Context Bias: - The dataset focuses on specific historical periods or figures, which may not represent a broader range of historical contexts or perspectives. This can lead to a skewed understanding of history, favoring certain narratives over others. 2. 
Cultural and Geographical Bias: - If the dataset predominantly features figures or events from specific cultures or geographical regions, it might lack diversity. This can result in models that are less accurate or relevant when applied to global or multicultural contexts. 3. Language Bias: - Since the dataset is in Hebrew, the language itself might influence the type of content included. Certain nuances, expressions, or historical events might be overrepresented or underrepresented due to linguistic factors. 4. Annotation Bias: - The manual process of annotating questions and answers could introduce biases based on the annotators' perspectives, knowledge, or cultural backgrounds. Their interpretations of historical events or figures might affect the dataset's objectivity. 5. Selection Bias: - The criteria used for selecting which figures or events to include in the dataset could introduce bias. This might lead to the overrepresentation of certain types of figures (e.g., political leaders, famous personalities) while neglecting others (e.g., lesser-known individuals, marginalized groups). 6. Temporal Bias: - Changes in societal norms and historical interpretations over time can lead to temporal bias. How certain events or figures were perceived in the past may differ from current perspectives, affecting the dataset's relevance and accuracy. To mitigate these biases, it's important to: - Ensure diversity in the selection of historical figures and events. - Include multiple perspectives and narratives, especially from underrepresented groups. - Conduct thorough quality checks and revisions of the annotations to minimize subjective biases. - Continuously update and expand the dataset to reflect new research and changing societal understandings of history. ## Additional Information ### Dataset Curators Andrei Ross - Hooking Software. ### Licensing Information When choosing a license for your dataset, it's important to consider how you want others to use it. Here are some common types of licenses you might consider: 1. MIT License: - A permissive and open-source license. It allows for almost unrestricted freedom with the dataset, including commercial use, modification, distribution, and private use. - It's a good choice if you want to encourage a wide range of uses, including commercial and academic. 2. Creative Commons Licenses: - Creative Commons offers several licenses, ranging from the most permissive (CC BY, which only requires attribution) to more restrictive options (like CC BY-NC-ND, which allows others to download the works but not use them commercially or create derivatives). - These are suitable for datasets where some control over usage is desired. 3. GNU General Public License (GPL): - This is a copyleft license that requires any distributed adaptations or derivatives of the dataset to be available under the same license. - It's ideal for ensuring that derivative works remain open and free. 4. Apache License 2.0: - Similar to the MIT License but also provides an express grant of patent rights from contributors to users. - It's a good choice for datasets that might be used in patented or patentable technologies. Ross, A. & Atias, E. (2024). Historical Figures and Events Dataset. Version 1.1. Available at URL ### Contact Information it@URL
[ "# Dataset Card for hebrewText Dataset", "## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contact Information\n- Acknowledgements", "## Dataset Description", "### Dataset Summary\n\nThe \"Historical Figures and Events Dataset\" is a rich compilation of data designed for natural language processing (NLP) tasks, particularly in the realm of historical context understanding and biographical analysis. This dataset is an invaluable resource for developing and evaluating machine learning models in reading comprehension, question answering, and contextual analysis, especially where historical and biographical content is concerned.", "### Key Details:\n\n- Purpose: \n - The dataset aims to aid in the training of NLP models that can effectively understand and interpret historical contexts and biographical information.\n - It facilitates tasks like reading comprehension and question answering by providing contextually rich prompts and their corresponding answers.\n\n- Size:\n - The dataset is divided into two parts: training and testing sets.\n - The training set is approximately 3.5 MB (3537 KB), providing a substantial amount of data for model training.\n - The testing set is around 0.9 MB (916 KB), which is ideal for evaluating the performance of the models trained on the aforementioned training set.\n\n- Content:\n - The dataset includes diverse entries related to historical figures and events. Each entry is composed of a title (subject), a detailed context, a question (prompt), and a corresponding text answer.\n - It covers a wide range of historical periods and figures, offering a panoramic view of history through its curated content.\n\n- Format:\n - Data is structured to facilitate easy parsing and utilization in various NLP tasks. The format is consistent across the training and testing sets, ensuring uniformity for model development and evaluation.", "### Represented Data:\n\n- The dataset represents a careful curation of historical information, transformed into a structured format that's particularly useful for AI and machine learning applications in the educational and research domains.\n\n- It is designed to challenge and enhance the capabilities of AI models in understanding complex, context-heavy textual information, particularly in the historical and biographical domain.\n\nBy providing a detailed and contextually rich dataset, this collection serves as a bridge between historical information and advanced NLP applications, fostering the development of AI systems that can comprehend and interact with human history in a meaningful way.", "### Supported Tasks and Leaderboards\n\nThe \"Historical Figures and Events Dataset\" is designed to support a variety of natural language processing (NLP) tasks. Its rich content and structured format make it particularly suitable for the following tasks:\n\n1. Question Answering (QA):\n - Task Description: Models are tasked with providing accurate answers to questions based on the context provided. 
This involves understanding and processing the historical context and extracting or generating relevant answers.\n - Use in Leaderboards: The dataset can be used in competitions and benchmarks focusing on QA accuracy, where models are evaluated based on their ability to correctly answer questions given a context.\n\n2. Reading Comprehension:\n - Task Description: This involves models comprehending a passage of text (context) and answering questions related to it. It tests the model's ability to understand, interpret, and recall information from historical narratives and biographies.\n - Use in Leaderboards: Reading comprehension leaderboards may rank models based on their performance in understanding and responding to complex historical texts.\n\n3. Text Classification:\n - Task Description: Models can be trained to classify texts based on various historical aspects, such as time periods, geographical locations, or types of events and figures.\n - Use in Leaderboards: Models can be evaluated on their ability to accurately categorize historical texts, with potential leaderboards focusing on classification accuracy.\n\n4. Named Entity Recognition (NER):\n - Task Description: This involves identifying and classifying key historical entities (like names of people, places, events) within the text.\n - Use in Leaderboards: Models can be ranked based on their precision, recall, and F1 score in correctly identifying and classifying named entities.\n\n5. Sentiment Analysis:\n - Task Description: Although more challenging in a historical context, models can be trained to analyze the sentiment or tone (positive, negative, neutral) of texts related to historical events or figures.\n - Use in Leaderboards: Sentiment analysis accuracy, especially in distinguishing nuanced tones in historical texts, can be a benchmark for model evaluation.\n\n6. Language Modeling and Generation:\n - Task Description: The dataset can be used to train models for generating historically relevant text or completing partial historical narratives.\n - Use in Leaderboards: Effectiveness can be measured in terms of the model's ability to generate coherent, contextually appropriate, and historically accurate text.", "### Leaderboard Contributions\n\nThe dataset's diverse and rich content makes it an excellent candidate for inclusion in NLP leaderboards across these tasks. Its unique focus on historical contexts adds value to the existing benchmarks and competitions, helping advance the state-of-the-art in NLP applications within the domain of history and biographical studies.", "### Languages\n\nHebrew", "## Dataset Structure", "### Data Instances\n\nThis data instance represents a single entry from the dataset and includes the following components:\n\nsubject: This field contains the title or subject of the data entry. In this example, it is \"אהרון רוזנפלד (איש עסקים)\", which translates to \"Aharon Rosenfeld (Businessman)\".\n\ncontext: This field provides background information or a narrative associated with the subject. Here, it briefly discusses the activities of Aharon Rosenfeld's company in the early 20th century, focusing on his business dealings and import of building materials from Belgium.\n\nprompt: This is the question or prompt related to the context. In this case, the prompt is asking about Rosenfeld's role in the 1920s, seeking information that can be inferred from the given context.\n\ntext: The answer to the prompt based on the context. 
The example response indicates that Rosenfeld was involved in creating connections in Belgium and importing building materials during that period.", "### Data Fields\n\n- subject: \"אהרון רוזנפלד (איש עסקים)\"\n- context: \"בתחילה עסקה חברתו מכל הבא ליד אך עם הגברת הבנייה בשנות ה-20 יצר רוזנפלד קשרים בבלגיה...\"\n- prompt: \"מה היה תפקידו של אהרון רוזנפלד בשנות ה-20?\"\n- text: \"ייצר קשרים בבלגיה וייבא חומרי בניין\"", "### Data Splits\n\n80/20", "## Dataset Creation", "### Curation Rationale\n\nThe dataset, comprising historical figures and events, was created with the intention of supporting natural language processing (NLP) tasks related to understanding and interpreting historical contexts and biographical information. It is designed particularly for training and evaluating machine learning models on tasks like reading comprehension, question answering, and contextual analysis.", "### Source Data\n\nScrapping the web.", "### Annotations", "## Annotations in the Dataset\n\nThe dataset contains the following types of annotations:\n\n1. Answer Text ('text'): \n - This annotation includes the exact text that answers the question posed in the 'prompt'. \n - It is a direct or paraphrased extract from the 'context'.\n\n2. Answer Start Index ('answer_start'): \n - This numerical annotation indicates the starting position of the answer text within the 'context'. \n - It helps in locating the answer within the larger body of text.", "## Creation of Annotations\n\n1. Manual Annotation:\n - A team of annotators, likely with expertise in historical content and language processing, manually reads the context and formulates relevant questions ('prompts').\n - They then identify and extract the precise segment of text that answers each question, along with its starting index in the context. \n - This process ensures that the annotations are accurate and contextually relevant.\n\n2. Quality Checks:\n - After the initial annotation, a second round of review is conducted by separate annotators to ensure the accuracy and consistency of the annotations.\n - Discrepancies, if any, are resolved through discussion or reference to authoritative sources.\n\n3. Automated Tools:\n - In some cases, automated tools might be used to assist annotators.\n - For instance, text processing tools can help in identifying potential answer spans or calculating the 'answer_start' indices.\n - However, manual oversight is essential to ensure the correctness of these automated annotations.\n\n4. Iterative Refinement:\n - The dataset undergoes several iterations of review and refinement.\n - This iterative process helps in improving the quality of the annotations, making them more reliable for training machine learning models.\n\nThe annotation process is critical in ensuring the dataset's utility for tasks such as question answering and reading comprehension. Accurate annotations enable the effective training and evaluation of NLP models, particularly in understanding and responding to queries based on historical and biographical contexts.", "## Considerations for Using the Data", "### Discussion of Biases", "## Discussion of Biases in the Dataset\n\nWhen discussing potential biases in a dataset, especially one that deals with historical figures and events, it's important to consider several factors that could influence the representation and interpretation of data:\n\n1. 
Historical Context Bias:\n - The dataset focuses on specific historical periods or figures, which may not represent a broader range of historical contexts or perspectives. This can lead to a skewed understanding of history, favoring certain narratives over others.\n\n2. Cultural and Geographical Bias:\n - If the dataset predominantly features figures or events from specific cultures or geographical regions, it might lack diversity. This can result in models that are less accurate or relevant when applied to global or multicultural contexts.\n\n3. Language Bias:\n - Since the dataset is in Hebrew, the language itself might influence the type of content included. Certain nuances, expressions, or historical events might be overrepresented or underrepresented due to linguistic factors.\n\n4. Annotation Bias:\n - The manual process of annotating questions and answers could introduce biases based on the annotators' perspectives, knowledge, or cultural backgrounds. Their interpretations of historical events or figures might affect the dataset's objectivity.\n\n5. Selection Bias:\n - The criteria used for selecting which figures or events to include in the dataset could introduce bias. This might lead to the overrepresentation of certain types of figures (e.g., political leaders, famous personalities) while neglecting others (e.g., lesser-known individuals, marginalized groups).\n\n6. Temporal Bias:\n - Changes in societal norms and historical interpretations over time can lead to temporal bias. How certain events or figures were perceived in the past may differ from current perspectives, affecting the dataset's relevance and accuracy.\n\nTo mitigate these biases, it's important to:\n\n- Ensure diversity in the selection of historical figures and events.\n- Include multiple perspectives and narratives, especially from underrepresented groups.\n- Conduct thorough quality checks and revisions of the annotations to minimize subjective biases.\n- Continuously update and expand the dataset to reflect new research and changing societal understandings of history.", "## Additional Information", "### Dataset Curators\n\nAndrei Ross - Hooking Software.", "### Licensing Information\n\nWhen choosing a license for your dataset, it's important to consider how you want others to use it. Here are some common types of licenses you might consider:\n\n1. MIT License:\n - A permissive and open-source license. It allows for almost unrestricted freedom with the dataset, including commercial use, modification, distribution, and private use. \n - It's a good choice if you want to encourage a wide range of uses, including commercial and academic.\n\n2. Creative Commons Licenses:\n - Creative Commons offers several licenses, ranging from the most permissive (CC BY, which only requires attribution) to more restrictive options (like CC BY-NC-ND, which allows others to download the works but not use them commercially or create derivatives).\n - These are suitable for datasets where some control over usage is desired.\n\n3. GNU General Public License (GPL):\n - This is a copyleft license that requires any distributed adaptations or derivatives of the dataset to be available under the same license. \n - It's ideal for ensuring that derivative works remain open and free.\n\n4. Apache License 2.0:\n - Similar to the MIT License but also provides an express grant of patent rights from contributors to users.\n - It's a good choice for datasets that might be used in patented or patentable technologies.\n\n\n\n\nRoss, A. & Atias, E. 
(2024). Historical Figures and Events Dataset. Version 1.1. Available at URL", "### Contact Information\n\nit@URL" ]
[ "TAGS\n#task_categories-text-classification #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Hebrew #license-apache-2.0 #region-us \n", "# Dataset Card for hebrewText Dataset", "## Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contact Information\n- Acknowledgements", "## Dataset Description", "### Dataset Summary\n\nThe \"Historical Figures and Events Dataset\" is a rich compilation of data designed for natural language processing (NLP) tasks, particularly in the realm of historical context understanding and biographical analysis. This dataset is an invaluable resource for developing and evaluating machine learning models in reading comprehension, question answering, and contextual analysis, especially where historical and biographical content is concerned.", "### Key Details:\n\n- Purpose: \n - The dataset aims to aid in the training of NLP models that can effectively understand and interpret historical contexts and biographical information.\n - It facilitates tasks like reading comprehension and question answering by providing contextually rich prompts and their corresponding answers.\n\n- Size:\n - The dataset is divided into two parts: training and testing sets.\n - The training set is approximately 3.5 MB (3537 KB), providing a substantial amount of data for model training.\n - The testing set is around 0.9 MB (916 KB), which is ideal for evaluating the performance of the models trained on the aforementioned training set.\n\n- Content:\n - The dataset includes diverse entries related to historical figures and events. Each entry is composed of a title (subject), a detailed context, a question (prompt), and a corresponding text answer.\n - It covers a wide range of historical periods and figures, offering a panoramic view of history through its curated content.\n\n- Format:\n - Data is structured to facilitate easy parsing and utilization in various NLP tasks. The format is consistent across the training and testing sets, ensuring uniformity for model development and evaluation.", "### Represented Data:\n\n- The dataset represents a careful curation of historical information, transformed into a structured format that's particularly useful for AI and machine learning applications in the educational and research domains.\n\n- It is designed to challenge and enhance the capabilities of AI models in understanding complex, context-heavy textual information, particularly in the historical and biographical domain.\n\nBy providing a detailed and contextually rich dataset, this collection serves as a bridge between historical information and advanced NLP applications, fostering the development of AI systems that can comprehend and interact with human history in a meaningful way.", "### Supported Tasks and Leaderboards\n\nThe \"Historical Figures and Events Dataset\" is designed to support a variety of natural language processing (NLP) tasks. Its rich content and structured format make it particularly suitable for the following tasks:\n\n1. 
Question Answering (QA):\n - Task Description: Models are tasked with providing accurate answers to questions based on the context provided. This involves understanding and processing the historical context and extracting or generating relevant answers.\n - Use in Leaderboards: The dataset can be used in competitions and benchmarks focusing on QA accuracy, where models are evaluated based on their ability to correctly answer questions given a context.\n\n2. Reading Comprehension:\n - Task Description: This involves models comprehending a passage of text (context) and answering questions related to it. It tests the model's ability to understand, interpret, and recall information from historical narratives and biographies.\n - Use in Leaderboards: Reading comprehension leaderboards may rank models based on their performance in understanding and responding to complex historical texts.\n\n3. Text Classification:\n - Task Description: Models can be trained to classify texts based on various historical aspects, such as time periods, geographical locations, or types of events and figures.\n - Use in Leaderboards: Models can be evaluated on their ability to accurately categorize historical texts, with potential leaderboards focusing on classification accuracy.\n\n4. Named Entity Recognition (NER):\n - Task Description: This involves identifying and classifying key historical entities (like names of people, places, events) within the text.\n - Use in Leaderboards: Models can be ranked based on their precision, recall, and F1 score in correctly identifying and classifying named entities.\n\n5. Sentiment Analysis:\n - Task Description: Although more challenging in a historical context, models can be trained to analyze the sentiment or tone (positive, negative, neutral) of texts related to historical events or figures.\n - Use in Leaderboards: Sentiment analysis accuracy, especially in distinguishing nuanced tones in historical texts, can be a benchmark for model evaluation.\n\n6. Language Modeling and Generation:\n - Task Description: The dataset can be used to train models for generating historically relevant text or completing partial historical narratives.\n - Use in Leaderboards: Effectiveness can be measured in terms of the model's ability to generate coherent, contextually appropriate, and historically accurate text.", "### Leaderboard Contributions\n\nThe dataset's diverse and rich content makes it an excellent candidate for inclusion in NLP leaderboards across these tasks. Its unique focus on historical contexts adds value to the existing benchmarks and competitions, helping advance the state-of-the-art in NLP applications within the domain of history and biographical studies.", "### Languages\n\nHebrew", "## Dataset Structure", "### Data Instances\n\nThis data instance represents a single entry from the dataset and includes the following components:\n\nsubject: This field contains the title or subject of the data entry. In this example, it is \"אהרון רוזנפלד (איש עסקים)\", which translates to \"Aharon Rosenfeld (Businessman)\".\n\ncontext: This field provides background information or a narrative associated with the subject. Here, it briefly discusses the activities of Aharon Rosenfeld's company in the early 20th century, focusing on his business dealings and import of building materials from Belgium.\n\nprompt: This is the question or prompt related to the context. 
In this case, the prompt is asking about Rosenfeld's role in the 1920s, seeking information that can be inferred from the given context.\n\ntext: The answer to the prompt based on the context. The example response indicates that Rosenfeld was involved in creating connections in Belgium and importing building materials during that period.", "### Data Fields\n\n- subject: \"אהרון רוזנפלד (איש עסקים)\"\n- context: \"בתחילה עסקה חברתו מכל הבא ליד אך עם הגברת הבנייה בשנות ה-20 יצר רוזנפלד קשרים בבלגיה...\"\n- prompt: \"מה היה תפקידו של אהרון רוזנפלד בשנות ה-20?\"\n- text: \"ייצר קשרים בבלגיה וייבא חומרי בניין\"", "### Data Splits\n\n80/20", "## Dataset Creation", "### Curation Rationale\n\nThe dataset, comprising historical figures and events, was created with the intention of supporting natural language processing (NLP) tasks related to understanding and interpreting historical contexts and biographical information. It is designed particularly for training and evaluating machine learning models on tasks like reading comprehension, question answering, and contextual analysis.", "### Source Data\n\nScrapping the web.", "### Annotations", "## Annotations in the Dataset\n\nThe dataset contains the following types of annotations:\n\n1. Answer Text ('text'): \n - This annotation includes the exact text that answers the question posed in the 'prompt'. \n - It is a direct or paraphrased extract from the 'context'.\n\n2. Answer Start Index ('answer_start'): \n - This numerical annotation indicates the starting position of the answer text within the 'context'. \n - It helps in locating the answer within the larger body of text.", "## Creation of Annotations\n\n1. Manual Annotation:\n - A team of annotators, likely with expertise in historical content and language processing, manually reads the context and formulates relevant questions ('prompts').\n - They then identify and extract the precise segment of text that answers each question, along with its starting index in the context. \n - This process ensures that the annotations are accurate and contextually relevant.\n\n2. Quality Checks:\n - After the initial annotation, a second round of review is conducted by separate annotators to ensure the accuracy and consistency of the annotations.\n - Discrepancies, if any, are resolved through discussion or reference to authoritative sources.\n\n3. Automated Tools:\n - In some cases, automated tools might be used to assist annotators.\n - For instance, text processing tools can help in identifying potential answer spans or calculating the 'answer_start' indices.\n - However, manual oversight is essential to ensure the correctness of these automated annotations.\n\n4. Iterative Refinement:\n - The dataset undergoes several iterations of review and refinement.\n - This iterative process helps in improving the quality of the annotations, making them more reliable for training machine learning models.\n\nThe annotation process is critical in ensuring the dataset's utility for tasks such as question answering and reading comprehension. 
Accurate annotations enable the effective training and evaluation of NLP models, particularly in understanding and responding to queries based on historical and biographical contexts.", "## Considerations for Using the Data", "### Discussion of Biases", "## Discussion of Biases in the Dataset\n\nWhen discussing potential biases in a dataset, especially one that deals with historical figures and events, it's important to consider several factors that could influence the representation and interpretation of data:\n\n1. Historical Context Bias:\n - The dataset focuses on specific historical periods or figures, which may not represent a broader range of historical contexts or perspectives. This can lead to a skewed understanding of history, favoring certain narratives over others.\n\n2. Cultural and Geographical Bias:\n - If the dataset predominantly features figures or events from specific cultures or geographical regions, it might lack diversity. This can result in models that are less accurate or relevant when applied to global or multicultural contexts.\n\n3. Language Bias:\n - Since the dataset is in Hebrew, the language itself might influence the type of content included. Certain nuances, expressions, or historical events might be overrepresented or underrepresented due to linguistic factors.\n\n4. Annotation Bias:\n - The manual process of annotating questions and answers could introduce biases based on the annotators' perspectives, knowledge, or cultural backgrounds. Their interpretations of historical events or figures might affect the dataset's objectivity.\n\n5. Selection Bias:\n - The criteria used for selecting which figures or events to include in the dataset could introduce bias. This might lead to the overrepresentation of certain types of figures (e.g., political leaders, famous personalities) while neglecting others (e.g., lesser-known individuals, marginalized groups).\n\n6. Temporal Bias:\n - Changes in societal norms and historical interpretations over time can lead to temporal bias. How certain events or figures were perceived in the past may differ from current perspectives, affecting the dataset's relevance and accuracy.\n\nTo mitigate these biases, it's important to:\n\n- Ensure diversity in the selection of historical figures and events.\n- Include multiple perspectives and narratives, especially from underrepresented groups.\n- Conduct thorough quality checks and revisions of the annotations to minimize subjective biases.\n- Continuously update and expand the dataset to reflect new research and changing societal understandings of history.", "## Additional Information", "### Dataset Curators\n\nAndrei Ross - Hooking Software.", "### Licensing Information\n\nWhen choosing a license for your dataset, it's important to consider how you want others to use it. Here are some common types of licenses you might consider:\n\n1. MIT License:\n - A permissive and open-source license. It allows for almost unrestricted freedom with the dataset, including commercial use, modification, distribution, and private use. \n - It's a good choice if you want to encourage a wide range of uses, including commercial and academic.\n\n2. 
Creative Commons Licenses:\n - Creative Commons offers several licenses, ranging from the most permissive (CC BY, which only requires attribution) to more restrictive options (like CC BY-NC-ND, which allows others to download the works but not use them commercially or create derivatives).\n - These are suitable for datasets where some control over usage is desired.\n\n3. GNU General Public License (GPL):\n - This is a copyleft license that requires any distributed adaptations or derivatives of the dataset to be available under the same license. \n - It's ideal for ensuring that derivative works remain open and free.\n\n4. Apache License 2.0:\n - Similar to the MIT License but also provides an express grant of patent rights from contributors to users.\n - It's a good choice for datasets that might be used in patented or patentable technologies.\n\n\n\n\nRoss, A. & Atias, E. (2024). Historical Figures and Events Dataset. Version 1.1. Available at URL", "### Contact Information\n\nit@URL" ]
9ff9477df55a0586683443003a84c6de7277feb1
# Dataset Card for "checkmate_in_one" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fxmeng/checkmate_in_one
[ "region:us" ]
2024-01-19T12:34:52+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_mc", "path": "data/train_mc-*"}, {"split": "test_mc", "path": "data/test_mc-*"}, {"split": "train_open", "path": "data/train_open-*"}, {"split": "test_open", "path": "data/test_open-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "board_svg", "dtype": "string"}], "splits": [{"name": "train_mc", "num_bytes": 95572281, "num_examples": 3398}, {"name": "test_mc", "num_bytes": 2798511, "num_examples": 100}, {"name": "train_open", "num_bytes": 95393971, "num_examples": 3398}, {"name": "test_open", "num_bytes": 2793236, "num_examples": 100}], "download_size": 34633974, "dataset_size": 196557999}}
2024-01-19T13:51:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "checkmate_in_one" More Information needed
[ "# Dataset Card for \"checkmate_in_one\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"checkmate_in_one\"\n\nMore Information needed" ]
4ed1a5e83c4ab05e44017c22fd77ef0fdf7b1ed5
# Dataset Card for "chess_annotation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fxmeng/chess_annotation
[ "region:us" ]
2024-01-19T12:42:32+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_mc", "path": "data/train_mc-*"}, {"split": "test_mc", "path": "data/test_mc-*"}, {"split": "train_open", "path": "data/train_open-*"}, {"split": "test_open", "path": "data/test_open-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "board_svg", "dtype": "string"}], "splits": [{"name": "train_mc", "num_bytes": 89387563, "num_examples": 2899}, {"name": "test_mc", "num_bytes": 3081009, "num_examples": 100}, {"name": "train_open", "num_bytes": 88156600, "num_examples": 2899}, {"name": "test_open", "num_bytes": 3039592, "num_examples": 100}], "download_size": 32995616, "dataset_size": 183664764}}
2024-01-19T13:57:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "chess_annotation" More Information needed
[ "# Dataset Card for \"chess_annotation\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"chess_annotation\"\n\nMore Information needed" ]
b251e9b0d27860c587e159cb7de53c213e384352
# Dataset Card for "chess_opening" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fxmeng/chess_opening
[ "region:us" ]
2024-01-19T12:44:34+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_mc", "path": "data/train_mc-*"}, {"split": "test_mc", "path": "data/test_mc-*"}, {"split": "train_open", "path": "data/train_open-*"}, {"split": "test_open", "path": "data/test_open-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "board_svg", "dtype": "string"}], "splits": [{"name": "train_mc", "num_bytes": 94216938, "num_examples": 2979}, {"name": "test_mc", "num_bytes": 3161700, "num_examples": 100}, {"name": "train_open", "num_bytes": 93400355, "num_examples": 2979}, {"name": "test_open", "num_bytes": 3134241, "num_examples": 100}], "download_size": 33786361, "dataset_size": 193913234}}
2024-01-19T14:02:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "chess_opening" More Information needed
[ "# Dataset Card for \"chess_opening\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"chess_opening\"\n\nMore Information needed" ]
c84e714181121dfbae56083d141d210556083479
# Dataset Card for "chess_state_value" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fxmeng/chess_state_value
[ "region:us" ]
2024-01-19T12:45:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_mc", "path": "data/train_mc-*"}, {"split": "test_mc", "path": "data/test_mc-*"}, {"split": "train_open", "path": "data/train_open-*"}, {"split": "test_open", "path": "data/test_open-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "board_svg", "dtype": "string"}], "splits": [{"name": "train_mc", "num_bytes": 115713657, "num_examples": 3900}, {"name": "test_mc", "num_bytes": 2978441, "num_examples": 100}, {"name": "train_open", "num_bytes": 115370457, "num_examples": 3900}, {"name": "test_open", "num_bytes": 2969641, "num_examples": 100}], "download_size": 41616558, "dataset_size": 237032196}}
2024-01-19T14:05:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "chess_state_value" More Information needed
[ "# Dataset Card for \"chess_state_value\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"chess_state_value\"\n\nMore Information needed" ]
cc60c2793644a9ee3c53d27878ccc70355a67fc5
# Dataset Card for "general_policy" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
fxmeng/general_policy
[ "region:us" ]
2024-01-19T12:50:37+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train_mc", "path": "data/train_mc-*"}, {"split": "test_mc", "path": "data/test_mc-*"}, {"split": "train_open", "path": "data/train_open-*"}, {"split": "test_open", "path": "data/test_open-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "board_svg", "dtype": "string"}], "splits": [{"name": "train_mc", "num_bytes": 111734126, "num_examples": 3760}, {"name": "test_mc", "num_bytes": 2964558, "num_examples": 100}, {"name": "train_open", "num_bytes": 116456269, "num_examples": 3844}, {"name": "test_open", "num_bytes": 3023141, "num_examples": 100}], "download_size": 32711459, "dataset_size": 234178094}}
2024-01-20T13:48:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "general_policy" More Information needed
[ "# Dataset Card for \"general_policy\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"general_policy\"\n\nMore Information needed" ]
2135a02973ce5587fc83bbebbd42f824951c61bd
# Dataset Card for Dataset Name

<!-- Provide a quick summary of the dataset. -->

This dataset was translated into **Italian** from the original [Stanford Alpaca 52k](https://crfm.stanford.edu/2023/03/13/alpaca.html) instructions using free DeepL API credits. We translated 6353 instructions.

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

The dataset consists of the following:

- **instruction:** describes the task the model should perform. Each of the 52K instructions is unique.
- **input:** optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article.
- **output:** the translated answer to the instruction as generated by text-davinci-003.
- **instruction_id:** an id assigned to each instruction, starting from 0. It can be used if further translation of the Alpaca dataset is undertaken.

All the instructions and outputs related to coding are removed, since DeepL returns gibberish for code.

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

Taken from Stanford Alpaca
- **Repository:** [https://github.com/tatsu-lab/stanford_alpaca?tab=readme-ov-file#data-release]

### Limitations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases, and limitations of the dataset. The dataset was translated with DeepL but no human evaluation was done. Additionally, instructions related to code are omitted.
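A minimal loading sketch is given below. It assumes this repository's id (`praveensonu/alpaca_it_6k`) and the single `train` split listed in the configuration, and simply prints the four fields described above.

```python
# Minimal sketch: load the Italian Alpaca subset and inspect one record.
# Assumes the repository id and the "train" split named in this card's config.
from datasets import load_dataset

ds = load_dataset("praveensonu/alpaca_it_6k", split="train")
print(len(ds))  # 6353 translated instructions

example = ds[0]
print(example["instruction_id"])  # integer id carried over from the source set
print(example["instruction"])     # translated task description
print(example["input"])           # optional context (may be empty)
print(example["output"])          # translated text-davinci-003 answer
```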
praveensonu/alpaca_it_6k
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:it", "instruction fine-tuning", "region:us" ]
2024-01-19T13:29:13+00:00
{"language": ["it"], "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2330734, "num_examples": 6353}], "download_size": 1441151, "dataset_size": 2330734}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["instruction fine-tuning"]}
2024-01-19T16:28:32+00:00
[]
[ "it" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Italian #instruction fine-tuning #region-us
# Dataset Card for Dataset Name This dataset was translated from the original Stanford Alpaca 52k instructions using Free DeepL API credits to Italian. We translated 6353 instructions. ### Dataset Description The dataset consists of the following: - instruction: describes the task the model should perform. Each of the 52K instructions is unique. - input: optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article. - output: the translated answer to the instruction as generated by text-davinci-003. - instruction_id: An id was given to each instruction starting from 0. This can be used if further translation is to be made on the Alpaca dataset. All the instructions and outputs related to coding are removed, since DeepL returns gibberish for code. ### Dataset Sources [optional] Taken from Stanford Alpaca - Repository: [URL ### Limitations Users should be made aware of the risks, biases, and limitations of the dataset. The dataset was translated with DeepL but no human evaluation was done. Additionally, instructions related to code are ommitted.
[ "# Dataset Card for Dataset Name\n\n\nThis dataset was translated from the original Stanford Alpaca 52k instructions using Free DeepL API credits to Italian. We translated 6353 instructions.", "### Dataset Description\n\n\n\nThe dataset consists of the following:\n- instruction: describes the task the model should perform. Each of the 52K instructions is unique.\n- input: optional context or input for the task. For example, when the instruction is \"Summarize the following article\", the input is the article.\n- output: the translated answer to the instruction as generated by text-davinci-003.\n- instruction_id: An id was given to each instruction starting from 0. This can be used if further translation is to be made on the Alpaca dataset.\n\nAll the instructions and outputs related to coding are removed, since DeepL returns gibberish for code.", "### Dataset Sources [optional]\n\n\nTaken from Stanford Alpaca\n- Repository: [URL", "### Limitations\n\n\n\nUsers should be made aware of the risks, biases, and limitations of the dataset. The dataset was translated with DeepL but no human evaluation was done. Additionally, instructions related to code are ommitted." ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Italian #instruction fine-tuning #region-us \n", "# Dataset Card for Dataset Name\n\n\nThis dataset was translated from the original Stanford Alpaca 52k instructions using Free DeepL API credits to Italian. We translated 6353 instructions.", "### Dataset Description\n\n\n\nThe dataset consists of the following:\n- instruction: describes the task the model should perform. Each of the 52K instructions is unique.\n- input: optional context or input for the task. For example, when the instruction is \"Summarize the following article\", the input is the article.\n- output: the translated answer to the instruction as generated by text-davinci-003.\n- instruction_id: An id was given to each instruction starting from 0. This can be used if further translation is to be made on the Alpaca dataset.\n\nAll the instructions and outputs related to coding are removed, since DeepL returns gibberish for code.", "### Dataset Sources [optional]\n\n\nTaken from Stanford Alpaca\n- Repository: [URL", "### Limitations\n\n\n\nUsers should be made aware of the risks, biases, and limitations of the dataset. The dataset was translated with DeepL but no human evaluation was done. Additionally, instructions related to code are ommitted." ]
dbf7afeed1e11589c38d207ec20c17cfda7b8d36
This dataset was generated by reformatting [`coref-data/conll2012_raw`](https://huggingface.co/datasets/coref-data/conll2012_raw) into the indiscrim coreference format. See that repo for dataset details. See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
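For orientation, a minimal loading sketch follows. The configuration name (`english_v4`), split names, and field names are taken from this repository's configuration; the exact meaning of the integer indices inside each coreference mention is defined by the indiscrim format and documented in the conversion repository, so it is not assumed here.

```python
# Minimal sketch: load one configuration of the indiscrim-format CoNLL-2012 data
# and look at the raw fields. Config, split, and feature names follow this
# repository's configuration; consult the conversion repo for the precise
# meaning of the integers inside each mention of a coreference chain.
from datasets import load_dataset

ds = load_dataset("coref-data/conll2012_indiscrim", "english_v4", split="validation")

doc = ds[0]
print(doc["id"], doc["genre"])
print(len(doc["sentences"]), "sentences")
print(doc["coref_chains"][0])  # first chain: a list of mentions, each a list of ints
```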
coref-data/conll2012_indiscrim
[ "region:us" ]
2024-01-19T13:58:17+00:00
{"dataset_info": [{"config_name": "arabic_v4", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 28079286, "num_examples": 359}, {"name": "validation", "num_bytes": 3290181, "num_examples": 44}, {"name": "test", "num_bytes": 3291335, "num_examples": 44}], "download_size": 10482299, "dataset_size": 34660802}, {"config_name": "chinese_v4", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 42615831, "num_examples": 1810}, {"name": "validation", "num_bytes": 6204263, "num_examples": 252}, {"name": "test", "num_bytes": 5222313, "num_examples": 218}], "download_size": 16394351, "dataset_size": 54042407}, {"config_name": "english_v12", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 111506376, "num_examples": 11401}, {"name": "validation", "num_bytes": 15410161, "num_examples": 1491}, {"name": "test", "num_bytes": 11690955, "num_examples": 1326}], "download_size": 42684618, "dataset_size": 138607492}, {"config_name": "english_v4", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": 
[{"name": "train", "num_bytes": 97894328, "num_examples": 2802}, {"name": "validation", "num_bytes": 12281693, "num_examples": 343}, {"name": "test", "num_bytes": 12747804, "num_examples": 348}], "download_size": 30187595, "dataset_size": 122923825}], "configs": [{"config_name": "arabic_v4", "data_files": [{"split": "train", "path": "arabic_v4/train-*"}, {"split": "validation", "path": "arabic_v4/validation-*"}, {"split": "test", "path": "arabic_v4/test-*"}]}, {"config_name": "chinese_v4", "data_files": [{"split": "train", "path": "chinese_v4/train-*"}, {"split": "validation", "path": "chinese_v4/validation-*"}, {"split": "test", "path": "chinese_v4/test-*"}]}, {"config_name": "english_v12", "data_files": [{"split": "train", "path": "english_v12/train-*"}, {"split": "validation", "path": "english_v12/validation-*"}, {"split": "test", "path": "english_v12/test-*"}]}, {"config_name": "english_v4", "data_files": [{"split": "train", "path": "english_v4/train-*"}, {"split": "validation", "path": "english_v4/validation-*"}, {"split": "test", "path": "english_v4/test-*"}]}]}
2024-01-29T00:46:38+00:00
[]
[]
TAGS #region-us
This dataset was generated by reformatting 'coref-data/conll2012_raw' into the indiscrim coreference format. See that repo for dataset details. See ianporada/coref-data for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
[]
[ "TAGS\n#region-us \n" ]
8552f469b897d9280b70b9e1ef84ce00b90ca854
# Dataset Card for "cai-conversation-dev1705674984" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705674984
[ "region:us" ]
2024-01-19T14:38:08+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 241083, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 247808, "num_examples": 64}, {"name": "test_sft", "num_bytes": 245889, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 239866, "num_examples": 64}], "download_size": 493355, "dataset_size": 974646}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T14:38:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705674984" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705674984\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705674984\"\n\nMore Information needed" ]
83623cafe5ffbd3b52c62891bea40b1a63f29f31
# Italian version of the DROP Dataset Dataset based on the Italian translation provided by: - **Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas** - [Teasing LLMs adapted to Italian](https://github.com/LeonardRanaldi/italian-instruct-eval/tree/main) # Citations ``` @inproceedings{Dua2019DROP, author={Dheeru Dua and Yizhong Wang and Pradeep Dasigi and Gabriel Stanovsky and Sameer Singh and Matt Gardner}, title={ {DROP}: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs}, booktitle={Proc. of NAACL}, year={2019} } @inproceedings{RanaldiPRZF23, author = {Leonardo Ranaldi and Giulia Pucci and Elena Sofia Ruzzetti and Fabio Massimo Zanzotto and Andr{\'{e}} Freitas}, title = {Teasing LLMs Adapted to Italian}, booktitle = {Proceedings of the 9th Italian Conference on Computational Linguistics, Venice, Italy, November 30 - December 2, 2023}, series = {{CEUR} Workshop Proceedings}, volume = {3596}, publisher = {CEUR-WS.org}, year = {2023}, url = {https://ceur-ws.org/Vol-3596/short18.pdf}, timestamp = {Tue, 02 Jan 2024 17:44:44 +0100}, } @misc{basile2023llamantino, title={LLaMAntino: LLaMA 2 Models for Effective Text Generation in Italian Language}, author={Pierpaolo Basile and Elio Musacchio and Marco Polignano and Lucia Siciliani and Giuseppe Fiameni and Giovanni Semeraro}, year={2023}, eprint={2312.09993}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` # Dataset Description - **Homepage:** https://allenai.org/data/drop - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Paper:** https://aclanthology.org/N19-1246/ - **Paper:** https://arxiv.org/abs/1903.00161 - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of downloaded dataset files:** 8.30 MB - **Size of the generated dataset:** 110.91 MB - **Total amount of disk used:** 119.21 MB ### Dataset Summary DROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs. . DROP is a crowdsourced, adversarially-created, 96k-question benchmark, in which a system must resolve references in a question, perhaps to multiple input positions, and perform discrete operations over them (such as addition, counting, or sorting). These operations require a much more comprehensive understanding of the content of paragraphs than what was necessary for prior datasets. ## Dataset Structure ### Data Instances #### default - **Size of downloaded dataset files:** 8.30 MB - **Size of the generated dataset:** 110.91 MB - **Total amount of disk used:** 119.21 MB An example of 'validation' looks as follows. ``` This example was too long and was cropped: { "answers_spans": { "spans": ["Chaz Schilens"] }, "passage": "\" Hoping to rebound from their loss to the Patriots, the Raiders stayed at home for a Week 16 duel with the Houston Texans. Oak...", "question": "Who scored the first touchdown of the game?" } ``` ### Data Fields The data fields are the same among all splits. #### default - `passage`: a `string` feature. - `question`: a `string` feature. - `answers_spans`: a dictionary feature containing: - `spans`: a `string` feature. ### Data Splits | name |train|validation| |-------|----:|---------:| |default|77409| 9536|
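A short loading sketch, assuming this repository's id (`swap-uniba/drop_ita`) and the field layout described above; note that `answers_spans` may hold several gold spans per question.

```python
# Minimal sketch: load the Italian DROP translation and read one QA pair.
# Repository id, split name, and field names follow this card.
from datasets import load_dataset

drop_it = load_dataset("swap-uniba/drop_ita", split="validation")

ex = drop_it[0]
print(ex["question"])
print(ex["passage"][:200], "...")
print(ex["answers_spans"]["spans"])  # list of gold answer spans
```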
swap-uniba/drop_ita
[ "task_categories:question-answering", "task_categories:text2text-generation", "task_ids:extractive-qa", "task_ids:abstractive-qa", "annotations_creators:crowdsourced", "language_creators:crowdsourced", "multilinguality:monolingual", "size_categories:10K<n<100K", "source_datasets:original", "language:it", "license:cc-by-sa-4.0", "llm", "italian", "llamantino", "arxiv:2312.09993", "arxiv:1903.00161", "region:us" ]
2024-01-19T14:41:58+00:00
{"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["it"], "license": ["cc-by-sa-4.0"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"], "task_categories": ["question-answering", "text2text-generation"], "task_ids": ["extractive-qa", "abstractive-qa"], "paperswithcode_id": "drop", "pretty_name": "DROP ITA", "dataset_info": {"features": [{"name": "section_id", "dtype": "string"}, {"name": "query_id", "dtype": "string"}, {"name": "passage", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers_spans", "sequence": [{"name": "spans", "dtype": "string"}, {"name": "types", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 105572506, "num_examples": 77400}, {"name": "validation", "num_bytes": 11737755, "num_examples": 9535}], "download_size": 11538387, "dataset_size": 117310261}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "tags": ["llm", "italian", "llamantino"]}
2024-01-19T14:45:52+00:00
[ "2312.09993", "1903.00161" ]
[ "it" ]
TAGS #task_categories-question-answering #task_categories-text2text-generation #task_ids-extractive-qa #task_ids-abstractive-qa #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Italian #license-cc-by-sa-4.0 #llm #italian #llamantino #arxiv-2312.09993 #arxiv-1903.00161 #region-us
Italian version of the DROP Dataset =================================== Dataset based on the Italian translation provided by: * Leonardo Ranaldi, Giulia Pucci, Elena Sofia Ruzzetti, Fabio Massimo Zanzotto, and André Freitas - Teasing LLMs adapted to Italian s Dataset Description =================== * Homepage: URL * Repository: * Paper: URL * Paper: URL * Point of Contact: * Size of downloaded dataset files: 8.30 MB * Size of the generated dataset: 110.91 MB * Total amount of disk used: 119.21 MB ### Dataset Summary DROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs. . DROP is a crowdsourced, adversarially-created, 96k-question benchmark, in which a system must resolve references in a question, perhaps to multiple input positions, and perform discrete operations over them (such as addition, counting, or sorting). These operations require a much more comprehensive understanding of the content of paragraphs than what was necessary for prior datasets. Dataset Structure ----------------- ### Data Instances #### default * Size of downloaded dataset files: 8.30 MB * Size of the generated dataset: 110.91 MB * Total amount of disk used: 119.21 MB An example of 'validation' looks as follows. ### Data Fields The data fields are the same among all splits. #### default * 'passage': a 'string' feature. * 'question': a 'string' feature. * 'answers\_spans': a dictionary feature containing: + 'spans': a 'string' feature. ### Data Splits
[ "### Dataset Summary\n\n\nDROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs.\n. DROP is a crowdsourced, adversarially-created, 96k-question benchmark, in which a system must resolve references in a\nquestion, perhaps to multiple input positions, and perform discrete operations over them (such as addition, counting, or\nsorting). These operations require a much more comprehensive understanding of the content of paragraphs than what was\nnecessary for prior datasets.\n\n\nDataset Structure\n-----------------", "### Data Instances", "#### default\n\n\n* Size of downloaded dataset files: 8.30 MB\n* Size of the generated dataset: 110.91 MB\n* Total amount of disk used: 119.21 MB\n\n\nAn example of 'validation' looks as follows.", "### Data Fields\n\n\nThe data fields are the same among all splits.", "#### default\n\n\n* 'passage': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers\\_spans': a dictionary feature containing:\n\t+ 'spans': a 'string' feature.", "### Data Splits" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text2text-generation #task_ids-extractive-qa #task_ids-abstractive-qa #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Italian #license-cc-by-sa-4.0 #llm #italian #llamantino #arxiv-2312.09993 #arxiv-1903.00161 #region-us \n", "### Dataset Summary\n\n\nDROP: A Reading Comprehension Benchmark Requiring Discrete Reasoning Over Paragraphs.\n. DROP is a crowdsourced, adversarially-created, 96k-question benchmark, in which a system must resolve references in a\nquestion, perhaps to multiple input positions, and perform discrete operations over them (such as addition, counting, or\nsorting). These operations require a much more comprehensive understanding of the content of paragraphs than what was\nnecessary for prior datasets.\n\n\nDataset Structure\n-----------------", "### Data Instances", "#### default\n\n\n* Size of downloaded dataset files: 8.30 MB\n* Size of the generated dataset: 110.91 MB\n* Total amount of disk used: 119.21 MB\n\n\nAn example of 'validation' looks as follows.", "### Data Fields\n\n\nThe data fields are the same among all splits.", "#### default\n\n\n* 'passage': a 'string' feature.\n* 'question': a 'string' feature.\n* 'answers\\_spans': a dictionary feature containing:\n\t+ 'spans': a 'string' feature.", "### Data Splits" ]
c322a760b63e368eb1bc2ddda8f710c779fc6f7a
# Dataset Card for "cai-conversation-dev1705675844" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705675844
[ "region:us" ]
2024-01-19T14:52:19+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 195079, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 193819, "num_examples": 64}, {"name": "test_sft", "num_bytes": 189922, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 200666, "num_examples": 64}], "download_size": 426772, "dataset_size": 779486}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T14:52:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705675844" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705675844\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705675844\"\n\nMore Information needed" ]
1fca7cd4b2874f293fe6e0814928947d72fff218
# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-v1.0 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/belal-finetuned-llama2-v1.0](https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T15:05:24.001804](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0/blob/main/results_2024-01-19T15-05-24.001804.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.43801355158032884, "acc_stderr": 0.03436903669406286, "acc_norm": 0.4425266029309796, "acc_norm_stderr": 0.03516837110056896, "mc1": 0.2484700122399021, "mc1_stderr": 0.01512742709652068, "mc2": 0.3909462972112631, "mc2_stderr": 0.013662088305733134 }, "harness|arc:challenge|25": { "acc": 0.4854948805460751, "acc_stderr": 0.014605241081370056, "acc_norm": 0.5281569965870307, "acc_norm_stderr": 0.014588204105102203 }, "harness|hellaswag|10": { "acc": 0.5795658235411273, "acc_stderr": 0.0049261984839487055, "acc_norm": 0.7775343557060347, "acc_norm_stderr": 0.004150522630231024 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.40131578947368424, "acc_stderr": 0.039889037033362836, "acc_norm": 0.40131578947368424, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4377358490566038, "acc_stderr": 0.030533338430467516, "acc_norm": 0.4377358490566038, "acc_norm_stderr": 0.030533338430467516 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04155319955593146, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04155319955593146 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 
0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3699421965317919, "acc_stderr": 0.036812296333943194, "acc_norm": 0.3699421965317919, "acc_norm_stderr": 0.036812296333943194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.038739587141493524, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.038739587141493524 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.042663394431593935, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.042663394431593935 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.23544973544973544, "acc_stderr": 0.02185150982203172, "acc_norm": 0.23544973544973544, "acc_norm_stderr": 0.02185150982203172 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4290322580645161, "acc_stderr": 0.02815603653823321, "acc_norm": 0.4290322580645161, "acc_norm_stderr": 0.02815603653823321 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.0333276906841079, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.0333276906841079 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5696969696969697, "acc_stderr": 0.03866225962879077, "acc_norm": 0.5696969696969697, "acc_norm_stderr": 0.03866225962879077 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.47474747474747475, "acc_stderr": 0.03557806245087314, "acc_norm": 0.47474747474747475, "acc_norm_stderr": 0.03557806245087314 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6321243523316062, "acc_stderr": 0.034801756684660366, "acc_norm": 0.6321243523316062, "acc_norm_stderr": 0.034801756684660366 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.39487179487179486, "acc_stderr": 0.02478431694215638, "acc_norm": 0.39487179487179486, "acc_norm_stderr": 0.02478431694215638 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3865546218487395, "acc_stderr": 0.0316314580755238, "acc_norm": 0.3865546218487395, "acc_norm_stderr": 0.0316314580755238 
}, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.03734535676787198, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.03734535676787198 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5834862385321101, "acc_stderr": 0.021136376504030874, "acc_norm": 0.5834862385321101, "acc_norm_stderr": 0.021136376504030874 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18055555555555555, "acc_stderr": 0.026232878971491656, "acc_norm": 0.18055555555555555, "acc_norm_stderr": 0.026232878971491656 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4803921568627451, "acc_stderr": 0.03506612560524867, "acc_norm": 0.4803921568627451, "acc_norm_stderr": 0.03506612560524867 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5443037974683544, "acc_stderr": 0.03241920684693335, "acc_norm": 0.5443037974683544, "acc_norm_stderr": 0.03241920684693335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5246636771300448, "acc_stderr": 0.03351695167652628, "acc_norm": 0.5246636771300448, "acc_norm_stderr": 0.03351695167652628 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.44274809160305345, "acc_stderr": 0.04356447202665069, "acc_norm": 0.44274809160305345, "acc_norm_stderr": 0.04356447202665069 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.46296296296296297, "acc_stderr": 0.04820403072760627, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.04820403072760627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.44785276073619634, "acc_stderr": 0.03906947479456602, "acc_norm": 0.44785276073619634, "acc_norm_stderr": 0.03906947479456602 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.49514563106796117, "acc_stderr": 0.049505043821289195, "acc_norm": 0.49514563106796117, "acc_norm_stderr": 0.049505043821289195 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6837606837606838, "acc_stderr": 0.030463656747340275, "acc_norm": 0.6837606837606838, "acc_norm_stderr": 0.030463656747340275 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.5977011494252874, "acc_stderr": 0.017535294529068948, "acc_norm": 0.5977011494252874, "acc_norm_stderr": 0.017535294529068948 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.48265895953757226, "acc_stderr": 0.026902900458666647, "acc_norm": 0.48265895953757226, "acc_norm_stderr": 0.026902900458666647 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331144, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331144 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4934640522875817, "acc_stderr": 0.028627470550556054, "acc_norm": 0.4934640522875817, "acc_norm_stderr": 0.028627470550556054 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5562700964630225, "acc_stderr": 0.028217683556652308, "acc_norm": 0.5562700964630225, "acc_norm_stderr": 0.028217683556652308 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.49382716049382713, "acc_stderr": 0.027818623962583295, "acc_norm": 
0.49382716049382713, "acc_norm_stderr": 0.027818623962583295 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3191489361702128, "acc_stderr": 0.0278079901413202, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.0278079901413202 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3272490221642764, "acc_stderr": 0.011983819806464732, "acc_norm": 0.3272490221642764, "acc_norm_stderr": 0.011983819806464732 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4375, "acc_stderr": 0.030134614954403924, "acc_norm": 0.4375, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4297385620915033, "acc_stderr": 0.020027122784928554, "acc_norm": 0.4297385620915033, "acc_norm_stderr": 0.020027122784928554 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4636363636363636, "acc_stderr": 0.047764491623961985, "acc_norm": 0.4636363636363636, "acc_norm_stderr": 0.047764491623961985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3510204081632653, "acc_stderr": 0.030555316755573637, "acc_norm": 0.3510204081632653, "acc_norm_stderr": 0.030555316755573637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5870646766169154, "acc_stderr": 0.03481520803367348, "acc_norm": 0.5870646766169154, "acc_norm_stderr": 0.03481520803367348 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.037998574544796375, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.037998574544796375 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6257309941520468, "acc_stderr": 0.03711601185389481, "acc_norm": 0.6257309941520468, "acc_norm_stderr": 0.03711601185389481 }, "harness|truthfulqa:mc|0": { "mc1": 0.2484700122399021, "mc1_stderr": 0.01512742709652068, "mc2": 0.3909462972112631, "mc2_stderr": 0.013662088305733134 }, "harness|winogrande|5": { "acc": 0.7434885556432518, "acc_stderr": 0.012273648008759987 }, "harness|gsm8k|5": { "acc": 0.1068991660348749, "acc_stderr": 0.008510982565520473 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
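Beyond the load_dataset call already shown in the card, the raw aggregated results file named in the Latest results section can be fetched directly from the Hub. This is only a sketch: the top-level JSON layout is assumed to resemble the excerpt shown above, so adjust the keys if the file nests them differently.

```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced in the card above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0",
    filename="results_2024-01-19T15-05-24.001804.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The excerpt in the card shows an "all" aggregate plus per-task entries such as
# "harness|arc:challenge|25"; print whatever keys the file actually exposes.
print(list(results.keys()))
```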
open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0
[ "region:us" ]
2024-01-19T15:07:48+00:00
{"pretty_name": "Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/belal-finetuned-llama2-v1.0](https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T15:05:24.001804](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0/blob/main/results_2024-01-19T15-05-24.001804.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43801355158032884,\n \"acc_stderr\": 0.03436903669406286,\n \"acc_norm\": 0.4425266029309796,\n \"acc_norm_stderr\": 0.03516837110056896,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3909462972112631,\n \"mc2_stderr\": 0.013662088305733134\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370056,\n \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.014588204105102203\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5795658235411273,\n \"acc_stderr\": 0.0049261984839487055,\n \"acc_norm\": 0.7775343557060347,\n \"acc_norm_stderr\": 0.004150522630231024\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467516,\n \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467516\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 
0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4290322580645161,\n \"acc_stderr\": 0.02815603653823321,\n \"acc_norm\": 0.4290322580645161,\n \"acc_norm_stderr\": 0.02815603653823321\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.47474747474747475,\n \"acc_stderr\": 0.03557806245087314,\n \"acc_norm\": 0.47474747474747475,\n \"acc_norm_stderr\": 0.03557806245087314\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n \"acc_norm\": 
0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.02478431694215638,\n \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.02478431694215638\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.0316314580755238,\n \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.0316314580755238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5834862385321101,\n \"acc_stderr\": 0.021136376504030874,\n \"acc_norm\": 0.5834862385321101,\n \"acc_norm_stderr\": 0.021136376504030874\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491656,\n \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491656\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524867,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524867\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693335,\n \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.04356447202665069,\n \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.04356447202665069\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456602,\n \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456602\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n \"acc_stderr\": 0.030463656747340275,\n \"acc_norm\": 0.6837606837606838,\n \"acc_norm_stderr\": 0.030463656747340275\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5977011494252874,\n \"acc_stderr\": 0.017535294529068948,\n \"acc_norm\": 0.5977011494252874,\n \"acc_norm_stderr\": 0.017535294529068948\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.49382716049382713,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.49382716049382713,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.0278079901413202,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.0278079901413202\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3272490221642764,\n \"acc_stderr\": 0.011983819806464732,\n \"acc_norm\": 0.3272490221642764,\n \"acc_norm_stderr\": 0.011983819806464732\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928554,\n \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3510204081632653,\n \"acc_stderr\": 0.030555316755573637,\n \"acc_norm\": 0.3510204081632653,\n \"acc_norm_stderr\": 0.030555316755573637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.5870646766169154,\n \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.037998574544796375,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.037998574544796375\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.03711601185389481,\n \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.03711601185389481\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3909462972112631,\n \"mc2_stderr\": 0.013662088305733134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.1068991660348749,\n \"acc_stderr\": 0.008510982565520473\n }\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-05-24.001804.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-05-24.001804.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-05-24.001804.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-05-24.001804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-05-24.001804.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["**/details_harness|winogrande|5_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T15-05-24.001804.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T15_05_24.001804", "path": ["results_2024-01-19T15-05-24.001804.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T15-05-24.001804.parquet"]}]}]}
2024-01-19T15:08:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-v1.0 Dataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-v1.0 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can, for instance, do the following (see the sketch below): ## Latest results These are the latest results from run 2024-01-19T15:05:24.001804 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
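The flattened card text above points to a loading example that was dropped during flattening; a minimal sketch of what that call looks like, assuming the repo id follows the same details_<org>__<model> convention as the other cards in this dump, and using a config and split named in the metadata above:

```python
from datasets import load_dataset

# Hypothetical repo id (inferred, not stated in this record); the config name
# "harness_winogrande_5" and the "latest" split come from the config listing above.
data = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-v1.0",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])
```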
[ "# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-v1.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:05:24.001804(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-v1.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:05:24.001804(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3c7c009805b438bb7efb70ec977a168e93731752
# Dataset Card for "cai-conversation-dev1705676837" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/cai-conversation-dev1705676837
[ "region:us" ]
2024-01-19T15:08:59+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 233708, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 232015, "num_examples": 64}, {"name": "test_sft", "num_bytes": 233215, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 231277, "num_examples": 64}], "download_size": 476958, "dataset_size": 930215}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T15:09:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705676837" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705676837\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705676837\"\n\nMore Information needed" ]
ee030c2168b88359319515c2baba399f292a1d36
# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2](https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can, for instance, do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T15:11:16.361884](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2/blob/main/results_2024-01-19T15-11-16.361884.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval): ```python { "all": { "acc": 0.4487446353511138, "acc_stderr": 0.034504979440505464, "acc_norm": 0.4534744253247318, "acc_norm_stderr": 0.03530926751067455, "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662592, "mc2": 0.40020648111023094, "mc2_stderr": 0.01385589773587115 }, "harness|arc:challenge|25": { "acc": 0.49146757679180886, "acc_stderr": 0.014609263165632186, "acc_norm": 0.5264505119453925, "acc_norm_stderr": 0.014590931358120172 }, "harness|hellaswag|10": { "acc": 0.5850428201553476, "acc_stderr": 0.004917076726623795, "acc_norm": 0.7781318462457678, "acc_norm_stderr": 0.004146537488135697 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3881578947368421, "acc_stderr": 0.03965842097512744, "acc_norm": 0.3881578947368421, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4679245283018868, "acc_stderr": 0.03070948699255655, "acc_norm": 0.4679245283018868, "acc_norm_stderr": 0.03070948699255655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4375, "acc_stderr": 0.04148415739394154, "acc_norm": 0.4375, "acc_norm_stderr": 0.04148415739394154 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.047609522856952344, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952344 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 
0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.42196531791907516, "acc_stderr": 0.0376574669386515, "acc_norm": 0.42196531791907516, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.1568627450980392, "acc_stderr": 0.03618664819936245, "acc_norm": 0.1568627450980392, "acc_norm_stderr": 0.03618664819936245 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4340425531914894, "acc_stderr": 0.03240038086792747, "acc_norm": 0.4340425531914894, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.041665675771015785, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.023068188848261114, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.023068188848261114 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.25396825396825395, "acc_stderr": 0.03893259610604675, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.03893259610604675 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.47419354838709676, "acc_stderr": 0.028406095057653315, "acc_norm": 0.47419354838709676, "acc_norm_stderr": 0.028406095057653315 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.0333276906841079, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.0333276906841079 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5696969696969697, "acc_stderr": 0.03866225962879077, "acc_norm": 0.5696969696969697, "acc_norm_stderr": 0.03866225962879077 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5353535353535354, "acc_stderr": 0.03553436368828061, "acc_norm": 0.5353535353535354, "acc_norm_stderr": 0.03553436368828061 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6528497409326425, "acc_stderr": 0.03435696168361355, "acc_norm": 0.6528497409326425, "acc_norm_stderr": 0.03435696168361355 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4076923076923077, "acc_stderr": 0.024915243985987844, "acc_norm": 0.4076923076923077, "acc_norm_stderr": 0.024915243985987844 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.02659393910184408, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.02659393910184408 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3739495798319328, "acc_stderr": 0.031429466378837076, "acc_norm": 0.3739495798319328, "acc_norm_stderr": 0.031429466378837076 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5889908256880734, "acc_stderr": 0.021095050687277656, "acc_norm": 0.5889908256880734, "acc_norm_stderr": 0.021095050687277656 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3101851851851852, "acc_stderr": 0.03154696285656628, "acc_norm": 0.3101851851851852, "acc_norm_stderr": 0.03154696285656628 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03503235296367993, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03503235296367993 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5443037974683544, "acc_stderr": 0.03241920684693335, "acc_norm": 0.5443037974683544, "acc_norm_stderr": 0.03241920684693335 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5426008968609866, "acc_stderr": 0.033435777055830646, "acc_norm": 0.5426008968609866, "acc_norm_stderr": 0.033435777055830646 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.45038167938931295, "acc_stderr": 0.04363643698524779, "acc_norm": 0.45038167938931295, "acc_norm_stderr": 0.04363643698524779 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5, "acc_stderr": 0.04833682445228318, "acc_norm": 0.5, "acc_norm_stderr": 0.04833682445228318 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4171779141104294, "acc_stderr": 0.038741028598180814, "acc_norm": 0.4171779141104294, "acc_norm_stderr": 0.038741028598180814 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.4854368932038835, "acc_stderr": 0.049486373240266376, "acc_norm": 0.4854368932038835, "acc_norm_stderr": 0.049486373240266376 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6452991452991453, "acc_stderr": 0.03134250486245402, "acc_norm": 0.6452991452991453, "acc_norm_stderr": 0.03134250486245402 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6104725415070242, "acc_stderr": 0.017438082556264597, "acc_norm": 0.6104725415070242, "acc_norm_stderr": 0.017438082556264597 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.48265895953757226, "acc_stderr": 0.026902900458666647, "acc_norm": 0.48265895953757226, "acc_norm_stderr": 0.026902900458666647 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249619, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249619 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4803921568627451, "acc_stderr": 0.028607893699576066, "acc_norm": 0.4803921568627451, "acc_norm_stderr": 0.028607893699576066 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5562700964630225, "acc_stderr": 0.028217683556652308, "acc_norm": 0.5562700964630225, "acc_norm_stderr": 0.028217683556652308 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5370370370370371, "acc_stderr": 0.027744313443376536, "acc_norm": 0.5370370370370371, 
"acc_norm_stderr": 0.027744313443376536 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.36524822695035464, "acc_stderr": 0.028723863853281278, "acc_norm": 0.36524822695035464, "acc_norm_stderr": 0.028723863853281278 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3500651890482399, "acc_stderr": 0.012182552313215175, "acc_norm": 0.3500651890482399, "acc_norm_stderr": 0.012182552313215175 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5, "acc_stderr": 0.030372836961539352, "acc_norm": 0.5, "acc_norm_stderr": 0.030372836961539352 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4215686274509804, "acc_stderr": 0.019977422600227467, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.019977422600227467 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4727272727272727, "acc_stderr": 0.04782001791380063, "acc_norm": 0.4727272727272727, "acc_norm_stderr": 0.04782001791380063 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4448979591836735, "acc_stderr": 0.031814251181977865, "acc_norm": 0.4448979591836735, "acc_norm_stderr": 0.031814251181977865 }, "harness|hendrycksTest-sociology|5": { "acc": 0.582089552238806, "acc_stderr": 0.03487558640462064, "acc_norm": 0.582089552238806, "acc_norm_stderr": 0.03487558640462064 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-virology|5": { "acc": 0.3614457831325301, "acc_stderr": 0.03740059382029321, "acc_norm": 0.3614457831325301, "acc_norm_stderr": 0.03740059382029321 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6198830409356725, "acc_stderr": 0.037229657413855394, "acc_norm": 0.6198830409356725, "acc_norm_stderr": 0.037229657413855394 }, "harness|truthfulqa:mc|0": { "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662592, "mc2": 0.40020648111023094, "mc2_stderr": 0.01385589773587115 }, "harness|winogrande|5": { "acc": 0.7411207576953434, "acc_stderr": 0.012310515810993376 }, "harness|gsm8k|5": { "acc": 0.10538286580742987, "acc_stderr": 0.008457575884041755 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
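Besides the per-task configuration shown in the card's own example, the card mentions an aggregated "results" configuration; a short sketch of pulling it, assuming the "latest" split alias that the config listings elsewhere in this dump attach to every configuration:

```python
from datasets import load_dataset

# Repo id taken verbatim from the card above; "results" is the aggregated
# configuration it describes, and "latest" is an assumed split alias mirroring
# the other config listings in this dump.
results = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the 2024-01-19T15:11:16 run
```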
open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2
[ "region:us" ]
2024-01-19T15:13:38+00:00
{"pretty_name": "Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2](https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T15:11:16.361884](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2/blob/main/results_2024-01-19T15-11-16.361884.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4487446353511138,\n \"acc_stderr\": 0.034504979440505464,\n \"acc_norm\": 0.4534744253247318,\n \"acc_norm_stderr\": 0.03530926751067455,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662592,\n \"mc2\": 0.40020648111023094,\n \"mc2_stderr\": 0.01385589773587115\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632186,\n \"acc_norm\": 0.5264505119453925,\n \"acc_norm_stderr\": 0.014590931358120172\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5850428201553476,\n \"acc_stderr\": 0.004917076726623795,\n \"acc_norm\": 0.7781318462457678,\n \"acc_norm_stderr\": 0.004146537488135697\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255655,\n \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 
0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.03618664819936245,\n \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.03618664819936245\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261114,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261114\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47419354838709676,\n \"acc_stderr\": 0.028406095057653315,\n \"acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.028406095057653315\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5353535353535354,\n \"acc_stderr\": 0.03553436368828061,\n \"acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.03553436368828061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n \"acc_norm\": 
0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987844,\n \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987844\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5889908256880734,\n \"acc_stderr\": 0.021095050687277656,\n \"acc_norm\": 0.5889908256880734,\n \"acc_norm_stderr\": 0.021095050687277656\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656628,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656628\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367993,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367993\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693335,\n \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.049486373240266376,\n \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.049486373240266376\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6452991452991453,\n \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.6452991452991453,\n \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6104725415070242,\n \"acc_stderr\": 0.017438082556264597,\n \"acc_norm\": 0.6104725415070242,\n \"acc_norm_stderr\": 0.017438082556264597\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249619,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249619\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3500651890482399,\n \"acc_stderr\": 0.012182552313215175,\n \"acc_norm\": 0.3500651890482399,\n \"acc_norm_stderr\": 0.012182552313215175\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.019977422600227467,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.019977422600227467\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.03487558640462064,\n \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.03487558640462064\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n \"acc_stderr\": 0.03740059382029321,\n \"acc_norm\": 0.3614457831325301,\n \"acc_norm_stderr\": 0.03740059382029321\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.037229657413855394,\n \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.037229657413855394\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662592,\n \"mc2\": 0.40020648111023094,\n \"mc2_stderr\": 0.01385589773587115\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10538286580742987,\n 
\"acc_stderr\": 0.008457575884041755\n }\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-11-16.361884.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-11-16.361884.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-11-16.361884.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-11-16.361884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-11-16.361884.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T15_11_16.361884", "path": ["**/details_harness|winogrande|5_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T15-11-16.361884.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T15_11_16.361884", "path": ["results_2024-01-19T15-11-16.361884.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T15-11-16.361884.parquet"]}]}]}
2024-01-19T15:14:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2 Dataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T15:11:16.361884 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
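The loading snippet referenced just above was stripped from this processed card text. A minimal sketch of what it would look like, assuming the repo id follows the open-llm-leaderboard/details_{org}__{model} naming convention used by the other evaluation-run records in this dump, and using "harness_winogrande_5", one of the configurations listed in this record's metadata:

```python
from datasets import load_dataset

# Assumed repo id, reconstructed from the details_{org}__{model} naming pattern
# seen in neighbouring records; "harness_winogrande_5" is one of the
# configurations listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__belal-finetuned-llama2-1024-v2.2",
    "harness_winogrande_5",
    split="train",
)
```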
[ "# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:11:16.361884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/belal-finetuned-llama2-1024-v2.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:11:16.361884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
132452c1616dd365edfd13ef3d84dc8f044af3f9
# Dataset Card for "math_23k_standalone" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/math_23k_standalone
[ "region:us" ]
2024-01-19T15:15:33+00:00
{"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 26701775, "num_examples": 21104}], "download_size": 3651762, "dataset_size": 26701775}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T06:37:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "math_23k_standalone" More Information needed
[ "# Dataset Card for \"math_23k_standalone\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"math_23k_standalone\"\n\nMore Information needed" ]
f7c05b77c665653081229c3d082af738ccf62b0d
# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Rijgersberg/GEITje-7B](https://huggingface.co/Rijgersberg/GEITje-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Rijgersberg__GEITje-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T15:27:47.021143](https://huggingface.co/datasets/open-llm-leaderboard/details_Rijgersberg__GEITje-7B/blob/main/results_2024-01-19T15-27-47.021143.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4992727199727469, "acc_stderr": 0.03446458801176692, "acc_norm": 0.5044163057293031, "acc_norm_stderr": 0.0352231344135158, "mc1": 0.24724602203182375, "mc1_stderr": 0.015102404797359652, "mc2": 0.4045292267655127, "mc2_stderr": 0.013973702471401067 }, "harness|arc:challenge|25": { "acc": 0.4206484641638225, "acc_stderr": 0.014426211252508408, "acc_norm": 0.44795221843003413, "acc_norm_stderr": 0.01453201149821167 }, "harness|hellaswag|10": { "acc": 0.5500896235809599, "acc_stderr": 0.004964679845918434, "acc_norm": 0.7531368253335989, "acc_norm_stderr": 0.004303052185107719 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.040657710025626036, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.47924528301886793, "acc_stderr": 0.030746349975723463, "acc_norm": 0.47924528301886793, "acc_norm_stderr": 0.030746349975723463 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4861111111111111, "acc_stderr": 0.04179596617581, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.04179596617581 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4277456647398844, "acc_stderr": 0.03772446857518026, "acc_norm": 0.4277456647398844, "acc_norm_stderr": 0.03772446857518026 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.28431372549019607, "acc_stderr": 0.04488482852329017, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.04488482852329017 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.04605661864718381, "acc_norm": 0.7, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224469, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224469 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.044045561573747664, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.044045561573747664 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.041665675771015785, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.32275132275132273, "acc_stderr": 0.024078943243597016, "acc_norm": 0.32275132275132273, "acc_norm_stderr": 0.024078943243597016 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2619047619047619, "acc_stderr": 0.0393253768039287, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.0393253768039287 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.567741935483871, "acc_stderr": 0.028181739720019416, "acc_norm": 0.567741935483871, "acc_norm_stderr": 0.028181739720019416 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39408866995073893, "acc_stderr": 0.034381579670365425, "acc_norm": 0.39408866995073893, "acc_norm_stderr": 0.034381579670365425 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03681050869161551, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03681050869161551 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6515151515151515, "acc_stderr": 0.033948539651564025, "acc_norm": 0.6515151515151515, "acc_norm_stderr": 0.033948539651564025 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.694300518134715, "acc_stderr": 0.03324837939758159, "acc_norm": 0.694300518134715, "acc_norm_stderr": 0.03324837939758159 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4512820512820513, "acc_stderr": 0.025230381238934837, "acc_norm": 0.4512820512820513, "acc_norm_stderr": 0.025230381238934837 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340496, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340496 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.03243718055137411, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.03243718055137411 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 
0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6165137614678899, "acc_stderr": 0.020847156641915984, "acc_norm": 0.6165137614678899, "acc_norm_stderr": 0.020847156641915984 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502325, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5833333333333334, "acc_stderr": 0.03460228327239172, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.03460228327239172 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5907172995780591, "acc_stderr": 0.03200704183359592, "acc_norm": 0.5907172995780591, "acc_norm_stderr": 0.03200704183359592 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5560538116591929, "acc_stderr": 0.03334625674242728, "acc_norm": 0.5560538116591929, "acc_norm_stderr": 0.03334625674242728 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.042438692422305246, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.042438692422305246 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5925925925925926, "acc_stderr": 0.047500773411999854, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.047500773411999854 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6134969325153374, "acc_stderr": 0.03825825548848608, "acc_norm": 0.6134969325153374, "acc_norm_stderr": 0.03825825548848608 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7735042735042735, "acc_stderr": 0.027421007295392926, "acc_norm": 0.7735042735042735, "acc_norm_stderr": 0.027421007295392926 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52, "acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.665389527458493, "acc_stderr": 0.016873468641592154, "acc_norm": 0.665389527458493, "acc_norm_stderr": 0.016873468641592154 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5838150289017341, "acc_stderr": 0.026538189104705477, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.026538189104705477 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.014400296429225627, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.014400296429225627 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.545751633986928, "acc_stderr": 0.028509807802626592, "acc_norm": 0.545751633986928, "acc_norm_stderr": 0.028509807802626592 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6077170418006431, "acc_stderr": 0.027731258647011998, "acc_norm": 0.6077170418006431, "acc_norm_stderr": 0.027731258647011998 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5555555555555556, "acc_stderr": 0.027648477877413317, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.027648477877413317 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.36524822695035464, "acc_stderr": 0.028723863853281278, "acc_norm": 0.36524822695035464, "acc_norm_stderr": 0.028723863853281278 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3520208604954368, "acc_stderr": 0.0121981406053536, "acc_norm": 0.3520208604954368, "acc_norm_stderr": 0.0121981406053536 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.49264705882352944, "acc_stderr": 0.030369552523902173, "acc_norm": 0.49264705882352944, "acc_norm_stderr": 0.030369552523902173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.47549019607843135, "acc_stderr": 0.020203517280261457, "acc_norm": 0.47549019607843135, "acc_norm_stderr": 0.020203517280261457 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5545454545454546, "acc_stderr": 0.047605488214603246, "acc_norm": 0.5545454545454546, "acc_norm_stderr": 0.047605488214603246 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5469387755102041, "acc_stderr": 0.031867859300041275, "acc_norm": 0.5469387755102041, "acc_norm_stderr": 0.031867859300041275 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.033333333333333326, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.033333333333333326 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.672514619883041, "acc_stderr": 0.03599335771456027, "acc_norm": 0.672514619883041, "acc_norm_stderr": 0.03599335771456027 }, "harness|truthfulqa:mc|0": { "mc1": 0.24724602203182375, "mc1_stderr": 0.015102404797359652, "mc2": 0.4045292267655127, "mc2_stderr": 0.013973702471401067 }, "harness|winogrande|5": { "acc": 0.7237569060773481, "acc_stderr": 0.01256681501569816 }, "harness|gsm8k|5": { "acc": 0.20166793025018953, "acc_stderr": 0.011052295889544378 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Rijgersberg__GEITje-7B
[ "region:us" ]
2024-01-19T15:30:16+00:00
{"pretty_name": "Evaluation run of Rijgersberg/GEITje-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Rijgersberg/GEITje-7B](https://huggingface.co/Rijgersberg/GEITje-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rijgersberg__GEITje-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T15:27:47.021143](https://huggingface.co/datasets/open-llm-leaderboard/details_Rijgersberg__GEITje-7B/blob/main/results_2024-01-19T15-27-47.021143.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4992727199727469,\n \"acc_stderr\": 0.03446458801176692,\n \"acc_norm\": 0.5044163057293031,\n \"acc_norm_stderr\": 0.0352231344135158,\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.4045292267655127,\n \"mc2_stderr\": 0.013973702471401067\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4206484641638225,\n \"acc_stderr\": 0.014426211252508408,\n \"acc_norm\": 0.44795221843003413,\n \"acc_norm_stderr\": 0.01453201149821167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5500896235809599,\n \"acc_stderr\": 0.004964679845918434,\n \"acc_norm\": 0.7531368253335989,\n \"acc_norm_stderr\": 0.004303052185107719\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47924528301886793,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.47924528301886793,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n 
\"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n \"acc_stderr\": 0.028181739720019416,\n \"acc_norm\": 0.567741935483871,\n \"acc_norm_stderr\": 0.028181739720019416\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365425,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365425\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6515151515151515,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.03324837939758159,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.03324837939758159\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4512820512820513,\n \"acc_stderr\": 0.025230381238934837,\n \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934837\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6165137614678899,\n \"acc_stderr\": 0.020847156641915984,\n \"acc_norm\": 0.6165137614678899,\n \"acc_norm_stderr\": 0.020847156641915984\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5907172995780591,\n \"acc_stderr\": 0.03200704183359592,\n \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.03200704183359592\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848608,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848608\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.7735042735042735,\n \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.665389527458493,\n \"acc_stderr\": 0.016873468641592154,\n \"acc_norm\": 
0.665389527458493,\n \"acc_norm_stderr\": 0.016873468641592154\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.026538189104705477,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.026538189104705477\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225627,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225627\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413317,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413317\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3520208604954368,\n \"acc_stderr\": 0.0121981406053536,\n \"acc_norm\": 0.3520208604954368,\n \"acc_norm_stderr\": 0.0121981406053536\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261457,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261457\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.031867859300041275,\n \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.031867859300041275\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033333333333333326,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033333333333333326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.03599335771456027,\n \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.03599335771456027\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.4045292267655127,\n \"mc2_stderr\": 0.013973702471401067\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.01256681501569816\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20166793025018953,\n \"acc_stderr\": 0.011052295889544378\n }\n}\n```", "repo_url": 
"https://huggingface.co/Rijgersberg/GEITje-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-27-47.021143.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-27-47.021143.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-27-47.021143.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-27-47.021143.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-27-47.021143.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T15_27_47.021143", "path": ["**/details_harness|winogrande|5_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T15-27-47.021143.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T15_27_47.021143", "path": ["results_2024-01-19T15-27-47.021143.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T15-27-47.021143.parquet"]}]}]}
2024-01-19T15:30:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B Dataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T15:27:47.021143 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
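The loading snippet that the sentence "To load the details from a run, you can for instance do the following" refers to was dropped from this processed card text, so here is a minimal sketch. The repository id `open-llm-leaderboard/details_Rijgersberg__GEITje-7B` is an assumption inferred from the leaderboard's naming pattern for this model; `harness_winogrande_5` is one of the config names listed in the record's metadata.

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Rijgersberg__GEITje-7B",
    "harness_winogrande_5",  # one of the 63 per-task configs listed in the metadata
    split="train",           # "train" always points to the latest results
)
print(data)
```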
[ "# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B\n\n\n\nDataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:27:47.021143(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B\n\n\n\nDataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:27:47.021143(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
8ee8ae96d5e94a388b1d255de28040a4fd943e9f
# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B-chat-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Rijgersberg/GEITje-7B-chat-v2](https://huggingface.co/Rijgersberg/GEITje-7B-chat-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T15:31:36.828021](https://huggingface.co/datasets/open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2/blob/main/results_2024-01-19T15-31-36.828021.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.48880916708181743, "acc_stderr": 0.03442707322127932, "acc_norm": 0.4945295622293122, "acc_norm_stderr": 0.035197321298279766, "mc1": 0.2802937576499388, "mc1_stderr": 0.015723139524608767, "mc2": 0.4354583253052409, "mc2_stderr": 0.014644004519733833 }, "harness|arc:challenge|25": { "acc": 0.4658703071672355, "acc_stderr": 0.014577311315231102, "acc_norm": 0.5034129692832765, "acc_norm_stderr": 0.014611050403244081 }, "harness|hellaswag|10": { "acc": 0.5416251742680741, "acc_stderr": 0.004972460206842306, "acc_norm": 0.7412865962955587, "acc_norm_stderr": 0.00437032822483179 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.46710526315789475, "acc_stderr": 0.040601270352363966, "acc_norm": 0.46710526315789475, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5207547169811321, "acc_stderr": 0.030746349975723463, "acc_norm": 0.5207547169811321, "acc_norm_stderr": 0.030746349975723463 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4722222222222222, "acc_stderr": 0.04174752578923185, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.04174752578923185 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 
0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4393063583815029, "acc_stderr": 0.037842719328874674, "acc_norm": 0.4393063583815029, "acc_norm_stderr": 0.037842719328874674 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237657, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237657 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3872340425531915, "acc_stderr": 0.03184389265339525, "acc_norm": 0.3872340425531915, "acc_norm_stderr": 0.03184389265339525 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159393, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159393 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.45517241379310347, "acc_stderr": 0.04149886942192117, "acc_norm": 0.45517241379310347, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.34656084656084657, "acc_stderr": 0.024508777521028424, "acc_norm": 0.34656084656084657, "acc_norm_stderr": 0.024508777521028424 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2777777777777778, "acc_stderr": 0.04006168083848878, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.04006168083848878 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5451612903225806, "acc_stderr": 0.028327743091561077, "acc_norm": 0.5451612903225806, "acc_norm_stderr": 0.028327743091561077 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39408866995073893, "acc_stderr": 0.034381579670365446, "acc_norm": 0.39408866995073893, "acc_norm_stderr": 0.034381579670365446 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.036810508691615514, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.036810508691615514 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6464646464646465, "acc_stderr": 0.03406086723547155, "acc_norm": 0.6464646464646465, "acc_norm_stderr": 0.03406086723547155 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6528497409326425, "acc_stderr": 0.03435696168361355, "acc_norm": 0.6528497409326425, "acc_norm_stderr": 0.03435696168361355 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.45384615384615384, "acc_stderr": 0.025242770987126188, "acc_norm": 0.45384615384615384, "acc_norm_stderr": 0.025242770987126188 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114993, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114993 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4831932773109244, "acc_stderr": 0.03246013680375308, "acc_norm": 0.4831932773109244, "acc_norm_stderr": 0.03246013680375308 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, 
"acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6385321100917432, "acc_stderr": 0.020598082009937378, "acc_norm": 0.6385321100917432, "acc_norm_stderr": 0.020598082009937378 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4166666666666667, "acc_stderr": 0.03362277436608043, "acc_norm": 0.4166666666666667, "acc_norm_stderr": 0.03362277436608043 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5833333333333334, "acc_stderr": 0.03460228327239172, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.03460228327239172 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5864978902953587, "acc_stderr": 0.03205649904851859, "acc_norm": 0.5864978902953587, "acc_norm_stderr": 0.03205649904851859 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.515695067264574, "acc_stderr": 0.0335412657542081, "acc_norm": 0.515695067264574, "acc_norm_stderr": 0.0335412657542081 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5572519083969466, "acc_stderr": 0.043564472026650695, "acc_norm": 0.5572519083969466, "acc_norm_stderr": 0.043564472026650695 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.04412015806624504, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.04412015806624504 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5766871165644172, "acc_stderr": 0.038818912133343826, "acc_norm": 0.5766871165644172, "acc_norm_stderr": 0.038818912133343826 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.04327040932578729, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.04327040932578729 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7863247863247863, "acc_stderr": 0.026853450377009144, "acc_norm": 0.7863247863247863, "acc_norm_stderr": 0.026853450377009144 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956913, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956913 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6756066411238825, "acc_stderr": 0.016740929047162692, "acc_norm": 0.6756066411238825, "acc_norm_stderr": 0.016740929047162692 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5202312138728323, "acc_stderr": 0.026897049996382875, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.026897049996382875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.28268156424581004, "acc_stderr": 0.015060381730018108, "acc_norm": 0.28268156424581004, "acc_norm_stderr": 0.015060381730018108 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5490196078431373, "acc_stderr": 0.02849199358617156, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.02849199358617156 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5980707395498392, "acc_stderr": 0.02784647600593047, "acc_norm": 0.5980707395498392, "acc_norm_stderr": 0.02784647600593047 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5370370370370371, "acc_stderr": 0.027744313443376536, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.027744313443376536 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.35528031290743156, "acc_stderr": 0.01222362336404404, "acc_norm": 0.35528031290743156, "acc_norm_stderr": 0.01222362336404404 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4963235294117647, "acc_stderr": 0.030372015885428188, "acc_norm": 0.4963235294117647, "acc_norm_stderr": 0.030372015885428188 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4493464052287582, "acc_stderr": 0.020123766528027262, "acc_norm": 0.4493464052287582, "acc_norm_stderr": 0.020123766528027262 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5795918367346938, "acc_stderr": 0.03160106993449601, "acc_norm": 0.5795918367346938, "acc_norm_stderr": 0.03160106993449601 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.03152439186555404, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.03152439186555404 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.03828401115079022, "acc_norm": 0.40963855421686746, "acc_norm_stderr": 0.03828401115079022 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6257309941520468, "acc_stderr": 0.03711601185389483, "acc_norm": 0.6257309941520468, "acc_norm_stderr": 0.03711601185389483 }, "harness|truthfulqa:mc|0": { "mc1": 0.2802937576499388, "mc1_stderr": 0.015723139524608767, "mc2": 0.4354583253052409, "mc2_stderr": 0.014644004519733833 }, "harness|winogrande|5": { "acc": 0.7150749802683505, "acc_stderr": 0.01268598612514122 }, "harness|gsm8k|5": { "acc": 0.16224412433661864, "acc_stderr": 0.010155130880393524 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
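As a usage note for the card above: besides the per-task configs, the "results" configuration mentioned in the summary holds the aggregated metrics, and each configuration exposes a "latest" split alongside the timestamped one. A minimal sketch, assuming the split and config names listed in this record's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for this run; "latest" mirrors the most recent timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2",
    "results",
    split="latest",
)
print(results[0])
```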
open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2
[ "region:us" ]
2024-01-19T15:34:03+00:00
{"pretty_name": "Evaluation run of Rijgersberg/GEITje-7B-chat-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Rijgersberg/GEITje-7B-chat-v2](https://huggingface.co/Rijgersberg/GEITje-7B-chat-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T15:31:36.828021](https://huggingface.co/datasets/open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2/blob/main/results_2024-01-19T15-31-36.828021.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48880916708181743,\n \"acc_stderr\": 0.03442707322127932,\n \"acc_norm\": 0.4945295622293122,\n \"acc_norm_stderr\": 0.035197321298279766,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4354583253052409,\n \"mc2_stderr\": 0.014644004519733833\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4658703071672355,\n \"acc_stderr\": 0.014577311315231102,\n \"acc_norm\": 0.5034129692832765,\n \"acc_norm_stderr\": 0.014611050403244081\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5416251742680741,\n \"acc_stderr\": 0.004972460206842306,\n \"acc_norm\": 0.7412865962955587,\n \"acc_norm_stderr\": 0.00437032822483179\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561077,\n \"acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561077\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615514,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615514\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126188,\n \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126188\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937378,\n \"acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937378\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.038818912133343826,\n \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.038818912133343826\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n \"acc_stderr\": 0.026853450377009144,\n \"acc_norm\": 0.7863247863247863,\n \"acc_norm_stderr\": 0.026853450377009144\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6756066411238825,\n \"acc_stderr\": 0.016740929047162692,\n \"acc_norm\": 0.6756066411238825,\n \"acc_norm_stderr\": 0.016740929047162692\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n \"acc_stderr\": 0.015060381730018108,\n \"acc_norm\": 0.28268156424581004,\n \"acc_norm_stderr\": 0.015060381730018108\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.02784647600593047,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.02784647600593047\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n \"acc_stderr\": 0.01222362336404404,\n \"acc_norm\": 0.35528031290743156,\n \"acc_norm_stderr\": 0.01222362336404404\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428188,\n \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428188\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4493464052287582,\n \"acc_stderr\": 0.020123766528027262,\n \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.020123766528027262\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.03711601185389483,\n \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.03711601185389483\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4354583253052409,\n \"mc2_stderr\": 0.014644004519733833\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7150749802683505,\n \"acc_stderr\": 0.01268598612514122\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16224412433661864,\n \"acc_stderr\": 
0.010155130880393524\n }\n}\n```", "repo_url": "https://huggingface.co/Rijgersberg/GEITje-7B-chat-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-31-36.828021.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-31-36.828021.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-31-36.828021.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T15-31-36.828021.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-31-36.828021.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T15_31_36.828021", "path": ["**/details_harness|winogrande|5_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T15-31-36.828021.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T15_31_36.828021", "path": ["results_2024-01-19T15-31-36.828021.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T15-31-36.828021.parquet"]}]}]}
2024-01-19T15:34:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B-chat-v2 Dataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B-chat-v2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T15:31:36.828021 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
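The loading snippet referenced by "To load the details from a run" above appears in this record's metadata summary but was stripped from the processed text; it is restored here, with `harness_winogrande_5` standing in for any of the 63 available configs:
```python
from datasets import load_dataset

# Each of the 63 configs corresponds to one evaluated task; the "train" split
# (and the "latest" split) always points at the most recent run, while
# timestamped splits point at specific runs.
data = load_dataset("open-llm-leaderboard/details_Rijgersberg__GEITje-7B-chat-v2",
                    "harness_winogrande_5",
                    split="train")
```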
[ "# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B-chat-v2\n\n\n\nDataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B-chat-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:31:36.828021(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Rijgersberg/GEITje-7B-chat-v2\n\n\n\nDataset automatically created during the evaluation run of model Rijgersberg/GEITje-7B-chat-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T15:31:36.828021(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a63f962a9f692e2ce048b2f3ce826ea219956870
## About This is a pre-filtered and pre-split dataset for the HPE Generative AI "Medical Transcript Classification" tutorials. - [No-Code Version (UI Only)](tbd) - [Notebooks Version](tbd)
hpe-ai/medical-cases-classification-tutorial
[ "region:us" ]
2024-01-19T15:40:12+00:00
{}
2024-01-19T15:44:49+00:00
[]
[]
TAGS #region-us
## About This is a pre-filtered and pre-split dataset for the HPE Generative AI "Medical Transcript Classification" tutorials. - No-Code Version (UI Only) - Notebooks Version
[ "## About\n\nThis is a pre-filtered and pre-split dataset for the HPE Generative AI \"Medical Transcript Classification\" tutorials.\n\n- No-Code Version (UI Only)\n- Notebooks Version" ]
[ "TAGS\n#region-us \n", "## About\n\nThis is a pre-filtered and pre-split dataset for the HPE Generative AI \"Medical Transcript Classification\" tutorials.\n\n- No-Code Version (UI Only)\n- Notebooks Version" ]
a9468536e633cd67948e3a5c39ffa57ba92429f3
https://huggingface.co/datasets/LDJnr/Capybara features: general, multi-turn, chat; length: 1.31k
Wanfq/capybara
[ "region:us" ]
2024-01-19T15:43:01+00:00
{}
2024-01-19T15:46:43+00:00
[]
[]
TAGS #region-us
URL features: general, multi-turn, chat; length: 1.31k
[]
[ "TAGS\n#region-us \n" ]
e760b3c5081fe37c8a06f376d87529e52c1c7f3d
# Dataset Card for "math_23k_value_init" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/math_23k_value_init
[ "region:us" ]
2024-01-19T15:43:23+00:00
{"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}, {"name": "driver", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 21098660, "num_examples": 21104}], "download_size": 2532668, "dataset_size": 21098660}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T06:41:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for "math_23k_value_init" More Information needed
[ "# Dataset Card for \"math_23k_value_init\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"math_23k_value_init\"\n\nMore Information needed" ]
0d435200ece6d5e6724a544ba1886574f49ea9e1
https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K features: coding, single-turn, task; only Python code is selected; length: 38.3k
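A minimal sketch of the Python-only selection described above, assuming the upstream Magicoder dataset exposes a language column named `lang` (the column name is an assumption, not confirmed by this card):
```python
from datasets import load_dataset

# Load the upstream corpus and keep only the Python samples.
# NOTE: the "lang" column name is assumed; adjust it to the actual schema.
ds = load_dataset("ise-uiuc/Magicoder-OSS-Instruct-75K", split="train")
python_only = ds.filter(lambda example: example["lang"].lower() == "python")
print(len(python_only))  # the card reports roughly 38.3k rows after this selection
```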
Wanfq/magicoder
[ "region:us" ]
2024-01-19T15:43:51+00:00
{}
2024-01-19T15:48:51+00:00
[]
[]
TAGS #region-us
URL features: coding, single-turn, task; only Python code is selected; length: 38.3k
[]
[ "TAGS\n#region-us \n" ]
e4258edde4d85b5e4aeb3c85ca9cab10fb656359
# Dataset Card for "slim_orca_hermes_reasoning_sft" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
alexredna/slim_orca_hermes_reasoning_sft
[ "region:us" ]
2024-01-19T15:45:49+00:00
{"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 27241850.16600492, "num_examples": 16433}], "download_size": 10075268, "dataset_size": 27241850.16600492}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T22:51:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "slim_orca_hermes_reasoning_sft" More Information needed
[ "# Dataset Card for \"slim_orca_hermes_reasoning_sft\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"slim_orca_hermes_reasoning_sft\"\n\nMore Information needed" ]
eed045b3eebcbf0305c3e68d5fb3deadd96cdfae
# Dataset Card for "math_23k_kernel" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/math_23k_kernel
[ "region:us" ]
2024-01-19T15:47:08+00:00
{"dataset_info": {"features": [{"name": "text", "struct": [{"name": "asm", "dtype": "string"}, {"name": "c", "dtype": "string"}, {"name": "driver", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 20623345, "num_examples": 21104}], "download_size": 1797637, "dataset_size": 20623345}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T06:44:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for "math_23k_kernel" More Information needed
[ "# Dataset Card for \"math_23k_kernel\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"math_23k_kernel\"\n\nMore Information needed" ]
895ecbaa7f282787606916f3914f53c8550bdeb4
idk what im doing
aFrofessionalFrog/jerry-snyder
[ "size_categories:n<1K", "language:en", "license:mit", "region:us" ]
2024-01-19T16:43:33+00:00
{"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "pretty_name": "jerrygpt"}
2024-01-20T23:19:52+00:00
[]
[ "en" ]
TAGS #size_categories-n<1K #language-English #license-mit #region-us
idk what im doing
[]
[ "TAGS\n#size_categories-n<1K #language-English #license-mit #region-us \n" ]
1e2277ef4fe2d236a851f7a3fe27338116899ec5
# SQLITE-DB for Danbooru 2023 (until 7110548)
This is a cleaned-up version (almost totally recreated) of https://huggingface.co/datasets/KBlueLeaf/danbooru2023-sqlite

The previous sqlite database had major defects, especially mismatched tag ids, which caused the stored data to differ from the server.

Note that minor information, such as uploader id, is not fixed.

Most of the discrepancy has been detected in ids 5139963-6859952.

Additional scripts and example files that can be used for modifying/adding posts are provided for maintenance.

# view-dataset.py lets you get the information of a post:

The example post which contained wrong information can be viewed like this:
post : https://danbooru.donmai.us/posts/6719955
```
Post 6719955
tag_list_general: 1girl(0), long_hair(24), ribbon(36), simple_background(42), smile(43), solo(44), ahoge(71), open_mouth(90), blonde_hair(116), :d(147), red_eyes(193), blush_stickers(274), upper_body(285), neck_ribbon(376), red_ribbon(380), chibi(437), transparent_background(513), cardigan(2754), looking_afar(3998), white_cardigan(52258), sad_keanu_(meme)(136930)
tag_list_copyright: tokino_sora_(old_design)(364562), juufuutei_raden(649634)
tag_list_character: regloss_(hololive)(650035)
tag_list_meta: (175)
tag_list_artist: xt0y4x(525184)
```
However, the actual data will show you the difference:
```
{"id": 6719955, "difference": [{"tag_list_general": ["virtual_youtuber"], "tag_list_character": ["otonose_kanade"], "tag_list_artist": ["jb_jagbung"], "tag_list_copyright": ["hololive", "hololive_dev_is"]}, {"tag_list_general": ["sad_keanu_(meme)"], "tag_list_character": ["regloss_(hololive)"], "tag_list_artist": ["xt0y4x"], "tag_list_copyright": ["tokino_sora_(old_design)", "juufuutei_raden"]}]}
```
The actual tag ids are as follows:
```
virtual_youtuber(136931)
otonose_kanade(650036)
jb_jagbung(525185)
hololive(364563)
hololive_dev_is(649635)
```
There were tags added/removed from the post, but other than that, there is an actual shift in tag ids, which is not consistent across the database.
```
tag_list_character: regloss_(hololive)(650035) <-> otonose_kanade(650036)
tag_list_artist: xt0y4x(525184) <-> jb_jagbung(525185)
tag_list_copyright: tokino_sora_(old_design)(364562), juufuutei_raden(649634) <-> hololive(364563), hololive_dev_is(649635)
```
The tag virtual_youtuber is a good difference that we can add to the database too.

# Crawling code is not included.

# commit_difference.py lets you change the database's information based on **difference jsonl** files.

You can prepare a bunch of jsonl files, each containing lines of json with a post id and its difference.
The data **must** contain tags as strings, not tag ids. **The actual danbooru tag ids won't match the database; this is intended, to skip a bunch of tags which do not have actual post usages.**

# fix-tags.py just contains the code which makes tag popularity reflect the actual tag usage count.

# add_post.py includes code to add more recent post data directly into the dataset.
It contains a simple skip scheme: if a post already exists and has a non-empty tag list, it won't add the post.
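The commit_difference.py workflow described above consumes difference-jsonl files in the format shown in the card. The sketch below only reads such a file, assuming (as the example suggests) that the first element of "difference" holds the tag lists to add and the second the tag lists to remove; writing the changes back to the sqlite tables is left out because the schema is not documented here, and the function name is illustrative:
```python
import json

def read_differences(path):
    """Yield (post_id, tags_to_add, tags_to_remove) from a difference jsonl file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            # Assumed layout: difference[0] = tags to add, difference[1] = tags to
            # remove, each a dict of tag_list_* -> list of tag name strings (not ids).
            to_add, to_remove = record["difference"]
            yield record["id"], to_add, to_remove

for post_id, to_add, to_remove in read_differences("differences.jsonl"):
    print(post_id, "add:", to_add, "remove:", to_remove)
```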
AngelBottomless/danbooru-2023-sqlite-fixed-7110548
[ "task_categories:text-to-image", "task_categories:image-classification", "size_categories:100M<n<1B", "license:mit", "region:us" ]
2024-01-19T16:44:44+00:00
{"license": "mit", "size_categories": ["100M<n<1B"], "task_categories": ["text-to-image", "image-classification"], "pretty_name": "sanitized-danbooru2023-sqlite"}
2024-01-21T01:36:03+00:00
[]
[]
TAGS #task_categories-text-to-image #task_categories-image-classification #size_categories-100M<n<1B #license-mit #region-us
# SQLITE-DB for Danbooru 2023 (until 7110548) This is a cleaned-up version (almost totally recreated) of URL The previous sqlite database had major defects, especially tag ids being mismatched, which caused the stored data to differ from the server. Note that minor information, such as uploader id, is not fixed. Most of the discrepancy has been detected between ids 5139963 and 6859952. Additional scripts and example files that can be used for modifying/adding posts are provided for maintenance. # URL lets you view the information of a post: An example post that contained wrong information can be viewed like this: post : URL However, the actual data shows the difference: The actual tag ids are as follows: Some tags were added to or removed from the post, but beyond that there is an actual shift in tag ids - which is not consistent across the database. The tag virtual_youtuber is a good difference that we can add to the database as well. # crawling code is not included. # commit_difference.py lets you change the database's information based on difference jsonl files. You can prepare a batch of jsonl files, where each line is a json object containing an id and its difference. The data must contain the string form of the tags, not tag ids. The actual danbooru tag ids won't match the database; this is intentional, so that tags without any actual post usage are skipped. # URL just contains the code that updates tag popularity to reflect the actual tag usage count. # add_post.py includes code to add more recent post data directly into the dataset. It uses a simple skip schema: if a post already exists and has a non-empty tag list, the post won't be added.
[ "# SQLITE-DB for Danbooru 2023 (until 7110548)\nThis is an cleaned-up version (almost totally recreated) of URL\n\nThe previous sqlite database had major defects, especially with tag ids being mismatched, which was causing data to be actually different from server.\n\nNote that minor information, such as uploader id, are not fixed.\n\nMost of the discrepancy has been detected from id 5139963-6859952.\n\n\n\nThe additional scripts and example files that can be used for modify/ adding post are provided for maintenance.", "# URL offers you to get the information of post:\n\nThe example post which contained wrong information can be viewed like this:\npost : URL\n\n\nHowever, the actual data will show you the difference:\n\n\nThe actual tags ids are followed:\n\n\nThere were tags added / removed from post, but other than that, there is actual shift on tags - which is not consistent over database.\n\n\n\nThe tag virtual_youtuber is the good difference that we can add to database too.", "# crawling code is not included.", "# commit_difference.py offers you to change the database's information based on difference jsonl files.\n\nYou can prepare bunch of jsonl files, which contains the line of json which contains id-difference.\nThe data must contain string form of data, not tag id. The actual danbooru tag id won't match with the database, it is intended to skip bunch of tags which does not have actual post usages.", "# URL just contains the code which will reflect the actual tag usage count, to tag popularity.", "# add_post.py includes code to add more recent post data directly into dataset.\nIt contains simple skip schema, if post already exists and it has non-empty tag list, it won't add the post." ]
[ "TAGS\n#task_categories-text-to-image #task_categories-image-classification #size_categories-100M<n<1B #license-mit #region-us \n", "# SQLITE-DB for Danbooru 2023 (until 7110548)\nThis is an cleaned-up version (almost totally recreated) of URL\n\nThe previous sqlite database had major defects, especially with tag ids being mismatched, which was causing data to be actually different from server.\n\nNote that minor information, such as uploader id, are not fixed.\n\nMost of the discrepancy has been detected from id 5139963-6859952.\n\n\n\nThe additional scripts and example files that can be used for modify/ adding post are provided for maintenance.", "# URL offers you to get the information of post:\n\nThe example post which contained wrong information can be viewed like this:\npost : URL\n\n\nHowever, the actual data will show you the difference:\n\n\nThe actual tags ids are followed:\n\n\nThere were tags added / removed from post, but other than that, there is actual shift on tags - which is not consistent over database.\n\n\n\nThe tag virtual_youtuber is the good difference that we can add to database too.", "# crawling code is not included.", "# commit_difference.py offers you to change the database's information based on difference jsonl files.\n\nYou can prepare bunch of jsonl files, which contains the line of json which contains id-difference.\nThe data must contain string form of data, not tag id. The actual danbooru tag id won't match with the database, it is intended to skip bunch of tags which does not have actual post usages.", "# URL just contains the code which will reflect the actual tag usage count, to tag popularity.", "# add_post.py includes code to add more recent post data directly into dataset.\nIt contains simple skip schema, if post already exists and it has non-empty tag list, it won't add the post." ]
78b347bbf024dcf42513354756523acc73de482f
# Dataset Card for Evaluation run of ConvexAI/Chop-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ConvexAI/Chop-7b](https://huggingface.co/ConvexAI/Chop-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ConvexAI__Chop-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T16:46:46.099414](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Chop-7b/blob/main/results_2024-01-19T16-46-46.099414.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6189434455748648, "acc_stderr": 0.03297263847569279, "acc_norm": 0.6242517385876073, "acc_norm_stderr": 0.03363742978661649, "mc1": 0.4528763769889841, "mc1_stderr": 0.01742558984831402, "mc2": 0.6218778140523901, "mc2_stderr": 0.01525213415924314 }, "harness|arc:challenge|25": { "acc": 0.5887372013651877, "acc_stderr": 0.014379441068522084, "acc_norm": 0.6373720136518771, "acc_norm_stderr": 0.014049106564955012 }, "harness|hellaswag|10": { "acc": 0.6385182234614618, "acc_stderr": 0.00479447842638261, "acc_norm": 0.8304122684724159, "acc_norm_stderr": 0.0037450326672282806 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6578947368421053, "acc_stderr": 0.03860731599316092, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.03860731599316092 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 
0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.046774730044911984, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.046774730044911984 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.02537952491077839, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.02537952491077839 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7129032258064516, "acc_stderr": 0.02573654274559453, "acc_norm": 0.7129032258064516, "acc_norm_stderr": 0.02573654274559453 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.035107665979592154, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072388, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072388 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5846153846153846, "acc_stderr": 0.02498535492310234, "acc_norm": 0.5846153846153846, "acc_norm_stderr": 0.02498535492310234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131147, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131147 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.017266742087630793, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.017266742087630793 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6053811659192825, "acc_stderr": 0.03280400504755291, "acc_norm": 0.6053811659192825, "acc_norm_stderr": 0.03280400504755291 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.04039314978724561, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.04039314978724561 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909476, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909476 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.02220930907316561, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.02220930907316561 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.776500638569604, "acc_stderr": 0.01489723522945071, "acc_norm": 0.776500638569604, "acc_norm_stderr": 0.01489723522945071 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6965317919075145, "acc_stderr": 0.024752411960917205, "acc_norm": 0.6965317919075145, "acc_norm_stderr": 0.024752411960917205 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4212290502793296, "acc_stderr": 0.01651367603117959, "acc_norm": 0.4212290502793296, "acc_norm_stderr": 0.01651367603117959 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818737, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818737 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.02623696588115326, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.02623696588115326 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.025630824975621348, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.025630824975621348 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.029680105565029036, "acc_norm": 
0.450354609929078, "acc_norm_stderr": 0.029680105565029036 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4335071707953064, "acc_stderr": 0.01265681038398396, "acc_norm": 0.4335071707953064, "acc_norm_stderr": 0.01265681038398396 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.02934980313976587, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.02934980313976587 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6143790849673203, "acc_stderr": 0.019691459052354022, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.019691459052354022 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.4528763769889841, "mc1_stderr": 0.01742558984831402, "mc2": 0.6218778140523901, "mc2_stderr": 0.01525213415924314 }, "harness|winogrande|5": { "acc": 0.7679558011049724, "acc_stderr": 0.01186414969182794 }, "harness|gsm8k|5": { "acc": 0.39727065959059893, "acc_stderr": 0.013478659652337794 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_ConvexAI__Chop-7b
[ "region:us" ]
2024-01-19T16:49:01+00:00
{"pretty_name": "Evaluation run of ConvexAI/Chop-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Chop-7b](https://huggingface.co/ConvexAI/Chop-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Chop-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T16:46:46.099414](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Chop-7b/blob/main/results_2024-01-19T16-46-46.099414.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6189434455748648,\n \"acc_stderr\": 0.03297263847569279,\n \"acc_norm\": 0.6242517385876073,\n \"acc_norm_stderr\": 0.03363742978661649,\n \"mc1\": 0.4528763769889841,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6218778140523901,\n \"mc2_stderr\": 0.01525213415924314\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955012\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n \"acc_stderr\": 0.00479447842638261,\n \"acc_norm\": 0.8304122684724159,\n \"acc_norm_stderr\": 0.0037450326672282806\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 
0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n \"acc_stderr\": 0.02573654274559453,\n \"acc_norm\": 0.7129032258064516,\n \"acc_norm_stderr\": 0.02573654274559453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 
0.02498535492310234,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630793,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630793\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 
0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.01651367603117959,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.01651367603117959\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621348,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621348\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.01265681038398396,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.01265681038398396\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6218778140523901,\n \"mc2_stderr\": 0.01525213415924314\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.01186414969182794\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39727065959059893,\n \"acc_stderr\": 0.013478659652337794\n }\n}\n```", "repo_url": "https://huggingface.co/ConvexAI/Chop-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|arc:challenge|25_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|gsm8k|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hellaswag|10_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-46-46.099414.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-46-46.099414.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-46-46.099414.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T16-46-46.099414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-46-46.099414.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-46-46.099414.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["**/details_harness|winogrande|5_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T16-46-46.099414.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T16_46_46.099414", "path": ["results_2024-01-19T16-46-46.099414.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T16-46-46.099414.parquet"]}]}]}
2024-01-19T16:49:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ConvexAI/Chop-7b Dataset automatically created during the evaluation run of model ConvexAI/Chop-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T16:46:46.099414 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of ConvexAI/Chop-7b\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Chop-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T16:46:46.099414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ConvexAI/Chop-7b\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Chop-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T16:46:46.099414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
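The Chop-7b card above says "To load the details from a run, you can for instance do the following:" but the flattened card text omits the accompanying snippet. A minimal sketch of that load, assuming the details repository follows the leaderboard's usual "details_<org>__<model>" naming (so open-llm-leaderboard/details_ConvexAI__Chop-7b) and using the harness_winogrande_5 config and "latest" split that this record's metadata lists:

```python
from datasets import load_dataset

# Assumed repo id, inferred from the leaderboard's "details_<org>__<model>"
# naming convention for ConvexAI/Chop-7b; treat it as an assumption.
repo_id = "open-llm-leaderboard/details_ConvexAI__Chop-7b"

# "harness_winogrande_5" is one of the 63 configs listed in this record's
# metadata; its "latest" split points at the most recent evaluation run.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details)
```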
e591c6914ac911025e03425bb55f621c7de8865b
# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [WhiteRabbitNeo/WhiteRabbitNeo-13B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T16:51:00.125160](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1/blob/main/results_2024-01-19T16-51-00.125160.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4325743019051002, "acc_stderr": 0.03450564854492944, "acc_norm": 0.4356434201033021, "acc_norm_stderr": 0.03525272782306864, "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.44577231939553535, "mc2_stderr": 0.014884190006288057 }, "harness|arc:challenge|25": { "acc": 0.4462457337883959, "acc_stderr": 0.014526705548539982, "acc_norm": 0.4854948805460751, "acc_norm_stderr": 0.014605241081370056 }, "harness|hellaswag|10": { "acc": 0.5126468830910177, "acc_stderr": 0.0049881849883452855, "acc_norm": 0.6870145389364668, "acc_norm_stderr": 0.004627607991626908 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909281, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.35555555555555557, "acc_stderr": 0.04135176749720386, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.04135176749720386 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.42105263157894735, "acc_stderr": 0.040179012759817494, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.040179012759817494 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4377358490566038, "acc_stderr": 0.03053333843046751, "acc_norm": 0.4377358490566038, "acc_norm_stderr": 0.03053333843046751 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3958333333333333, "acc_stderr": 0.04089465449325582, "acc_norm": 0.3958333333333333, "acc_norm_stderr": 0.04089465449325582 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, 
"acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3815028901734104, "acc_stderr": 0.037038511930995194, "acc_norm": 0.3815028901734104, "acc_norm_stderr": 0.037038511930995194 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364395, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364395 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3829787234042553, "acc_stderr": 0.03177821250236922, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.03177821250236922 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.043036840335373146, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.043036840335373146 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.02357760479165581, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.02357760479165581 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3774193548387097, "acc_stderr": 0.027575960723278236, "acc_norm": 0.3774193548387097, "acc_norm_stderr": 0.027575960723278236 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3399014778325123, "acc_stderr": 0.033327690684107895, "acc_norm": 0.3399014778325123, "acc_norm_stderr": 0.033327690684107895 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5818181818181818, "acc_stderr": 0.03851716319398395, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.03851716319398395 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5454545454545454, "acc_stderr": 0.035476014940069384, "acc_norm": 0.5454545454545454, "acc_norm_stderr": 0.035476014940069384 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5440414507772021, "acc_stderr": 0.03594413711272437, "acc_norm": 0.5440414507772021, "acc_norm_stderr": 0.03594413711272437 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3435897435897436, "acc_stderr": 0.024078696580635474, "acc_norm": 0.3435897435897436, "acc_norm_stderr": 0.024078696580635474 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3949579831932773, "acc_stderr": 0.031753678460966245, "acc_norm": 0.3949579831932773, "acc_norm_stderr": 0.031753678460966245 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.544954128440367, "acc_stderr": 0.02135050309092517, "acc_norm": 0.544954128440367, "acc_norm_stderr": 0.02135050309092517 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.032568505702936464, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.032568505702936464 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6127450980392157, "acc_stderr": 0.03418931233833343, "acc_norm": 0.6127450980392157, "acc_norm_stderr": 0.03418931233833343 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6329113924050633, "acc_stderr": 0.031376240725616185, "acc_norm": 0.6329113924050633, "acc_norm_stderr": 0.031376240725616185 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4618834080717489, "acc_stderr": 0.03346015011973228, "acc_norm": 0.4618834080717489, "acc_norm_stderr": 0.03346015011973228 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3969465648854962, "acc_stderr": 0.04291135671009224, "acc_norm": 0.3969465648854962, "acc_norm_stderr": 0.04291135671009224 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.043457245702925335, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.043457245702925335 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5092592592592593, "acc_stderr": 0.04832853553437055, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.04832853553437055 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.49693251533742333, "acc_stderr": 0.03928297078179663, "acc_norm": 0.49693251533742333, "acc_norm_stderr": 0.03928297078179663 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04287858751340455, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04287858751340455 }, "harness|hendrycksTest-management|5": { "acc": 0.5048543689320388, "acc_stderr": 0.049505043821289195, "acc_norm": 0.5048543689320388, "acc_norm_stderr": 0.049505043821289195 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6752136752136753, "acc_stderr": 0.03067902276549883, "acc_norm": 0.6752136752136753, "acc_norm_stderr": 0.03067902276549883 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.5466155810983397, "acc_stderr": 0.0178020871358503, "acc_norm": 0.5466155810983397, "acc_norm_stderr": 0.0178020871358503 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.4624277456647399, "acc_stderr": 0.026842985519615375, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.026842985519615375 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2759776536312849, "acc_stderr": 0.014950103002475353, "acc_norm": 0.2759776536312849, "acc_norm_stderr": 0.014950103002475353 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.4215686274509804, "acc_stderr": 0.028275490156791434, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.028275490156791434 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.4662379421221865, "acc_stderr": 0.028333277109562783, "acc_norm": 0.4662379421221865, "acc_norm_stderr": 0.028333277109562783 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.48148148148148145, "acc_stderr": 0.027801656212323674, 
"acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.027801656212323674 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3285528031290743, "acc_stderr": 0.01199602724750291, "acc_norm": 0.3285528031290743, "acc_norm_stderr": 0.01199602724750291 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3014705882352941, "acc_stderr": 0.027875982114273168, "acc_norm": 0.3014705882352941, "acc_norm_stderr": 0.027875982114273168 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.39215686274509803, "acc_stderr": 0.019751726508762626, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.019751726508762626 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4909090909090909, "acc_stderr": 0.04788339768702861, "acc_norm": 0.4909090909090909, "acc_norm_stderr": 0.04788339768702861 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5469387755102041, "acc_stderr": 0.03186785930004129, "acc_norm": 0.5469387755102041, "acc_norm_stderr": 0.03186785930004129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.48756218905472637, "acc_stderr": 0.0353443984853958, "acc_norm": 0.48756218905472637, "acc_norm_stderr": 0.0353443984853958 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-virology|5": { "acc": 0.3855421686746988, "acc_stderr": 0.037891344246115496, "acc_norm": 0.3855421686746988, "acc_norm_stderr": 0.037891344246115496 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.5263157894736842, "acc_stderr": 0.03829509868994727, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.03829509868994727 }, "harness|truthfulqa:mc|0": { "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.44577231939553535, "mc2_stderr": 0.014884190006288057 }, "harness|winogrande|5": { "acc": 0.6740331491712708, "acc_stderr": 0.013173782636922187 }, "harness|gsm8k|5": { "acc": 0.22365428354814254, "acc_stderr": 0.011477795578836105 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1
[ "region:us" ]
2024-01-19T16:53:20+00:00
{"pretty_name": "Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [WhiteRabbitNeo/WhiteRabbitNeo-13B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T16:51:00.125160](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1/blob/main/results_2024-01-19T16-51-00.125160.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4325743019051002,\n \"acc_stderr\": 0.03450564854492944,\n \"acc_norm\": 0.4356434201033021,\n \"acc_norm_stderr\": 0.03525272782306864,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4462457337883959,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.4854948805460751,\n \"acc_norm_stderr\": 0.014605241081370056\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5126468830910177,\n \"acc_stderr\": 0.0049881849883452855,\n \"acc_norm\": 0.6870145389364668,\n \"acc_norm_stderr\": 0.004627607991626908\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n \"acc_stderr\": 0.037038511930995194,\n \"acc_norm\": 0.3815028901734104,\n \"acc_norm_stderr\": 0.037038511930995194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3774193548387097,\n \"acc_stderr\": 0.027575960723278236,\n \"acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.027575960723278236\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.035476014940069384,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.035476014940069384\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n \"acc_norm\": 0.5440414507772021,\n 
\"acc_norm_stderr\": 0.03594413711272437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.544954128440367,\n \"acc_stderr\": 0.02135050309092517,\n \"acc_norm\": 0.544954128440367,\n \"acc_norm_stderr\": 0.02135050309092517\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833343,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833343\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n \"acc_stderr\": 0.0178020871358503,\n \"acc_norm\": 0.5466155810983397,\n \"acc_norm_stderr\": 0.0178020871358503\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n \"acc_stderr\": 0.014950103002475353,\n \"acc_norm\": 0.2759776536312849,\n \"acc_norm_stderr\": 0.014950103002475353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.028275490156791434,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.028275490156791434\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n \"acc_stderr\": 0.028333277109562783,\n \"acc_norm\": 0.4662379421221865,\n \"acc_norm_stderr\": 0.028333277109562783\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323674,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323674\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3285528031290743,\n \"acc_stderr\": 0.01199602724750291,\n \"acc_norm\": 0.3285528031290743,\n \"acc_norm_stderr\": 0.01199602724750291\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.019751726508762626,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.019751726508762626\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004129,\n \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.48756218905472637,\n \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.03829509868994727,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.03829509868994727\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6740331491712708,\n \"acc_stderr\": 0.013173782636922187\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.22365428354814254,\n \"acc_stderr\": 0.011477795578836105\n }\n}\n```", "repo_url": "https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|arc:challenge|25_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|gsm8k|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hellaswag|10_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["**/details_harness|winogrande|5_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-19T16-51-00.125160.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T16_51_00.125160", "path": ["results_2024-01-19T16-51-00.125160.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T16-51-00.125160.parquet"]}]}]}
2024-01-19T16:53:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1 Dataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-13B-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T16:51:00.125160 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
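The loading snippet referenced just above ("To load the details from a run...") is not reproduced in this plain-text rendering. A minimal sketch, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation-run cards in this dump (the repo id and configuration name below are therefore assumptions):

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the naming pattern of the other evaluation-run datasets.
data = load_dataset(
    "open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1",
    "harness_winogrande_5",  # any of the 63 configurations listed in the metadata above
    split="train",           # "train" always points at the latest results
)
print(data)
```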
[ "# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1\n\n\n\nDataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T16:51:00.125160(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1\n\n\n\nDataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T16:51:00.125160(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7288a2e046ed6313497d5e8fc94d33c6c7bf8075
# Dataset Card for "math_23k_train_numeric" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/math_23k_train_numeric
[ "region:us" ]
2024-01-19T17:14:41+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20552640.641394995, "num_examples": 21094}], "download_size": 2752930, "dataset_size": 20552640.641394995}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T06:44:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for "math_23k_train_numeric" More Information needed
[ "# Dataset Card for \"math_23k_train_numeric\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"math_23k_train_numeric\"\n\nMore Information needed" ]
9c4692152388c198138a865af044a1b3e86e07db
polars-tpch =========== This repo contains the code used for performance evaluation of polars. The benchmarks are TPC-standardised queries and data designed to test the performance of "real" workflows. From the [TPC website](https://www.tpc.org/tpch/): > TPC-H is a decision support benchmark. It consists of a suite of business-oriented ad hoc queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance. This benchmark illustrates decision support systems that examine large volumes of data, execute queries with a high degree of complexity, and give answers to critical business questions. ## Generating TPC-H Data ### Project setup ```shell # clone this repository git clone https://github.com/pola-rs/tpch.git cd tpch/tpch-dbgen # build tpch-dbgen make ``` Notes: - On macOS, the above `make` command fails to compile with an error like the one below: ```shell bm_utils.c:71:10: fatal error: 'malloc.h' file not found #include <malloc.h> ^~~~~~~~~~ 1 error generated. make: *** [bm_utils.o] Error 1 ``` To fix this, change the include directive `#include <malloc.h>` to `#include <sys/malloc.h>` in the files where the error is reported (`bm_utils.c` and `varsub.c`) and then re-run the command `make`. ### Execute ```shell # change directory to the root of the repository cd ../ ./run.sh ``` This will do the following: - Create a new virtual environment with all required dependencies. - Generate data for benchmarks. - Run the benchmark suite.
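The macOS note above describes a manual edit of the two C files; a small Python sketch of the same include swap (assuming it is run from inside `tpch-dbgen`, where `bm_utils.c` and `varsub.c` live):

```python
from pathlib import Path

# Swap the Linux-only <malloc.h> include for <sys/malloc.h>, as described in the
# macOS note above, so that `make` can be re-run afterwards.
for name in ("bm_utils.c", "varsub.c"):
    src = Path(name)  # assumes the current working directory is tpch-dbgen
    src.write_text(src.read_text().replace("#include <malloc.h>", "#include <sys/malloc.h>"))
```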
kunishou/tpch_tables_scale_1
[ "region:us" ]
2024-01-19T17:23:20+00:00
{}
2024-01-23T18:28:51+00:00
[]
[]
TAGS #region-us
polars-tpch =========== This repo contains the code used for performance evaluation of polars. The benchmarks are TPC-standardised queries and data designed to test the performance of "real" workflows. From the TPC website: > TPC-H is a decision support benchmark. It consists of a suite of business-oriented ad hoc queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance. This benchmark illustrates decision support systems that examine large volumes of data, execute queries with a high degree of complexity, and give answers to critical business questions. ## Generating TPC-H Data ### Project setup Notes: - For MacOS, the above 'make' command will result in an error while compiling like below, To fix this, change the import statement '#include <malloc.h>' to '#include <sys/malloc.h>' in the files where error is reported ('bm_utils.c' and 'varsub.c') and then re-run the command 'make'. ### Execute This will do the following, - Create a new virtual environment with all required dependencies. - Generate data for benchmarks. - Run the benchmark suite.
[ "## Generating TPC-H Data", "### Project setup\n\n\n\nNotes:\n\n- For MacOS, the above 'make' command will result in an error while compiling like below,\n\n \n To fix this, change the import statement '#include <malloc.h>' to '#include <sys/malloc.h>' in the files where error\n is reported ('bm_utils.c' and 'varsub.c') and then re-run the command 'make'.", "### Execute\n\n\n\nThis will do the following,\n\n- Create a new virtual environment with all required dependencies.\n- Generate data for benchmarks.\n- Run the benchmark suite." ]
[ "TAGS\n#region-us \n", "## Generating TPC-H Data", "### Project setup\n\n\n\nNotes:\n\n- For MacOS, the above 'make' command will result in an error while compiling like below,\n\n \n To fix this, change the import statement '#include <malloc.h>' to '#include <sys/malloc.h>' in the files where error\n is reported ('bm_utils.c' and 'varsub.c') and then re-run the command 'make'.", "### Execute\n\n\n\nThis will do the following,\n\n- Create a new virtual environment with all required dependencies.\n- Generate data for benchmarks.\n- Run the benchmark suite." ]
52b0d0adb104e46f5bdf04ea97ea0ae6394fd04d
# Dataset Card for "cai-conversation-dev1705680551" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
HuggingFaceH4/grok-conversation-harmless2
[ "region:us" ]
2024-01-19T17:23:31+00:00
{"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 77931081, "num_examples": 21268}, {"name": "train_prefs", "num_bytes": 77863425, "num_examples": 21269}, {"name": "test_sft", "num_bytes": 4236971, "num_examples": 1156}, {"name": "test_prefs", "num_bytes": 4235042, "num_examples": 1156}], "download_size": 66850108, "dataset_size": 164266519}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]}
2024-01-19T17:23:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cai-conversation-dev1705680551" More Information needed
[ "# Dataset Card for \"cai-conversation-dev1705680551\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cai-conversation-dev1705680551\"\n\nMore Information needed" ]
e5fec64e1f0688e47699b9cf8c26fe4ed350123a
## License apache-2.0 ## About ### Source https://catalog.data.gov/dataset/consumer-complaint-database ### Data Prep - removed rows with empty values - filtered out complaints with over 100 words - filtered out products, sub-products, issues and sub-issues with fewer than 30 occurrences ## Purpose Intended for use in the HPE Generative AI "Financial Customer Classification" tutorial. - [No-Code Version (UI Only)](tbd) - [Notebooks Version](tbd)
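A rough pandas sketch of the prep steps listed above; the file name and column names (`complaint_text`, `product`, and so on) are placeholders, since the actual headers of the data.gov export are not given here:

```python
import pandas as pd

df = pd.read_csv("complaints.csv")  # placeholder file name for the data.gov export

# 1. Remove rows with empty values.
df = df.dropna()

# 2. Drop complaints with over 100 words (placeholder column name).
df = df[df["complaint_text"].str.split().str.len() <= 100]

# 3. Drop category values that occur fewer than 30 times (placeholder column names).
for col in ["product", "sub_product", "issue", "sub_issue"]:
    counts = df[col].value_counts()
    df = df[df[col].isin(counts[counts >= 30].index)]
```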
hpe-ai/customer-complaints
[ "region:us" ]
2024-01-19T17:32:51+00:00
{}
2024-01-20T02:58:57+00:00
[]
[]
TAGS #region-us
## License apache-2.0 ## About ### Source URL ### Data Prep - removed rows with empty values - filtered out complaints with over 100 words - filtered out products, sub-products, issues and sub-issues with less than 30 occurrences ## Purpose Intended for use in the HPE Generative AI "Financial Customer Classification" tutorial. - No-Code Version (UI Only) - Notebooks Version
[ "## License\n\napache-2.0", "## About", "### Source\n\nURL", "### Data Prep\n\n- removed rows with empty values\n- filtered out complaints with over 100 words\n- filtered out products, sub-products, issues and sub-issues with less than 30 occurrences", "## Purpose\n\nIntended for use in the HPE Generative AI \"Financial Customer Classification\" tutorial.\n\n- No-Code Version (UI Only)\n- Notebooks Version" ]
[ "TAGS\n#region-us \n", "## License\n\napache-2.0", "## About", "### Source\n\nURL", "### Data Prep\n\n- removed rows with empty values\n- filtered out complaints with over 100 words\n- filtered out products, sub-products, issues and sub-issues with less than 30 occurrences", "## Purpose\n\nIntended for use in the HPE Generative AI \"Financial Customer Classification\" tutorial.\n\n- No-Code Version (UI Only)\n- Notebooks Version" ]
92d2dcdaa14ab258f5b10167f7d9ab4b6bf5da54
# Dataset Card for "rap_phase2_19jan_22i_v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
am-infoweb/rap_phase2_19jan_22i_v1
[ "region:us" ]
2024-01-19T17:42:46+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 121333727.25, "num_examples": 107976}, {"name": "test", "num_bytes": 40444575.75, "num_examples": 35992}], "download_size": 91609370, "dataset_size": 161778303.0}}
2024-01-19T17:43:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for "rap_phase2_19jan_22i_v1" More Information needed
[ "# Dataset Card for \"rap_phase2_19jan_22i_v1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"rap_phase2_19jan_22i_v1\"\n\nMore Information needed" ]
cf1b52c64cc0b9072401d340ee47a19d228c4d22
# Dataset Card for "wikiner_fr_mixed_caps" This is an update on the dataset [Jean-Baptiste/wikiner_fr](https://huggingface.co/datasets/Jean-Baptiste/wikiner_fr) with: - removal of duplicated examples and leakage - random de-capitalization of words (20%) You can see the code to create the changes in the script `update_dataset.py` in the repository. Dataset Description (reproduced from original repo): - **Homepage:** https://metatext.io/datasets/wikiner - **Repository:** - **Paper:** https://www.sciencedirect.com/science/article/pii/S0004370212000276?via%3Dihub - **Leaderboard:** - **Point of Contact:**
Alizee/wikiner_fr_mixed_caps
[ "task_categories:token-classification", "size_categories:100K<n<1M", "language:fr", "region:us" ]
2024-01-19T17:43:58+00:00
{"language": ["fr"], "size_categories": ["100K<n<1M"], "task_categories": ["token-classification"], "pretty_name": "wikiner_fr", "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "O", "1": "LOC", "2": "PER", "3": "MISC", "4": "ORG"}}}}], "splits": [{"name": "train", "num_bytes": 54139057, "num_examples": 120060}, {"name": "test", "num_bytes": 5952227, "num_examples": 13393}], "download_size": 15572314, "dataset_size": 60091284}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-19T18:15:36+00:00
[]
[ "fr" ]
TAGS #task_categories-token-classification #size_categories-100K<n<1M #language-French #region-us
# Dataset Card for "wikiner_fr_mixed_caps" This is an update on the dataset Jean-Baptiste/wikiner_fr with: - removal of duplicated examples and leakage - random de-capitalization of words (20%) You can see the code to create the changes in the script 'update_dataset.py' in the repository. Dataset Description (reproduced from original repo): - Homepage: URL - Repository: - Paper: URL - Leaderboard: - Point of Contact:
[ "# Dataset Card for \"wikiner_fr_mixed_caps\"\n\nThis is an update on the dataset Jean-Baptiste/wikiner_fr with:\n - removal of duplicated examples and leakage\n - random de-capitalization of words (20%)\n\nYou can see the code to create the changes in the script 'update_dataset.py' in the repository.\n\nDataset Description (reproduced from original repo):\n\n- Homepage: URL\n- Repository: \n- Paper: URL\n- Leaderboard:\n- Point of Contact:" ]
[ "TAGS\n#task_categories-token-classification #size_categories-100K<n<1M #language-French #region-us \n", "# Dataset Card for \"wikiner_fr_mixed_caps\"\n\nThis is an update on the dataset Jean-Baptiste/wikiner_fr with:\n - removal of duplicated examples and leakage\n - random de-capitalization of words (20%)\n\nYou can see the code to create the changes in the script 'update_dataset.py' in the repository.\n\nDataset Description (reproduced from original repo):\n\n- Homepage: URL\n- Repository: \n- Paper: URL\n- Leaderboard:\n- Point of Contact:" ]
7c30c55b836a25c497a532e855f35917b7fbfb41
# Dataset Card for Evaluation run of FelixChao/Voldemort-10B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [FelixChao/Voldemort-10B](https://huggingface.co/FelixChao/Voldemort-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FelixChao__Voldemort-10B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T18:00:16.560627](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Voldemort-10B/blob/main/results_2024-01-19T18-00-16.560627.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6311660068530696, "acc_stderr": 0.03247623450462309, "acc_norm": 0.6326331571048615, "acc_norm_stderr": 0.03312761818003797, "mc1": 0.41615667074663404, "mc1_stderr": 0.017255657502903043, "mc2": 0.5992144173223708, "mc2_stderr": 0.015636909190356544 }, "harness|arc:challenge|25": { "acc": 0.6296928327645052, "acc_stderr": 0.01411129875167495, "acc_norm": 0.64419795221843, "acc_norm_stderr": 0.013990571137918762 }, "harness|hellaswag|10": { "acc": 0.6627165903206532, "acc_stderr": 0.004718162860083519, "acc_norm": 0.8424616610237005, "acc_norm_stderr": 0.0036356303524759065 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.037150621549989056, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.037150621549989056 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.660377358490566, "acc_stderr": 0.029146904747798328, "acc_norm": 0.660377358490566, "acc_norm_stderr": 0.029146904747798328 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895536, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895536 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5446808510638298, "acc_stderr": 0.03255525359340355, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.03255525359340355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778394, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778394 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511656986, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198892, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198892 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.023814477086593542, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.023814477086593542 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6461538461538462, "acc_stderr": 0.024243783994062157, "acc_norm": 0.6461538461538462, "acc_norm_stderr": 0.024243783994062157 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524593, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886786, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 
0.036848815213890225, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.036848815213890225 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200144, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200144 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.031493846709941306, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.031493846709941306 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368985, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368985 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.024547617794803828, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.024547617794803828 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3687150837988827, "acc_stderr": 0.016135759015030116, "acc_norm": 0.3687150837988827, "acc_norm_stderr": 0.016135759015030116 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.02548311560119546, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.02548311560119546 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.02973659252642444, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4406779661016949, "acc_stderr": 0.012680037994097077, "acc_norm": 0.4406779661016949, "acc_norm_stderr": 0.012680037994097077 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.02850145286039655, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039655 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6486928104575164, "acc_stderr": 0.019312676065786554, "acc_norm": 0.6486928104575164, "acc_norm_stderr": 0.019312676065786554 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.028920583220675592, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.028920583220675592 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.41615667074663404, "mc1_stderr": 0.017255657502903043, "mc2": 0.5992144173223708, "mc2_stderr": 0.015636909190356544 }, "harness|winogrande|5": { "acc": 0.7703235990528808, "acc_stderr": 0.011821645601838229 }, "harness|gsm8k|5": { "acc": 0.599696739954511, "acc_stderr": 0.01349592643656644 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_FelixChao__Voldemort-10B
[ "region:us" ]
2024-01-19T18:02:32+00:00
{"pretty_name": "Evaluation run of FelixChao/Voldemort-10B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Voldemort-10B](https://huggingface.co/FelixChao/Voldemort-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Voldemort-10B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T18:00:16.560627](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Voldemort-10B/blob/main/results_2024-01-19T18-00-16.560627.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6311660068530696,\n \"acc_stderr\": 0.03247623450462309,\n \"acc_norm\": 0.6326331571048615,\n \"acc_norm_stderr\": 0.03312761818003797,\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5992144173223708,\n \"mc2_stderr\": 0.015636909190356544\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6296928327645052,\n \"acc_stderr\": 0.01411129875167495,\n \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.013990571137918762\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6627165903206532,\n \"acc_stderr\": 0.004718162860083519,\n \"acc_norm\": 0.8424616610237005,\n \"acc_norm_stderr\": 0.0036356303524759065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798328,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798328\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524593,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n \"acc_stderr\": 0.016135759015030116,\n \"acc_norm\": 0.3687150837988827,\n \"acc_norm_stderr\": 0.016135759015030116\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119546,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119546\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097077,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786554,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675592,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675592\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5992144173223708,\n \"mc2_stderr\": 0.015636909190356544\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.599696739954511,\n \"acc_stderr\": 0.01349592643656644\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Voldemort-10B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|arc:challenge|25_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|gsm8k|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hellaswag|10_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-00-16.560627.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-00-16.560627.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-00-16.560627.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T18-00-16.560627.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-00-16.560627.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T18_00_16.560627", "path": ["**/details_harness|winogrande|5_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T18-00-16.560627.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T18_00_16.560627", "path": ["results_2024-01-19T18-00-16.560627.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T18-00-16.560627.parquet"]}]}]}
2024-01-19T18:02:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/Voldemort-10B Dataset automatically created during the evaluation run of model FelixChao/Voldemort-10B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T18:00:16.560627 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
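A minimal sketch of the loading step described above, assuming the Hugging Face `datasets` library is installed; the configuration names (`harness_winogrande_5`, `results`) and the `latest` split are taken from this record's metadata, not invented here:

```python
# Sketch only: load per-task details and the aggregated "results" configuration
# for the FelixChao/Voldemort-10B evaluation run.
from datasets import load_dataset

# Per-task details, e.g. the Winogrande 5-shot run; per the card,
# the "train" split always points to the latest results.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_FelixChao__Voldemort-10B",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics for the whole run; the "latest" split mirrors
# the newest timestamped split listed in the metadata.
aggregated = load_dataset(
    "open-llm-leaderboard/details_FelixChao__Voldemort-10B",
    "results",
    split="latest",
)

print(winogrande_details)
print(aggregated[0])
```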
[ "# Dataset Card for Evaluation run of FelixChao/Voldemort-10B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Voldemort-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T18:00:16.560627(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/Voldemort-10B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Voldemort-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T18:00:16.560627(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a1e0cd1392036b00b7600c3221f0ab86cf2f365a
# Dataset Card for Evaluation run of FelixChao/Magician-MoE-4x7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [FelixChao/Magician-MoE-4x7B](https://huggingface.co/FelixChao/Magician-MoE-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T18:31:40.054595](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B/blob/main/results_2024-01-19T18-31-40.054595.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2469906164466485, "acc_stderr": 0.030560107857506777, "acc_norm": 0.2482166284070575, "acc_norm_stderr": 0.03137507664015106, "mc1": 0.28886168910648713, "mc1_stderr": 0.01586634640138431, "mc2": NaN, "mc2_stderr": NaN }, "harness|arc:challenge|25": { "acc": 0.22696245733788395, "acc_stderr": 0.012240491536132861, "acc_norm": 0.28242320819112626, "acc_norm_stderr": 0.01315545688409722 }, "harness|hellaswag|10": { "acc": 0.2789285002987453, "acc_stderr": 0.004475557360359701, "acc_norm": 0.300637323242382, "acc_norm_stderr": 0.00457598076392358 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2814814814814815, "acc_stderr": 0.03885004245800253, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.03885004245800253 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2631578947368421, "acc_stderr": 0.03583496176361063, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.03583496176361063 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2641509433962264, "acc_stderr": 0.027134291628741713, "acc_norm": 0.2641509433962264, "acc_norm_stderr": 0.027134291628741713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.22916666666666666, "acc_stderr": 0.03514697467862388, "acc_norm": 0.22916666666666666, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.16, "acc_stderr": 0.03684529491774709, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.19, "acc_stderr": 0.03942772444036624, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036624 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768077, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173042, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173042 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006718, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006718 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2765957446808511, "acc_stderr": 0.02924188386962883, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.02924188386962883 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03947152782669415, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03947152782669415 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.022418042891113942, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.022418042891113942 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.19047619047619047, "acc_stderr": 0.03512207412302052, "acc_norm": 0.19047619047619047, "acc_norm_stderr": 0.03512207412302052 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.24193548387096775, "acc_stderr": 0.024362599693031096, "acc_norm": 0.24193548387096775, "acc_norm_stderr": 0.024362599693031096 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2315270935960591, "acc_stderr": 0.029678333141444444, "acc_norm": 0.2315270935960591, "acc_norm_stderr": 0.029678333141444444 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.26666666666666666, "acc_stderr": 0.03453131801885415, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.23737373737373738, "acc_stderr": 0.030313710538198913, "acc_norm": 0.23737373737373738, "acc_norm_stderr": 0.030313710538198913 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.23316062176165803, "acc_stderr": 0.03051611137147602, "acc_norm": 0.23316062176165803, "acc_norm_stderr": 0.03051611137147602 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.22564102564102564, "acc_stderr": 0.021193632525148543, "acc_norm": 0.22564102564102564, "acc_norm_stderr": 0.021193632525148543 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.026335739404055803, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.026335739404055803 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341933, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341933 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.18543046357615894, 
"acc_stderr": 0.03173284384294286, "acc_norm": 0.18543046357615894, "acc_norm_stderr": 0.03173284384294286 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21651376146788992, "acc_stderr": 0.017658710594443128, "acc_norm": 0.21651376146788992, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.28703703703703703, "acc_stderr": 0.030851992993257017, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.030851992993257017 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.030190282453501943, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.030190282453501943 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.27848101265822783, "acc_stderr": 0.029178682304842548, "acc_norm": 0.27848101265822783, "acc_norm_stderr": 0.029178682304842548 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.27802690582959644, "acc_stderr": 0.030069584874494026, "acc_norm": 0.27802690582959644, "acc_norm_stderr": 0.030069584874494026 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2366412213740458, "acc_stderr": 0.0372767357559692, "acc_norm": 0.2366412213740458, "acc_norm_stderr": 0.0372767357559692 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.038968789850704164, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.038968789850704164 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04557239513497751, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2392638036809816, "acc_stderr": 0.033519538795212696, "acc_norm": 0.2392638036809816, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.17857142857142858, "acc_stderr": 0.036352091215778065, "acc_norm": 0.17857142857142858, "acc_norm_stderr": 0.036352091215778065 }, "harness|hendrycksTest-management|5": { "acc": 0.30097087378640774, "acc_stderr": 0.04541609446503946, "acc_norm": 0.30097087378640774, "acc_norm_stderr": 0.04541609446503946 }, "harness|hendrycksTest-marketing|5": { "acc": 0.23076923076923078, "acc_stderr": 0.027601921381417614, "acc_norm": 0.23076923076923078, "acc_norm_stderr": 0.027601921381417614 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.25287356321839083, "acc_stderr": 0.015543377313719681, "acc_norm": 0.25287356321839083, "acc_norm_stderr": 0.015543377313719681 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2398843930635838, "acc_stderr": 0.02298959254312357, "acc_norm": 0.2398843930635838, "acc_norm_stderr": 0.02298959254312357 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24836601307189543, "acc_stderr": 0.024739981355113596, "acc_norm": 0.24836601307189543, "acc_norm_stderr": 0.024739981355113596 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2765273311897106, "acc_stderr": 0.02540383297817962, "acc_norm": 0.2765273311897106, "acc_norm_stderr": 0.02540383297817962 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.25, "acc_stderr": 0.02409347123262133, "acc_norm": 0.25, "acc_norm_stderr": 0.02409347123262133 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.26595744680851063, "acc_stderr": 0.026358065698880596, "acc_norm": 0.26595744680851063, "acc_norm_stderr": 0.026358065698880596 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23989569752281617, "acc_stderr": 0.010906282617981634, "acc_norm": 0.23989569752281617, "acc_norm_stderr": 0.010906282617981634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.19852941176470587, "acc_stderr": 0.024231013370541104, "acc_norm": 0.19852941176470587, "acc_norm_stderr": 0.024231013370541104 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2369281045751634, "acc_stderr": 0.017201662169789796, "acc_norm": 0.2369281045751634, "acc_norm_stderr": 0.017201662169789796 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04265792110940588, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04265792110940588 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.1673469387755102, "acc_stderr": 0.023897144768914524, "acc_norm": 0.1673469387755102, "acc_norm_stderr": 0.023897144768914524 }, "harness|hendrycksTest-sociology|5": { "acc": 0.22885572139303484, "acc_stderr": 0.029705284056772426, "acc_norm": 0.22885572139303484, "acc_norm_stderr": 0.029705284056772426 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.18, "acc_stderr": 0.03861229196653695, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653695 }, "harness|hendrycksTest-virology|5": { "acc": 0.2710843373493976, "acc_stderr": 0.03460579907553026, "acc_norm": 0.2710843373493976, "acc_norm_stderr": 0.03460579907553026 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21637426900584794, "acc_stderr": 0.03158149539338734, "acc_norm": 0.21637426900584794, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.28886168910648713, "mc1_stderr": 0.01586634640138431, "mc2": NaN, "mc2_stderr": NaN }, "harness|winogrande|5": { "acc": 0.4988161010260458, "acc_stderr": 0.014052446290529015 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
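As a usage sketch (assuming a recent version of the `datasets` library; the configuration and split names are those listed in this card's metadata), you can list the available configurations and read the aggregated metrics from the "results" configuration, whose "latest" split points at the most recent run:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B"

# List the available configurations (the per-task detail configs plus the
# aggregated "results" configuration).
print(get_dataset_config_names(repo))

# The "latest" split of the "results" configuration holds the aggregated
# metrics of the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```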
open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B
[ "region:us" ]
2024-01-19T18:34:12+00:00
{"pretty_name": "Evaluation run of FelixChao/Magician-MoE-4x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Magician-MoE-4x7B](https://huggingface.co/FelixChao/Magician-MoE-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T18:31:40.054595](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Magician-MoE-4x7B/blob/main/results_2024-01-19T18-31-40.054595.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2469906164466485,\n \"acc_stderr\": 0.030560107857506777,\n \"acc_norm\": 0.2482166284070575,\n \"acc_norm_stderr\": 0.03137507664015106,\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.28242320819112626,\n \"acc_norm_stderr\": 0.01315545688409722\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2789285002987453,\n \"acc_stderr\": 0.004475557360359701,\n \"acc_norm\": 0.300637323242382,\n \"acc_norm_stderr\": 0.00457598076392358\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.03885004245800253,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.03885004245800253\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361063,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361063\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741713,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n 
\"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962883,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962883\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444444,\n \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444444\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198913,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198913\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147602,\n \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147602\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148543,\n \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148543\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294286,\n \"acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294286\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21651376146788992,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501943,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501943\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n \"acc_stderr\": 0.030069584874494026,\n \"acc_norm\": 0.27802690582959644,\n \"acc_norm_stderr\": 0.030069584874494026\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503946,\n \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.027601921381417614,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.027601921381417614\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n 
\"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113596,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113596\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.2765273311897106,\n \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.010906282617981634,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.010906282617981634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789796,\n \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789796\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529015\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Magician-MoE-4x7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|arc:challenge|25_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|gsm8k|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hellaswag|10_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-31-40.054595.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-31-40.054595.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-31-40.054595.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T18-31-40.054595.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-31-40.054595.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T18-31-40.054595.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["**/details_harness|winogrande|5_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T18-31-40.054595.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T18_31_40.054595", "path": ["results_2024-01-19T18-31-40.054595.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T18-31-40.054595.parquet"]}]}]}
2024-01-19T18:34:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/Magician-MoE-4x7B Dataset automatically created during the evaluation run of model FelixChao/Magician-MoE-4x7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T18:31:40.054595 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of FelixChao/Magician-MoE-4x7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Magician-MoE-4x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T18:31:40.054595(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/Magician-MoE-4x7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Magician-MoE-4x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T18:31:40.054595(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
cc9e1df7eca48a81370aa1e5e2ec042eb01fe847
Data from real humans, courtesy of https://reddit.com/r/WritingPrompts
euclaise/WritingPrompts_curated
[ "license:mit", "region:us" ]
2024-01-19T18:42:40+00:00
{"license": "mit", "dataset_info": {"features": [{"name": "body", "dtype": "string"}, {"name": "comment_score", "dtype": "int64"}, {"name": "prompt", "dtype": "string"}, {"name": "post_score", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 244506795.8945573, "num_examples": 66332}], "download_size": 168000074, "dataset_size": 244506795.8945573}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-19T18:54:20+00:00
[]
[]
TAGS #license-mit #region-us
Data from real humans, courtesy of URL
[]
[ "TAGS\n#license-mit #region-us \n" ]
d48fe526abeaa1dd8c1cf1e9ebee10e4e7fa85d0
https://github.com/nyu-mll/nope

```
@inproceedings{NOPE,
  title="{NOPE}: {A} Corpus of Naturally-Occurring Presuppositions in {E}nglish",
  author={Parrish, Alicia and Schuster, Sebastian and Warstadt, Alex and Agha, Omar and Lee, Soo-Hwan and Zhao, Zhuoye and Bowman, Samuel R. and Linzen, Tal},
  booktitle={Proceedings of the 25th Conference on Computational Natural Language Learning (CoNLL)},
  year={2021}
}
```
tasksource/nope
[ "region:us" ]
2024-01-19T19:00:13+00:00
{}
2024-01-19T19:03:08+00:00
[]
[]
TAGS #region-us
URL
[]
[ "TAGS\n#region-us \n" ]
15856f0953ddfa68106e8bdc012cf929187dd218
# Dataset Card for Evaluation run of vicgalle/solarized-18B-dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vicgalle/solarized-18B-dpo](https://huggingface.co/vicgalle/solarized-18B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vicgalle__solarized-18B-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T19:06:47.100620](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__solarized-18B-dpo/blob/main/results_2024-01-19T19-06-47.100620.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6387313535462917, "acc_stderr": 0.03249557900745891, "acc_norm": 0.6436773766747061, "acc_norm_stderr": 0.03315021744428834, "mc1": 0.5116279069767442, "mc1_stderr": 0.01749876717574008, "mc2": 0.6649309264076487, "mc2_stderr": 0.015613981130737324 }, "harness|arc:challenge|25": { "acc": 0.64419795221843, "acc_stderr": 0.013990571137918762, "acc_norm": 0.6834470989761092, "acc_norm_stderr": 0.013592431519068074 }, "harness|hellaswag|10": { "acc": 0.6951802429794861, "acc_stderr": 0.004593902601979335, "acc_norm": 0.877912766381199, "acc_norm_stderr": 0.00326717445844976 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.042039210401562783, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.042039210401562783 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.028815615713432115, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.028815615713432115 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7291666666666666, "acc_stderr": 0.03716177437566017, "acc_norm": 0.7291666666666666, "acc_norm_stderr": 0.03716177437566017 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.035676037996391706, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.035676037996391706 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.04951218252396262, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.04951218252396262 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542129, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.0256993528321318, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.0256993528321318 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.02354079935872329, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.02354079935872329 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.035107665979592154, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8181818181818182, "acc_stderr": 0.027479603010538797, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.027479603010538797 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6512820512820513, "acc_stderr": 0.02416278028401772, "acc_norm": 0.6512820512820513, "acc_norm_stderr": 0.02416278028401772 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7058823529411765, "acc_stderr": 0.02959732973097809, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.02959732973097809 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 
0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8073394495412844, "acc_stderr": 0.016909276884936066, "acc_norm": 0.8073394495412844, "acc_norm_stderr": 0.016909276884936066 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640766, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640766 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.726457399103139, "acc_stderr": 0.029918586707798827, "acc_norm": 0.726457399103139, "acc_norm_stderr": 0.029918586707798827 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6870229007633588, "acc_stderr": 0.04066962905677697, "acc_norm": 0.6870229007633588, "acc_norm_stderr": 0.04066962905677697 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.03487825168497892, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597518, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597518 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7854406130268199, "acc_stderr": 0.014680033956893346, "acc_norm": 0.7854406130268199, "acc_norm_stderr": 0.014680033956893346 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.653179190751445, "acc_stderr": 0.025624723994030454, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.025624723994030454 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.423463687150838, "acc_stderr": 0.0165254258987735, "acc_norm": 0.423463687150838, "acc_norm_stderr": 0.0165254258987735 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.02625605383571896, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6945337620578779, "acc_stderr": 0.026160584450140453, "acc_norm": 0.6945337620578779, "acc_norm_stderr": 0.026160584450140453 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600713002, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713002 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5177304964539007, "acc_stderr": 
0.02980873964223777, "acc_norm": 0.5177304964539007, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5026075619295959, "acc_stderr": 0.012770062445433166, "acc_norm": 0.5026075619295959, "acc_norm_stderr": 0.012770062445433166 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6601307189542484, "acc_stderr": 0.019162418588623553, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.019162418588623553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.02796267760476892, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.02796267760476892 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7660818713450293, "acc_stderr": 0.03246721765117826, "acc_norm": 0.7660818713450293, "acc_norm_stderr": 0.03246721765117826 }, "harness|truthfulqa:mc|0": { "mc1": 0.5116279069767442, "mc1_stderr": 0.01749876717574008, "mc2": 0.6649309264076487, "mc2_stderr": 0.015613981130737324 }, "harness|winogrande|5": { "acc": 0.8050513022888713, "acc_stderr": 0.01113409941593826 }, "harness|gsm8k|5": { "acc": 0.4025777103866566, "acc_stderr": 0.013508523063663442 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_vicgalle__solarized-18B-dpo
[ "region:us" ]
2024-01-19T19:09:08+00:00
{"pretty_name": "Evaluation run of vicgalle/solarized-18B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/solarized-18B-dpo](https://huggingface.co/vicgalle/solarized-18B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__solarized-18B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T19:06:47.100620](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__solarized-18B-dpo/blob/main/results_2024-01-19T19-06-47.100620.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6387313535462917,\n \"acc_stderr\": 0.03249557900745891,\n \"acc_norm\": 0.6436773766747061,\n \"acc_norm_stderr\": 0.03315021744428834,\n \"mc1\": 0.5116279069767442,\n \"mc1_stderr\": 0.01749876717574008,\n \"mc2\": 0.6649309264076487,\n \"mc2_stderr\": 0.015613981130737324\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918762,\n \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068074\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n \"acc_stderr\": 0.004593902601979335,\n \"acc_norm\": 0.877912766381199,\n \"acc_norm_stderr\": 0.00326717445844976\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 
0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936066,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936066\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597518,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597518\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n 
\"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.0165254258987735,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.0165254258987735\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5026075619295959,\n \"acc_stderr\": 0.012770062445433166,\n \"acc_norm\": 0.5026075619295959,\n \"acc_norm_stderr\": 0.012770062445433166\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5116279069767442,\n \"mc1_stderr\": 0.01749876717574008,\n \"mc2\": 0.6649309264076487,\n \"mc2_stderr\": 0.015613981130737324\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.01113409941593826\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4025777103866566,\n \"acc_stderr\": 0.013508523063663442\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/solarized-18B-dpo", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-06-47.100620.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-06-47.100620.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-06-47.100620.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-06-47.100620.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-06-47.100620.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-06-47.100620.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["**/details_harness|winogrande|5_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T19-06-47.100620.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T19_06_47.100620", "path": ["results_2024-01-19T19-06-47.100620.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T19-06-47.100620.parquet"]}]}]}
2024-01-19T19:09:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vicgalle/solarized-18B-dpo Dataset automatically created during the evaluation run of model vicgalle/solarized-18B-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T19:06:47.100620 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
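The flattened card above refers to a loading snippet ("you can for instance do the following:") that was stripped along with the rest of the markdown. Below is a minimal sketch of what that call looks like; the repository id is an assumption inferred from the usual open-llm-leaderboard naming pattern (details_<org>__<model>), while the "harness_winogrande_5" configuration and "latest" split are taken from the config metadata listed earlier in this row.

```python
# Sketch only: the repository id is inferred from the open-llm-leaderboard naming
# pattern and is not quoted verbatim in this row of the dump.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_vicgalle__solarized-18B-dpo",  # assumed repo id
    "harness_winogrande_5",  # one of the 63 task configurations listed in the metadata above
    split="latest",          # "latest" per the config metadata; the card text calls it "train"
)
print(data)
```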
[ "# Dataset Card for Evaluation run of vicgalle/solarized-18B-dpo\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/solarized-18B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:06:47.100620(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vicgalle/solarized-18B-dpo\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/solarized-18B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:06:47.100620(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d6f23fdfe7238835c2e9cff85f35dfc33bc622a7
# Dataset Card for Evaluation run of andrijdavid/tinyllama-dare <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andrijdavid/tinyllama-dare](https://huggingface.co/andrijdavid/tinyllama-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andrijdavid__tinyllama-dare", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T19:20:12.926605](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__tinyllama-dare/blob/main/results_2024-01-19T19-20-12.926605.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.260218948497339, "acc_stderr": 0.03089367507715055, "acc_norm": 0.26040524383105657, "acc_norm_stderr": 0.031653815968800486, "mc1": 0.2558139534883721, "mc1_stderr": 0.015274176219283361, "mc2": 0.3901127619389903, "mc2_stderr": 0.014174485975506508 }, "harness|arc:challenge|25": { "acc": 0.3643344709897611, "acc_stderr": 0.014063260279882412, "acc_norm": 0.3728668941979522, "acc_norm_stderr": 0.014131176760131163 }, "harness|hellaswag|10": { "acc": 0.4700258912567218, "acc_stderr": 0.004980807231136748, "acc_norm": 0.6277633937462657, "acc_norm_stderr": 0.004824130528590593 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909284, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.033556772163131424, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.033556772163131424 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123387, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123387 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2679245283018868, "acc_stderr": 0.027257260322494845, "acc_norm": 0.2679245283018868, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.034765901043041336, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.034765901043041336 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 
0.0446196043338474 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749895, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749895 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237655, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237655 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.251063829787234, "acc_stderr": 0.028346963777162452, "acc_norm": 0.251063829787234, "acc_norm_stderr": 0.028346963777162452 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21052631578947367, "acc_stderr": 0.0383515395439942, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.0383515395439942 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.022418042891113953, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.022418042891113953 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276862, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276862 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25161290322580643, "acc_stderr": 0.02468597928623997, "acc_norm": 0.25161290322580643, "acc_norm_stderr": 0.02468597928623997 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03010833071801162, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03010833071801162 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.042295258468165044, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24848484848484848, "acc_stderr": 0.03374402644139405, "acc_norm": 0.24848484848484848, "acc_norm_stderr": 0.03374402644139405 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.22727272727272727, "acc_stderr": 0.029857515673386407, "acc_norm": 0.22727272727272727, "acc_norm_stderr": 0.029857515673386407 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.21761658031088082, "acc_stderr": 0.029778663037752954, "acc_norm": 0.21761658031088082, "acc_norm_stderr": 0.029778663037752954 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2564102564102564, "acc_stderr": 0.022139081103971545, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.022139081103971545 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.02620276653465215, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.02620276653465215 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02755361446786382, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02755361446786382 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.2185430463576159, "acc_stderr": 0.03374235550425694, "acc_norm": 0.2185430463576159, "acc_norm_stderr": 0.03374235550425694 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.24036697247706423, "acc_stderr": 0.01832060732096407, "acc_norm": 0.24036697247706423, "acc_norm_stderr": 0.01832060732096407 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03214952147802749, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03214952147802749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.27450980392156865, "acc_stderr": 0.031321798030832904, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.031321798030832904 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.26582278481012656, "acc_stderr": 0.028756799629658342, "acc_norm": 0.26582278481012656, "acc_norm_stderr": 0.028756799629658342 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.34977578475336324, "acc_stderr": 0.03200736719484503, "acc_norm": 0.34977578475336324, "acc_norm_stderr": 0.03200736719484503 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.24427480916030533, "acc_stderr": 0.03768335959728745, "acc_norm": 0.24427480916030533, "acc_norm_stderr": 0.03768335959728745 }, "harness|hendrycksTest-international_law|5": { "acc": 0.24793388429752067, "acc_stderr": 0.039418975265163025, "acc_norm": 0.24793388429752067, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2222222222222222, "acc_stderr": 0.040191074725573483, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25766871165644173, "acc_stderr": 0.03436150827846917, "acc_norm": 0.25766871165644173, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690875, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690875 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2777777777777778, "acc_stderr": 0.029343114798094476, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.029343114798094476 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2886334610472541, "acc_stderr": 0.016203792703197804, "acc_norm": 0.2886334610472541, "acc_norm_stderr": 0.016203792703197804 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2398843930635838, "acc_stderr": 0.02298959254312357, "acc_norm": 0.2398843930635838, "acc_norm_stderr": 0.02298959254312357 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.22681564245810057, "acc_stderr": 0.014005843570897897, "acc_norm": 0.22681564245810057, "acc_norm_stderr": 0.014005843570897897 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24183006535947713, "acc_stderr": 0.024518195641879334, "acc_norm": 0.24183006535947713, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2765273311897106, "acc_stderr": 0.025403832978179615, "acc_norm": 0.2765273311897106, "acc_norm_stderr": 0.025403832978179615 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, "acc_stderr": 0.024569223600460845, "acc_norm": 0.2654320987654321, 
"acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24113475177304963, "acc_stderr": 0.025518731049537766, "acc_norm": 0.24113475177304963, "acc_norm_stderr": 0.025518731049537766 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2333767926988266, "acc_stderr": 0.010803108481179088, "acc_norm": 0.2333767926988266, "acc_norm_stderr": 0.010803108481179088 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.20588235294117646, "acc_stderr": 0.024562204314142314, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.024562204314142314 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2679738562091503, "acc_stderr": 0.017917974069594726, "acc_norm": 0.2679738562091503, "acc_norm_stderr": 0.017917974069594726 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2909090909090909, "acc_stderr": 0.04350271442923243, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.1673469387755102, "acc_stderr": 0.023897144768914524, "acc_norm": 0.1673469387755102, "acc_norm_stderr": 0.023897144768914524 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23880597014925373, "acc_stderr": 0.030147775935409224, "acc_norm": 0.23880597014925373, "acc_norm_stderr": 0.030147775935409224 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-virology|5": { "acc": 0.3313253012048193, "acc_stderr": 0.03664314777288087, "acc_norm": 0.3313253012048193, "acc_norm_stderr": 0.03664314777288087 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.25146198830409355, "acc_stderr": 0.033275044238468436, "acc_norm": 0.25146198830409355, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.2558139534883721, "mc1_stderr": 0.015274176219283361, "mc2": 0.3901127619389903, "mc2_stderr": 0.014174485975506508 }, "harness|winogrande|5": { "acc": 0.659037095501184, "acc_stderr": 0.0133226814359348 }, "harness|gsm8k|5": { "acc": 0.016679302501895376, "acc_stderr": 0.0035275958887224465 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
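Besides the per-task configurations, the card above notes that a "results" configuration stores the aggregated metrics shown in its "Latest results" JSON. A minimal sketch of reading those aggregated numbers follows; the "results" config name and the "latest" split are assumptions based on the card text and on the config metadata of the neighbouring rows in this dump.

```python
# Sketch only: reads the aggregated metrics for andrijdavid/tinyllama-dare.
# The "results" config and "latest" split are assumed from the card text and the
# layout of the other leaderboard detail datasets in this dump.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_andrijdavid__tinyllama-dare",
    "results",
    split="latest",
)
# Each row mirrors the JSON shown above (e.g. the "all" block with acc / acc_norm).
print(results[0])
```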
open-llm-leaderboard/details_andrijdavid__tinyllama-dare
[ "region:us" ]
2024-01-19T19:22:02+00:00
{"pretty_name": "Evaluation run of andrijdavid/tinyllama-dare", "dataset_summary": "Dataset automatically created during the evaluation run of model [andrijdavid/tinyllama-dare](https://huggingface.co/andrijdavid/tinyllama-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andrijdavid__tinyllama-dare\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T19:20:12.926605](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__tinyllama-dare/blob/main/results_2024-01-19T19-20-12.926605.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.260218948497339,\n \"acc_stderr\": 0.03089367507715055,\n \"acc_norm\": 0.26040524383105657,\n \"acc_norm_stderr\": 0.031653815968800486,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283361,\n \"mc2\": 0.3901127619389903,\n \"mc2_stderr\": 0.014174485975506508\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3643344709897611,\n \"acc_stderr\": 0.014063260279882412,\n \"acc_norm\": 0.3728668941979522,\n \"acc_norm_stderr\": 0.014131176760131163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4700258912567218,\n \"acc_stderr\": 0.004980807231136748,\n \"acc_norm\": 0.6277633937462657,\n \"acc_norm_stderr\": 0.004824130528590593\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.033556772163131424,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.033556772163131424\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n 
\"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113953,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113953\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.02468597928623997,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.02468597928623997\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971545,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971545\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786382,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786382\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2886334610472541,\n \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n \"acc_stderr\": 0.014005843570897897,\n \"acc_norm\": 0.22681564245810057,\n \"acc_norm_stderr\": 0.014005843570897897\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.2765273311897106,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537766,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537766\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2333767926988266,\n \"acc_stderr\": 0.010803108481179088,\n \"acc_norm\": 0.2333767926988266,\n \"acc_norm_stderr\": 0.010803108481179088\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142314,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.03664314777288087,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.03664314777288087\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283361,\n \"mc2\": 0.3901127619389903,\n \"mc2_stderr\": 0.014174485975506508\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.0133226814359348\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n 
\"acc_stderr\": 0.0035275958887224465\n }\n}\n```", "repo_url": "https://huggingface.co/andrijdavid/tinyllama-dare", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-20-12.926605.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-20-12.926605.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-20-12.926605.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-20-12.926605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-20-12.926605.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T19_20_12.926605", "path": ["**/details_harness|winogrande|5_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T19-20-12.926605.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T19_20_12.926605", "path": ["results_2024-01-19T19-20-12.926605.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T19-20-12.926605.parquet"]}]}]}
2024-01-19T19:22:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andrijdavid/tinyllama-dare Dataset automatically created during the evaluation run of model andrijdavid/tinyllama-dare on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2024-01-19T19:20:12.926605 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
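The processed card text above references a load call ("you can for instance do the following") without the accompanying snippet. A minimal sketch of that call, assuming the details repository for this run follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repository id is not spelled out in this record) and using one of the configurations listed in the record's metadata:

```python
# Sketch only: the repository id below is inferred from the naming pattern
# used by the other evaluation-run records in this dump; it is not stated here.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_andrijdavid__tinyllama-dare",
    "harness_winogrande_5",  # one of the configurations listed in this record's metadata
    split="train",           # per the card, "train" always points to the latest results
)
```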
[ "# Dataset Card for Evaluation run of andrijdavid/tinyllama-dare\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/tinyllama-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:20:12.926605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andrijdavid/tinyllama-dare\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/tinyllama-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:20:12.926605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
377de7adc152b2519bd559b3c736d28f7a7f0ce2
# Dataset Card for Evaluation run of leveldevai/BeagleMist-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [leveldevai/BeagleMist-7B](https://huggingface.co/leveldevai/BeagleMist-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_leveldevai__BeagleMist-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T19:26:57.593325](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__BeagleMist-7B/blob/main/results_2024-01-19T19-26-57.593325.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6578372082846259, "acc_stderr": 0.03183820575158367, "acc_norm": 0.6576272237213145, "acc_norm_stderr": 0.03249584099098042, "mc1": 0.48225214198286415, "mc1_stderr": 0.01749247084307536, "mc2": 0.648345615677013, "mc2_stderr": 0.0152064953463137 }, "harness|arc:challenge|25": { "acc": 0.6791808873720137, "acc_stderr": 0.013640943091946528, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393443 }, "harness|hellaswag|10": { "acc": 0.6963752240589524, "acc_stderr": 0.004588827958775114, "acc_norm": 0.874726150169289, "acc_norm_stderr": 0.0033035264131234957 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.04094376269996793, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.04094376269996793 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.0373852067611967, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.0373852067611967 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.028049186315695255, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.028049186315695255 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.03496101481191179, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.03496101481191179 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5914893617021276, "acc_stderr": 0.032134180267015755, "acc_norm": 0.5914893617021276, "acc_norm_stderr": 0.032134180267015755 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02530590624159063, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02530590624159063 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542946, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542946 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229872, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229872 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.02882088466625326, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.02882088466625326 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 
0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8605504587155963, "acc_stderr": 0.014852421490033048, "acc_norm": 0.8605504587155963, "acc_norm_stderr": 0.014852421490033048 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.02450980392156861, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.02450980392156861 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.02531049537694486, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.02531049537694486 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594654, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8326947637292464, "acc_stderr": 0.013347327202920332, "acc_norm": 0.8326947637292464, "acc_norm_stderr": 0.013347327202920332 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46284224250325945, "acc_stderr": 0.01273492357953207, "acc_norm": 0.46284224250325945, "acc_norm_stderr": 0.01273492357953207 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462927, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462927 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.684640522875817, "acc_stderr": 0.01879808628488689, "acc_norm": 0.684640522875817, "acc_norm_stderr": 0.01879808628488689 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.02826388994378459, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.02826388994378459 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.48225214198286415, "mc1_stderr": 0.01749247084307536, "mc2": 0.648345615677013, "mc2_stderr": 0.0152064953463137 }, "harness|winogrande|5": { "acc": 0.819258089976322, "acc_stderr": 0.010814911009613988 }, "harness|gsm8k|5": { "acc": 0.7187263078089462, "acc_stderr": 0.012384789310940243 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
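The card above also mentions an aggregated "results" configuration. A minimal sketch of pulling those aggregated metrics, complementing the per-task example in the card, assuming this repository exposes the same "results"/"latest" split layout shown in the previous record's metadata:

```python
# Sketch: load the aggregated results for this evaluation run.
# The "latest" split name is assumed from the layout of the previous record's metadata.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_leveldevai__BeagleMist-7B",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the run (e.g. acc / acc_norm / mc1 / mc2)
```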
open-llm-leaderboard/details_leveldevai__BeagleMist-7B
[ "region:us" ]
2024-01-19T19:29:19+00:00
{"pretty_name": "Evaluation run of leveldevai/BeagleMist-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/BeagleMist-7B](https://huggingface.co/leveldevai/BeagleMist-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__BeagleMist-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T19:26:57.593325](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__BeagleMist-7B/blob/main/results_2024-01-19T19-26-57.593325.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578372082846259,\n \"acc_stderr\": 0.03183820575158367,\n \"acc_norm\": 0.6576272237213145,\n \"acc_norm_stderr\": 0.03249584099098042,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.648345615677013,\n \"mc2_stderr\": 0.0152064953463137\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946528,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6963752240589524,\n \"acc_stderr\": 0.004588827958775114,\n \"acc_norm\": 0.874726150169289,\n \"acc_norm_stderr\": 0.0033035264131234957\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033048,\n \"acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n 
\"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.01273492357953207,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.01273492357953207\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.648345615677013,\n \"mc2_stderr\": 0.0152064953463137\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613988\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \"acc_stderr\": 0.012384789310940243\n }\n}\n```", "repo_url": 
"https://huggingface.co/leveldevai/BeagleMist-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T19_26_57.593325", "path": ["**/details_harness|winogrande|5_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T19-26-57.593325.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_19T19_26_57.593325", "path": ["results_2024-01-19T19-26-57.593325.parquet"]}, {"split": "latest", "path": ["results_2024-01-19T19-26-57.593325.parquet"]}]}]}
2024-01-19T19:29:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of leveldevai/BeagleMist-7B Dataset automatically created during the evaluation run of model leveldevai/BeagleMist-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-19T19:26:57.593325 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
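The loading snippet referenced by "you can for instance do the following" is not reproduced in this flattened card text. A minimal sketch is given below; the dataset path is an assumption based on the leaderboard's `details_<org>__<model>` naming convention used elsewhere in this dump, and the config and split names are taken from the configuration list in this record's metadata.

```python
from datasets import load_dataset

# Assumed dataset path, following the leaderboard's details_<org>__<model> convention.
# "harness_winogrande_5" is one of the configs listed in this record's metadata;
# the "latest" split points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_leveldevai__BeagleMist-7B",
    "harness_winogrande_5",
    split="latest",
)
```

The card text also notes that the "train" split always points to the latest results, so `split="train"` may work as well, but only the timestamped split and "latest" appear explicitly in the config list above.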
[ "# Dataset Card for Evaluation run of leveldevai/BeagleMist-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/BeagleMist-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:26:57.593325(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of leveldevai/BeagleMist-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/BeagleMist-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:26:57.593325(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4d1d667bd4810fe649a5fb5fa1ab5b634460a53c
# Dataset Card for Evaluation run of andrijdavid/macaroni-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andrijdavid/macaroni-7b](https://huggingface.co/andrijdavid/macaroni-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andrijdavid__macaroni-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-19T19:41:59.982970](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__macaroni-7b/blob/main/results_2024-01-19T19-41-59.982970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6519925648651723, "acc_stderr": 0.032113795854224476, "acc_norm": 0.6512702756390973, "acc_norm_stderr": 0.03278624017466537, "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.6876319522488263, "mc2_stderr": 0.015242295657961013 }, "harness|arc:challenge|25": { "acc": 0.7098976109215017, "acc_stderr": 0.013261573677520769, "acc_norm": 0.7312286689419796, "acc_norm_stderr": 0.012955065963710693 }, "harness|hellaswag|10": { "acc": 0.7197769368651663, "acc_stderr": 0.004481902637505655, "acc_norm": 0.8816968731328421, "acc_norm_stderr": 0.0032230665918060015 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754407, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754407 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188712, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188712 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467618, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098823, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098823 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368985, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368985 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500097, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500097 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4424581005586592, "acc_stderr": 0.01661139368726858, "acc_norm": 0.4424581005586592, "acc_norm_stderr": 0.01661139368726858 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242557, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242557 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6748366013071896, "acc_stderr": 0.018950886770806315, "acc_norm": 0.6748366013071896, "acc_norm_stderr": 0.018950886770806315 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.5618115055079559, "mc1_stderr": 0.01736923616440441, "mc2": 0.6876319522488263, "mc2_stderr": 0.015242295657961013 }, "harness|winogrande|5": { "acc": 0.8437253354380426, "acc_stderr": 0.010205351791873497 }, "harness|gsm8k|5": { "acc": 0.686125852918878, "acc_stderr": 0.012782681251053203 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
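As a usage note for this card, the aggregated metrics shown in the "Latest results" block can also be pulled from the "results" configuration the card mentions. A minimal sketch follows, assuming the macaroni-7b record exposes the same "results" config and "latest" split naming as the other evaluation-run records in this dump.

```python
from datasets import load_dataset

# Aggregated run results for the macaroni-7b evaluation.
# The "results" config name comes from the card text; the "latest" split name
# is assumed to follow the same pattern as the other records in this collection.
results = load_dataset(
    "open-llm-leaderboard/details_andrijdavid__macaroni-7b",
    "results",
    split="latest",
)
print(results[0])  # one row per run, containing the aggregated metrics
```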
open-llm-leaderboard/details_andrijdavid__macaroni-7b
[ "region:us" ]
2024-01-19T19:44:19+00:00
{"pretty_name": "Evaluation run of andrijdavid/macaroni-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [andrijdavid/macaroni-7b](https://huggingface.co/andrijdavid/macaroni-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andrijdavid__macaroni-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-19T19:41:59.982970](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__macaroni-7b/blob/main/results_2024-01-19T19-41-59.982970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6519925648651723,\n \"acc_stderr\": 0.032113795854224476,\n \"acc_norm\": 0.6512702756390973,\n \"acc_norm_stderr\": 0.03278624017466537,\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.6876319522488263,\n \"mc2_stderr\": 0.015242295657961013\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520769,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710693\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7197769368651663,\n \"acc_stderr\": 0.004481902637505655,\n \"acc_norm\": 0.8816968731328421,\n \"acc_norm_stderr\": 0.0032230665918060015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n 
\"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n \"acc_stderr\": 0.01661139368726858,\n \"acc_norm\": 0.4424581005586592,\n \"acc_norm_stderr\": 0.01661139368726858\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.01736923616440441,\n \"mc2\": 0.6876319522488263,\n \"mc2_stderr\": 0.015242295657961013\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873497\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \"acc_stderr\": 0.012782681251053203\n }\n}\n```", "repo_url": "https://huggingface.co/andrijdavid/macaroni-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-41-59.982970.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-41-59.982970.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-41-59.982970.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-19T19-41-59.982970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-41-59.982970.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-41-59.982970.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["**/details_harness|winogrande|5_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-19T19-41-59.982970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_19T19_41_59.982970", "path": ["results_2024-01-19T19-41-59.982970.parquet"]}, {"split": "latest", "path": 
["results_2024-01-19T19-41-59.982970.parquet"]}]}]}
2024-01-19T19:44:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andrijdavid/macaroni-7b Dataset automatically created during the evaluation run of model andrijdavid/macaroni-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (the loading snippet is reproduced after this card text): ## Latest results These are the latest results from run 2024-01-19T19:41:59.982970 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
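For reference, this is the loading snippet the card text above refers to; it is the snippet given in the card's metadata earlier in this entry, lightly reformatted with comments. It pulls the `harness_winogrande_5` details, with the `train` split pointing at the latest results:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task config of this run.
# Any other config name from the metadata (e.g. "harness_gsm8k_5") works the same way.
data = load_dataset(
    "open-llm-leaderboard/details_andrijdavid__macaroni-7b",
    "harness_winogrande_5",
    split="train",
)
```

Each `harness_*` config also exposes a timestamped split (`2024_01_19T19_41_59.982970`) alongside a `latest` split, as declared in the configs list of the metadata.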
[ "# Dataset Card for Evaluation run of andrijdavid/macaroni-7b\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/macaroni-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:41:59.982970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andrijdavid/macaroni-7b\n\n\n\nDataset automatically created during the evaluation run of model andrijdavid/macaroni-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-19T19:41:59.982970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
30706871ad6524bbcd02b6d29f7e9fdc1ddc588a
## Python Copilot AI Research Coding Dataset This dataset is a subset of the matlok Python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset. ### Details Each row contains Python code (either a class method or a global function), imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more. - Rows: 514430 - Size: 674 MB - Data type: text - Format: Extracted code using Python AST ### Schema ```json { "args": "string", "class_bases": "string", "class_docstr": "string", "class_docstr_tok": "string", "class_name": "string", "code": "string", "code_tok": "string", "docstr": "string", "docstr_tok": "string", "file_path": "string", "filename": "string", "imports": "string", "is_member": "bool", "label_desc": "string", "label_desc_len": "int64", "label_id": "string", "lend": "int64", "lstart": "int64", "name": "string", "num_all_bases": "float64", "num_bases": "float64", "num_classes": "float64", "num_functions": "int64", "num_imports": "int64", "num_methods": "float64", "raises": "string", "returns": "string", "total_objects": "int64" } ``` ### How to use the dataset ```python from datasets import load_dataset ds = load_dataset("matlok/python-copilot-training-on-ai-research-repos", data_dir="files") ```
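Building on the snippet above, here is a minimal sketch of reading one row; the `"train"` split name is an assumption (the card does not name the split), while the column names come straight from the schema section:

```python
from datasets import load_dataset

# Load the extracted-code rows from the "files" directory, as shown in the card.
ds = load_dataset("matlok/python-copilot-training-on-ai-research-repos", data_dir="files")

# Assumption: the default split is called "train"; adjust if the repo exposes another name.
row = ds["train"][0]

# Column names below are taken from the schema section of the card.
print(row["file_path"], row["class_name"], row["name"])
print(row["code"][:200])  # first 200 characters of the extracted code
```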
matlok/python-copilot-training-on-ai-research-repos
[ "task_categories:text-generation", "task_ids:parsing", "size_categories:100K<n<1M", "license:other", "python-copilot", "python-coding", "fine-tuning", "training", "alpaca", "text", "coding", "region:us" ]
2024-01-19T20:35:01+00:00
{"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "task_ids": ["parsing"], "pretty_name": "python copilot ai research coding dataset", "dataset_info": [{"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-code-ai-core-v1_00000002.parquet"}]}], "tags": ["python-copilot", "python-coding", "fine-tuning", "training", "alpaca", "text", "coding"]}
2024-01-25T18:54:12+00:00
[]
[]
TAGS #task_categories-text-generation #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #fine-tuning #training #alpaca #text #coding #region-us
## Python Copilot AI Research Coding Dataset This dataset is a subset of the matlok Python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset. ### Details Each row contains Python code (either a class method or a global function), imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more. - Rows: 514430 - Size: 674 MB - Data type: text - Format: Extracted code using Python AST ### Schema ### How to use the dataset
[ "## Python Copilot AI Research Coding Dataset\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 514430\n- Size: 674 MB\n- Data type: text\n- Format: Extracted code using python AST", "### Schema", "### How to use the dataset" ]
[ "TAGS\n#task_categories-text-generation #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #fine-tuning #training #alpaca #text #coding #region-us \n", "## Python Copilot AI Research Coding Dataset\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 514430\n- Size: 674 MB\n- Data type: text\n- Format: Extracted code using python AST", "### Schema", "### How to use the dataset" ]