GALACTICA 6.7B (standard)

Why have this?

Well, I liked the premise of the GALACTICA models because they are trained on scientific text. Intuitively, they should be the models of choice for scientific tasks. Although the weights of the original model are available, they are distributed as raw binary files; this repository converts them into safetensors.
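A minimal sketch of the conversion, assuming the original checkpoint is loadable through Hugging Face transformers (here the public facebook/galactica-6.7b repository) and re-saved with safe_serialization enabled; the output directory name is only illustrative:

```python
# Sketch: re-save GALACTICA weights in safetensors format via transformers.
# Assumes the upstream facebook/galactica-6.7b checkpoint is reachable.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "facebook/galactica-6.7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# save_pretrained with safe_serialization=True writes *.safetensors shards
# instead of the raw PyTorch .bin files.
model.save_pretrained("galactica-standard-safetensors", safe_serialization=True)
tokenizer.save_pretrained("galactica-standard-safetensors")
```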

Size       Parameters
mini       125 M
base       1.3 B
standard   6.7 B
large      30 B
huge       120 B

NOTE

I have not trained or fine-tuned the model weights provided by GALACTICA, and this repository in no way attempts to take credit away from Ross Taylor and the team who contributed to GALACTICA. This repository exists only as a personal snapshot of the model in a different tensor format, for faster computation and downstream model-building exercises.
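For downstream use, a hedged loading sketch, assuming this snapshot keeps the standard GALACTICA layout so the stock transformers classes apply (the prompt and generation settings below are illustrative):

```python
# Sketch: load the safetensors snapshot and generate from a prompt.
# The repository id matches this model card.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "akhilpandey95/galactica-standard"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer("The Transformer architecture [START_REF]", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0]))
```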

Original paper citation

@inproceedings{GALACTICA,
    title={GALACTICA: A Large Language Model for Science},
    author={Ross Taylor and Marcin Kardas and Guillem Cucurull and Thomas Scialom and Anthony Hartshorn and Elvis Saravia and Andrew Poulton and Viktor Kerkez and Robert Stojnic},
    year={2022}
}