Template bug fixed in llama.cpp

#11
by matteogeniaccio - opened

This PR fixes a template bug, but it requires regenerating the GGUF files.
https://github.com/ggml-org/llama.cpp/commit/ced44be34290fab450f8344efa047d8a08e723b4
Thanks

And there's another fix for the GLM4 chat template; it seems to be just a linefeed change: https://github.com/ggml-org/llama.cpp/releases/tag/b5250

This new one, I think, is only on the inference side, right?

Versus the first one, which I regenerated for?

Yes, inference side. But the modified Jinja template is embedded in the GGUF and used by llama.cpp.

Ohhh, I didn't notice that part.

But that would have been picked up when I remade these a couple of days ago, so no worries there :)
