takarajordan's activity

I'm using https://artificialanalysis.ai/ just because it puts everything in one place! It's not the best resource, but these days I'm all about saving time.

@ThomasTheMaker if you make an issue on the repo, I'll look into it!

@ThomasTheMaker it's just the raw attention and transformer architecture in Go, designed for serverless, so performance will definitely be lower than ggml and llama.cpp since it isn't GPU-accelerated. But if you're into CPU-only edge AI, this is the first and only way to compute attention in pure Go.
Quantization can definitely be supported, since it's all just math! See the sketch below.
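
To make "it's all just math" concrete, here is a minimal sketch of symmetric per-tensor int8 quantization in plain Go. The function names (quantizeInt8, dequantize) are illustrative assumptions, not part of the library's API:

```go
package main

import (
	"fmt"
	"math"
)

// quantizeInt8 maps float32 weights onto int8 with a single symmetric
// per-tensor scale, the simplest post-training quantization scheme.
// Illustrative only; not the library's actual API.
func quantizeInt8(weights []float32) (q []int8, scale float32) {
	var maxAbs float32
	for _, w := range weights {
		if a := float32(math.Abs(float64(w))); a > maxAbs {
			maxAbs = a
		}
	}
	scale = maxAbs / 127
	if scale == 0 {
		scale = 1 // all-zero tensor: avoid division by zero
	}
	q = make([]int8, len(weights))
	for i, w := range weights {
		q[i] = int8(math.Round(float64(w / scale)))
	}
	return q, scale
}

// dequantize recovers approximate float32 values from the int8 form.
func dequantize(q []int8, scale float32) []float32 {
	out := make([]float32, len(q))
	for i, v := range q {
		out[i] = float32(v) * scale
	}
	return out
}

func main() {
	w := []float32{0.5, -1.2, 0.03, 2.4}
	q, s := quantizeInt8(w)
	fmt.Println(q, s, dequantize(q, s))
}
```

Real schemes layer on per-channel scales and zero points, but the core really is just this scale-and-round arithmetic, which is why it fits a dependency-free Go library.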

We built this library at takara.ai to bring attention mechanisms and transformer layers to Go, in a form that's lightweight, clean, and dependency-free.
We're proud to say that every part of this project reflects what we set out to do.
- Pure Go: no external dependencies, built entirely on the Go standard library
- Core support for DotProductAttention and MultiHeadAttention (see the sketch after this list)
- Full transformer layers with LayerNorm, feed-forward networks, and residual connections
- Designed for edge, embedded, and real-time environments where simplicity and performance matter
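
To give a flavor of what pure-Go attention looks like, here is a self-contained sketch of scaled dot-product attention using only the standard library. It mirrors the idea rather than the library's exact API, so treat the names as illustrative:

```go
package main

import (
	"fmt"
	"math"
)

// softmax turns raw scores into a probability distribution.
func softmax(scores []float64) []float64 {
	maxScore := math.Inf(-1)
	for _, s := range scores {
		if s > maxScore {
			maxScore = s
		}
	}
	out := make([]float64, len(scores))
	sum := 0.0
	for i, s := range scores {
		out[i] = math.Exp(s - maxScore) // subtract the max for numerical stability
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

// dotProductAttention computes softmax(q·Kᵀ/√d)·V for a single query
// against a set of key/value vectors.
func dotProductAttention(query []float64, keys, values [][]float64) []float64 {
	d := math.Sqrt(float64(len(query)))
	scores := make([]float64, len(keys))
	for i, k := range keys {
		for j := range query {
			scores[i] += query[j] * k[j]
		}
		scores[i] /= d // scale by √(key dimension)
	}
	weights := softmax(scores)
	out := make([]float64, len(values[0]))
	for i, v := range values {
		for j := range v {
			out[j] += weights[i] * v[j]
		}
	}
	return out
}

func main() {
	keys := [][]float64{{1, 0}, {0, 1}}
	values := [][]float64{{10, 0}, {0, 10}}
	query := []float64{1, 0}
	fmt.Println(dotProductAttention(query, keys, values)) // attends mostly to the first value
}
```

Multi-head attention is then just this routine run over several learned projections of the same inputs, with the heads concatenated at the end.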
Thank you to everyone who has supported this so far; the stars, forks, and feedback mean a lot.


No abstracts, just bullet points.
Start your day here: https://tldr.takara.ai
This is a pretty big update for sure. The models have improved significantly, which is great for everyone involved, especially the end user. Those datasets look very promising as well!
Sounds interesting, I'll check it out!
This is a really interesting post. I've been looking at the DeepSeek models for sure. This shows a pretty nice improvement; I'd love to see some example changes!
Very cool