lbourdois committed
Commit 7d953e6
· 1 Parent(s): 1aeb57e

Update README.md

Files changed (1)
  1. README.md +22 -2
README.md CHANGED
@@ -1,10 +1,30 @@
  ---
  title: SSM Blog Posts
- emoji: 💻
  colorFrom: purple
  colorTo: yellow
  sdk: static
  pinned: false
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
  ---
  title: SSM Blog Posts
+ emoji: 📝
  colorFrom: purple
  colorTo: yellow
  sdk: static
  pinned: false
  ---

+ <b><p style="text-align: center; color:red">A French version is available on my [blog](https://lbourdois.github.io/blog/ssm/)</p></b>
+ <br>
+
+ On October 7, 2021, while wondering whether [AK](https://hf.co/akhaliq) was a bot or a human, I saw one of his [tweets](https://twitter.com/_akhaliq/status/1445931206030282756):
+ a link to a publication on [open-review.net](https://openreview.net/forum?id=uYLFoz1vlAC), accompanied by the following image:
+
+ <img src="https://cdn-uploads.huggingface.co/production/uploads/613b0a62a14099d5afed7830/QMpNVGwdQV2jRw-jYalxa.png" alt="alt text" width="800" height="450">
+
+
+ Intrigued by the announced results, I went to read what this S3 model was about; it would be renamed [S4](https://twitter.com/_albertgu/status/1456031299194470407) less than a month later ([link](https://github.com/lbourdois/blog/blob/master/assets/efficiently_modeling_long_sequences_s3.pdf) to the version from when it was still called S3, for those interested).
+
+ It is the only scientific article that has ever given me goosebumps, I found it so beautiful. At the time, I was convinced that State Space Models (SSMs) would replace transformers within a few months. Two years later, faced with the tidal wave of LLMs dominating the NLP news, I have to admit I was completely mistaken.
+
+ Nevertheless, on Monday, December 4, 2023, the announcement of Mamba by [Albert Gu](https://twitter.com/_albertgu/status/1731727672286294400) and [Tri Dao](https://twitter.com/tri_dao/status/1731728602230890895) aroused renewed interest. The phenomenon was amplified four days later with the announcement of [StripedHyena](https://twitter.com/togethercompute/status/1733213267185762411) by Together AI.
+ A good opportunity for me to write a few words about SSM developments over the past two years.
+
+ I'm planning three articles to start with, whose aim is to illustrate the basics of SSMs through S4 (the "Attention is all you need" of the field) before reviewing the literature on the evolution of SSMs since that first paper:
+ - [Introduction to SSM and S4](WIP)
+ - [SSM evolutions in 2022](WIP)
+ - [SSM developments in 2023](WIP)
+ Later, time permitting, I hope to go into detail about the architectures of some specific SSMs, with animations ✨
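
As a small preview of the basics the first article is meant to cover, here is a minimal NumPy sketch of the formulation S4 builds on: the linear state space model x'(t) = A x(t) + B u(t), y(t) = C x(t), discretized with the bilinear transform and unrolled as a recurrence. This is a toy illustration added for context, not S4 itself (no HiPPO initialization, no convolution kernel); the matrices, input, and step size are arbitrary choices.

```python
import numpy as np

# Continuous-time SSM: x'(t) = A x(t) + B u(t), y(t) = C x(t).
# Discretize (A, B) with the bilinear (Tustin) transform, then
# unroll x_k = Ab x_{k-1} + Bb u_k, y_k = C x_k over the input.

def discretize(A, B, step):
    """Bilinear transform: map continuous (A, B) to discrete (Ab, Bb)."""
    I = np.eye(A.shape[0])
    inv = np.linalg.inv(I - (step / 2) * A)
    Ab = inv @ (I + (step / 2) * A)
    Bb = inv @ (step * B)
    return Ab, Bb

def ssm_scan(Ab, Bb, C, u):
    """Run the discrete recurrence over a 1-D input sequence u."""
    x = np.zeros(Ab.shape[0])
    ys = []
    for u_k in u:
        x = Ab @ x + (Bb * u_k).ravel()
        ys.append(float(C @ x))
    return np.array(ys)

# Toy 2-state example on a constant input (arbitrary values).
A = np.array([[-1.0, 0.0], [1.0, -2.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.5, 0.5]])
u = np.ones(16)
y = ssm_scan(*discretize(A, B, step=0.1), C, u)
print(y.shape)  # (16,)
```

Because A here is stable (negative eigenvalues), the output settles toward a steady state; the recurrence view is what makes SSMs fast at autoregressive inference, while S4 additionally exploits an equivalent convolution view for training.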