# moe
🚀 LLMs are getting huge. But do we need all that firepower all the time?
Aleksei Aleinikov · Apr 11 · 1 min read
#ai #llm #machinelearning #moe

A Slightly Technical Deep Dive into DeepSeek R1
Abhishek Gautam · Jan 30 · 3 min read
#deepseek #ai #opensource #moe

DBRX, Grok, Mixtral: Mixture-of-Experts is a trending architecture for LLMs
AI/ML API · Apr 11 '24 · 7 min read
#llm #moe #ai