GPT Reviews
Meta's Training and Inference Accelerator promises significant performance improvements for AI workloads.
Avi Wigderson receives the Turing Award for his contributions to the theory of computation and randomness in computation.
Intel's Meteor Lake iGPU and Mistral 8x22B offer exciting advancements in the GPU market and language models.
MuPT introduces a generative pretrained transformer for symbolic music, while Eagle and Finch advance RWKV-based sequence modeling.
Contact: sergi@earkind.com
Timestamps:
00:34 Introduction
01:37 Our next-generation Meta Training and Inference Accelerator
05:05 Intel's Ambitious Meteor Lake iGPU
06:06 Mistral 8x22B
07:34 Fake sponsor
09:34 MuPT: A Generative Symbolic Music Pretrained Transformer
11:11 Eagle and Finch: RWKV with Matrix-Valued States and Dynamic Recurrence
12:49 Outro