Meta's Llama 3.1 vs. GPT-4o 🤯 // OpenAI's own AI chips 🧐 // SlowFast-LLaVA for Video LLMs 🎬

Author: Earkind | July 23, 2024 | Duration: 14:06

Meta's upcoming Llama 3.1 models could outperform the current state-of-the-art closed-source LLM, OpenAI's GPT-4o.

OpenAI is planning to develop its own AI chip to optimize performance and potentially supercharge its progress towards AGI.

Apple's SlowFast-LLaVA is a new training-free video large language model that captures both detailed spatial semantics and long-range temporal context in video without exceeding the token budget of commonly used LLMs.

Google's Conditioned Language Policy (CLP) is a general framework that builds on techniques from multi-task training and parameter-efficient finetuning to develop steerable models that can trade off multiple conflicting objectives at inference time.

Contact: sergi@earkind.com

Timestamps:

00:34 Introduction

01:28 LLAMA 405B Performance Leaked

03:01 OpenAI Wants Its Own AI Chips

04:25 Towards more cooperative AI safety strategies

06:01 Fake sponsor

07:35 SlowFast-LLaVA: A Strong Training-Free Baseline for Video Large Language Models

09:17 AssistantBench: Can Web Agents Solve Realistic and Time-Consuming Tasks?

10:56 Conditioned Language Policy: A General Framework for Steerable Multi-Objective Finetuning

12:46 Outro


Each morning, GPT Reviews serves up a fresh, slightly chaotic conversation about everything happening in artificial intelligence. This daily podcast from Earkind is actually crafted by AI, offering a unique blend of the latest headlines, major announcements, and intriguing research plucked from sources like arXiv. But it's far from a dry briefing.

The dynamic comes from its four distinct hosts: Giovani Pete Tizzano brings relentless optimism as an AI enthusiast, while Robert, the analyst, provides a grounded and often skeptical counterpoint. Olivia, who's deeply embedded in online communities, shares the buzz and broader reactions, and Belinda, the witty research expert, helps unpack the technical details with clarity and a sharp sense of humor.

Tuning in feels like dropping into a lively roundtable where complex ideas are debated, explained, and occasionally laughed about. You'll get a comprehensive yet digestible overview of the AI landscape, all wrapped in a format that's as entertaining as it is informative. The result is a consistently engaging listen that keeps you updated without feeling like homework, making it a standout in the daily news podcast space.
Language: English | Episodes: 100
