🎙️ Episode 264 • 04:10 • May 1, 2026
Mistral Medium 3.5: 128B Open-Weight Frontier Coder
Listen to this episode
AI-generated discussion by Alex and Jamie
About this episode
Join Alex and Jamie as they discuss Mistral Medium 3.5, a 128B open-weight frontier coder, in this episode of the Nerd Level Tech AI Cast.
Transcript
[Alex]: Welcome back, tech enthusiasts! You’re listening to the "Nerd Level Tech AI Cast," where we dive deep into the circuits of the latest tech advancements. I’m Alex, here to break down the complex.

[Jamie]: And I’m Jamie, here to keep Alex from geeking out too hard and to ask all the questions you’re probably thinking! Today, we’re talking about Mistral Medium 3.5, a 128-billion-parameter AI beast that’s changing the coding game. Alex, ready to unpack this?

[Alex]: Absolutely, Jamie. Let’s get into it!

[Alex]: So, Mistral Medium 3.5 was released just last April, and it’s a dense model, meaning every single one of its 128 billion parameters is active in processing, unlike some other models that only activate a portion of their parameters.

[Jamie]: Hold up, dense like my morning smoothie? What’s the benefit of having all parameters active?

[Alex]: Good analogy! Yes, like your chunky smoothie, having all parameters active makes the AI more robust per computation. It’s heavier on memory, but it offers predictability and efficiency, especially when deployed on smaller setups like four GPUs.

[Jamie]: Four GPUs? That sounds more like a gaming rig than a server farm!

[Alex]: Exactly! And that’s the beauty of it. Mistral claims you can host this AI on as few as four GPUs with decent performance. It’s like having a powerhouse in your backyard.

[Alex]: Moving on, Medium 3.5 has a 256K-token context window. That’s like being able to remember the last 256,000 tokens it saw, which is great for complex tasks like handling large codebases or detailed documents in a single go.

[Jamie]: That’s a lot of tokens! So, it’s not just about raw power; it’s also about memory?

[Alex]: Spot on. And when it comes to inputs, it’s not just text. This model can handle images too, though it outputs only text. The flexibility here is fantastic for tasks that need to analyze both text and visual data.

[Jamie]: Like reading a comic book and describing it? Neat!

[Alex]: Exactly.
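Editor's note: Alex's "four GPUs" claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below estimates weight memory for a dense 128B-parameter model at two common inference precisions; the figures cover weights only (KV cache for a 256K context adds more) and are illustrative, not Mistral's official hardware guidance.

```python
# Rough VRAM estimate for hosting a dense 128B-parameter model across
# four GPUs with tensor parallelism. Weights only; KV cache, activations,
# and framework overhead are ignored for simplicity.

PARAMS = 128e9          # 128 billion parameters (dense: all active per token)
BYTES_PER_PARAM = {     # common inference precisions
    "bf16": 2,
    "fp8": 1,
}
NUM_GPUS = 4

for precision, nbytes in BYTES_PER_PARAM.items():
    total_gb = PARAMS * nbytes / 1e9          # total weight memory
    per_gpu_gb = total_gb / NUM_GPUS          # even split across GPUs
    print(f"{precision}: ~{total_gb:.0f} GB total, ~{per_gpu_gb:.0f} GB per GPU")
```

At bf16 that works out to roughly 64 GB of weights per GPU, which fits on 80 GB-class accelerators, and fp8 halves it, which is why a four-GPU deployment is plausible for a dense model of this size.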
[Alex]: Now, on benchmarks, while it’s not topping the charts against behemoths like DeepSeek V4-Pro, it holds its own with a score of 77.6 on SWE-Bench.

[Jamie]: Not the top scorer, huh? But still, pretty good for something you can run on a few GPUs at home.

[Alex]: Right. And it’s half the API cost of some of its competitors, which makes it a cost-effective option for developers.

[Jamie]: Now, what’s this about the Modified MIT license? Sounds like legal mumbo-jumbo.

[Alex]: It does sound a bit dry, but it’s crucial. Mistral Medium 3.5 is released under a Modified MIT license, which is pretty open except for very large companies, who need to negotiate separate terms. It’s a way to keep the tech accessible while also protecting the creator’s interests.

[Jamie]: So, it’s like saying, "Hey, everyone can use it, but if you’re really big, let’s talk details"?

[Alex]: Exactly that. And speaking of using it, Mistral also updated products like Le Chat and Mistral Vibe to use this new model.

[Jamie]: So, they eat their own dog food.

[Alex]: They do, and it seems to taste good! [LAUGHS]

[Jamie]: Before we wrap up, let’s touch on the EAGLE draft head. That sounds fancy. What does it do?

[Alex]: The EAGLE draft head is designed for low-latency work, improving how quickly the model can start giving useful output. It’s a bit technical, but think of it as the turbo button on your old PC, making things faster when you really need speed.

[Jamie]: Ah, hitting the nitro! I love it.

[Alex]: Exactly!

[OUTRO MUSIC FADES IN]

[Alex]: Well, that’s a wrap on today’s episode. Thanks for diving into the world of AI with us.

[Jamie]: And remember, whether it’s dense models or smoothies, we’ll keep breaking it down for you here at Nerd Level Tech AI Cast. Thanks for tuning in, and see you next time!

[OUTRO MUSIC FADES OUT]
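Editor's note: for listeners curious about the EAGLE draft head Alex mentioned, the underlying idea is speculative decoding: a cheap draft component proposes several tokens ahead, and the full model verifies them in a single pass, emitting all accepted tokens at once. The toy sketch below illustrates only that control flow; `draft_propose` and `target_verify` are hypothetical stand-ins, not Mistral's actual implementation.

```python
import random

random.seed(0)  # deterministic toy run

def draft_propose(prefix, k=4):
    """Cheap draft step: propose k candidate tokens (hypothetical stub)."""
    return [f"tok{len(prefix) + i}" for i in range(k)]

def target_verify(prefix, proposals):
    """Full model checks proposals left to right, keeping tokens until the
    first disagreement (simulated here with an 80% acceptance rate)."""
    accepted = []
    for tok in proposals:
        if random.random() < 0.8:
            accepted.append(tok)
        else:
            break
    return accepted

prefix = []
while len(prefix) < 12:
    proposals = draft_propose(prefix)
    accepted = target_verify(prefix, proposals)
    # On a full rejection, the target model still yields one corrected
    # token, so the loop always makes progress.
    prefix.extend(accepted if accepted else ["tok_corrected"])

print(len(prefix))
```

The speedup comes from the verification pass: checking four drafted tokens costs about as much as generating one, so every accepted draft token is nearly free, which is the "turbo button" effect Alex described.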