Radeon 7000 Leaks Mean More Than Unfancy Fans And Power Connectors

Radeon 7000 Leaked Images

I’m sure you’ve seen them already. Someone… leaked pictures of the upcoming RDNA3 GPUs, and there’s some interesting information here.

Let’s go over what we know. A Twitter user who goes by the handle HXL leaked images of two AMD Radeon graphics cards, both assumed to be RDNA3 reference boards. The photos offer a top-down view of the fan design and heatsink and a side view featuring the power connectors.

While that’s not much to go on, we can read between the lines.

Did AMD leak RDNA3 images to ease apprehension?

First, let’s address these leaks. The timing is too perfect. The Nvidia GeForce 4090 has been under fire recently, and not in a good way. GeForce 4090 power adapters are melting and becoming a fire risk.

Nvidia initially pushed back on the tech press, stating that everything was working within spec, but reports continued to surface from GeForce 4090 users experiencing the same issues. Many of these reports have been independently verified by other outlets.

Igor’s Lab reported that the metal film bridging the adapter’s power pins is too thin. Before Igor’s report, JayzTwoCents, a popular tech YouTuber, speculated that the failure point was most likely poor contact between the power supply pins.

When electrical components don’t make good contact, the connection adds resistance. Resistance creates heat, and based on Igor’s findings, thin metal and poor construction are causing Nvidia’s power adapters to fail in a blazingly hot way.
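To put rough numbers on that, here’s a back-of-the-envelope sketch. Every value in it is an illustrative assumption (600 watts at 12 volts across six current-carrying pins, plus a few guessed contact resistances), not a measurement:

```python
# Back-of-the-envelope: heat from contact resistance in a GPU power connector.
# All values are illustrative assumptions, not measurements.

TOTAL_WATTS = 600.0  # worst-case adapter rating
VOLTS = 12.0         # the 12 V rail
PINS = 6             # current-carrying 12 V pins in a 12VHPWR-style connector

amps_per_pin = (TOTAL_WATTS / VOLTS) / PINS  # 50 A total, ~8.3 A per pin

for milliohms in (1, 5, 20):  # good, mediocre, and bad contact
    resistance = milliohms / 1000.0
    heat = amps_per_pin ** 2 * resistance  # P = I^2 * R, per pin
    print(f"{milliohms:>2} mΩ contact -> {heat:.2f} W of heat in one tiny pin")
```

Even a watt or so concentrated in a contact patch the size of a pinhead is enough to cook plastic over time, which lines up with the thin-metal findings.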

It’s reasonable to assume AMD wanted to squash the speculation before it had a chance to fester. After the reports of Nvidia’s power connector issues, many people in the tech scene wondered whether the upcoming AMD GPUs would suffer the same fate. If these leaked photos do indeed show RDNA3 reference design boards, that should put many PC enthusiasts’ fears to rest. The new RDNA3 GPUs do not use the same power connectors as the GeForce 4090 – RDNA3 is still rockin’ the classic 8-pin PCIe connectors.

It should be noted that reputable third-party power adapter manufacturers do not appear to have the same issues as Nvidia’s adapter. While this may seem like good news, I would like to express cautious optimism. There’s a lot of power being pumped through these adapters, and it’s still early days.

More Powaaaah!

As others have pointed out, the power connectors displayed in the supposed RDNA3 leaks don’t support the same power draw as Nvidia’s. They top out at a theoretical 375 watts: 150 watts from each of the two 8-pin connectors, plus 75 watts from the PCIe slot. That may not be all doom and gloom for AMD, however.
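Those numbers come straight from the PCIe power spec, so the math is easy to sanity-check. A quick sketch, assuming the leaked boards really do show two 8-pin connectors:

```python
# Theoretical board power from a given PCIe connector layout (spec values).
PCIE_SLOT_WATTS = 75   # deliverable through the PCIe slot itself
EIGHT_PIN_WATTS = 150  # per 8-pin PCIe power connector
SIX_PIN_WATTS = 75     # per 6-pin PCIe power connector

def max_board_power(eight_pins=0, six_pins=0):
    """Maximum theoretical power draw for a connector layout, in watts."""
    return PCIE_SLOT_WATTS + eight_pins * EIGHT_PIN_WATTS + six_pins * SIX_PIN_WATTS

print(max_board_power(eight_pins=2))  # 375 -- the assumed RDNA3 reference layout
```

Compare that to the 600 watts Nvidia’s adapter can theoretically deliver, and the difference in design philosophy is obvious.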

The RDNA architecture, as Dr. Lisa Su (AMD’s CEO) puts it, has aged like fine wine. Unlike Nvidia, AMD is trying to be wattage-responsible. Both RDNA and RDNA2 have been more power efficient than Nvidia’s offerings while delivering impressively comparable results.

It’s widely rumored that AMD is going to boost rasterization performance, too. The ray tracing crown still goes to Nvidia, however, and likely will for some time.

Either way, for gamers waiting to see what AMD’s new toys look like, this Christmas season could be exciting.

Less Powaaaah!

Those aren’t the most interesting tidbits for me, however. Don’t get me wrong. I love some juicy tech gossip as much as the next person, but I’ll need a bit more fluff for my sandwich.

Here’s where it gets interesting to me. Those power connectors are telling. AMD wants to continue its trend of being power budget-friendly, and if Nvidia isn’t careful, AMD may eat Nvidia’s lunch because of it.

When the 4090 info leaked months back, everyone in the tech scene balked at how much power it was suspected of using. Of course, we now know that the nominal power draw for a GeForce 4090 is about 450 watts, though its power adapter can support up to 600 watts. Nonetheless, the GeForce 4090 quickly became a meme.

I also recently wrote an article arguing that the GeForce 4090 isn’t for gamers. Though I framed that argument with a bit of flame bait (go back and read the article), I still contend that the GeForce 4090 isn’t for gamers. Instead, it’s a high-end non-Quadro professional card that gamers and PC enthusiasts can easily access. It certainly helps Nvidia’s image, too. We’ve all been talking about the 4090 non-stop since it launched (for better or worse).

I’m going to take the same position on AMD’s flagship cards – though they probably won’t have the same shock value the 4090 did. Whether or not AMD can come close to matching the 4090’s performance doesn’t matter. Nvidia beat them out of the gate. We’re expecting an equal competitor, and anything less will be met with disappointment.

It Always Comes Back To Business…

However, like Nvidia’s, AMD’s RDNA3 architecture will spill over into the enterprise world. With plenty of performance headroom and less power draw, choosing AMD over Nvidia at scale will be a no-brainer for data centers.

There are multiple reasons why. The biggest, as I’ve alluded to, is power. Power budgeting and environmental management in a data center are, for lack of a better word, a bitch. That much is universally understood. Data centers are giant, hot, power-hungry beasts.

Data centers will choose AMD over Nvidia in this department. AMD is already starting to pull ahead in the data center. Its Instinct GPU accelerators and Epyc CPUs are very efficient for the workloads applied to them.

I Can’t Stop Talking About Skynet!

Nvidia has a massive lead in machine learning. The industry is practically built around CUDA, but no one likes building for it. I can’t attest to this myself; I’ve only ever used pre-trained models and libraries for my ML implementations.

AMD is making significant advancements with ROCm, its ML and GPU compute platform, though. While ROCm is slightly more challenging to get up and running from a basic dev’s point of view, it works great. The biggest issue I experienced playing with ROCm was installing Ubuntu (I’m an Arch fan). Otherwise, AMD’s ROCm libraries were easy to install, and they ran without a hitch on my little Asus Strix with a Radeon RX 6800M.
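For the curious, here’s roughly what “it just works” looks like once the stack is installed. This is a minimal smoke test using the ROCm build of PyTorch, which (somewhat confusingly) exposes AMD GPUs through the torch.cuda API; the environment-variable hint in the fallback is a common workaround for consumer RDNA2 cards, so treat it as an assumption about your particular setup:

```python
# Minimal smoke test for a ROCm-enabled PyTorch install.
# ROCm builds of PyTorch expose AMD GPUs through the torch.cuda namespace.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")  # maps to the Radeon GPU under ROCm
    print("GPU:", torch.cuda.get_device_name(0))

    # A small matrix multiply to confirm the GPU actually does the work.
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    print("Result lives on:", (a @ b).device)
else:
    # Consumer RDNA2 cards often need HSA_OVERRIDE_GFX_VERSION=10.3.0 set.
    print("No ROCm-visible GPU found; check your drivers.")
```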

If you haven’t heard the AI fuss yet, it’s a big thing. People love the new updates to ROCm, and from what I’ve been researching, it’s designed the way ML engineers and data scientists want it to be. That’s priming AMD to become an ML powerhouse in the near future.

AV1 Has Entered The Chat

Another big thing is AV1 encoding support. AV1 is blowing up in the tech media, and there’s a reason why: it’s incredibly efficient for encoding, storing, and decoding video. Video takes a lot of processing power and storage to manage.
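If you want to try it yourself, an AV1 re-encode is a one-liner with any ffmpeg build that includes the SVT-AV1 encoder. The file names and quality settings below are illustrative starting points, not tuned recommendations:

```python
# Re-encode an H.264 source to AV1 using ffmpeg's SVT-AV1 encoder.
# Requires an ffmpeg build with libsvtav1; file names are hypothetical.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input_h264.mp4",  # hypothetical source file
    "-c:v", "libsvtav1",     # SVT-AV1 video encoder
    "-crf", "35",            # quality target (lower = higher quality)
    "-preset", "8",          # speed/efficiency trade-off (0 = slowest)
    "-c:a", "copy",          # pass the audio stream through untouched
    "output_av1.mkv",
], check=True)
```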

It’s not just a YouTube or TikTok problem, either. Video has become an essential communication method for all businesses. Ask me how much some of my clients spend on S3 storage each month to store video files… It’s an insane amount.
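To make that concrete with loudly hypothetical numbers: AV1 is commonly cited as saving somewhere around 30 percent of bitrate versus H.264 at comparable quality. Applied to a made-up 500 TB video library at roughly current S3 Standard pricing, the savings recur every single month:

```python
# Hypothetical monthly S3 savings from re-encoding an H.264 library to AV1.
# Library size, savings ratio, and price are all illustrative assumptions.
LIBRARY_TB = 500             # made-up video library size
PRICE_PER_GB_MONTH = 0.023   # approximate S3 Standard rate, first tier
AV1_BITRATE_SAVINGS = 0.30   # rough AV1 vs. H.264 savings at similar quality

library_gb = LIBRARY_TB * 1024
h264_bill = library_gb * PRICE_PER_GB_MONTH
av1_bill = h264_bill * (1 - AV1_BITRATE_SAVINGS)
print(f"H.264 library: ${h264_bill:,.0f}/month")
print(f"AV1 library:   ${av1_bill:,.0f}/month (${h264_bill - av1_bill:,.0f} saved)")
```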

AV1 encoding support will be huge in the next few years as more hardware support is introduced. H.264 will slowly fade, and HEVC (H.265) will die on the vine despite Apple’s best efforts.

Big Things On AMD’s Horizon

See how I managed to draw that out? I love me some juicy tech gossip.

AMD is putting itself in a position to steal Nvidia’s crown. Though these recent RDNA3 leaks don’t seal the deal, they’re the icing on an already tasty cake.

Nvidia has been messing up lately. Nvidia’s CEO, Jensen Huang, has made it clear that he doesn’t like working with partners. Nvidia wants to go the Apple route. While that works for Apple, I can guarantee it’s a mistake here. Nvidia needs board partners as much as AMD does. Neither AMD nor Nvidia can support the vast array of use cases for their products on their own; they need partners to fill in the gaps. The Apple model doesn’t work in this case for the simple reason that GPUs are not self-contained products like an iPhone. They are components used to build bigger things, and they need to adapt accordingly.

Nvidia is setting the world on fire, and not in a good way. While the 4090 power connector issue will likely fade into a meme over time as early adopters slowly grow numb to the pain, there will always be a hint of it in the back of our minds. Ten years from now, we’ll be like, “Remember when the 4090 caught on fire?” That kind of negative sentiment isn’t good over the long haul.

Whether salespeople want to admit it or not, negative sentiment impacts product recognition and brand trust for a long time. IT folks will remember this when they purchase a fleet of GPUs for their workstations or data centers. AMD appears to be heading off power delivery issues before they ever have a chance to occur.

Nvidia holds the crown in AI, but they’ve pissed off people along the way. AMD took their sweet ass time taking ML seriously, but they are firing on all cylinders today. ROCm is shaping up to be the future.

Data centers are already preferring AMD products over Nvidia’s. AV1 support, ROCm, and power budgeting will only encourage data centers to keep adopting AMD in the future. And even though AMD is fending off both Intel and Nvidia in the enterprise, Dr. Lisa Su seems poised for the challenge. It looks like she knows how to play the long game.

Let’s End On A Gaming Note

Tying this all back to gaming, AMD is poised to take cloud gaming by storm. Google Stadia and Microsoft’s Xbox Cloud Gaming are built on top of AMD tech, and AMD supports all the right things for cloud gaming (video encoding, AI, and so on).

Nvidia recently put GeForce Now on a 40% discount. That suggests Nvidia is trying to attract as many Stadia customers to GeForce Now as possible before they join the Xbox ecosystem instead. It doesn’t hurt Microsoft that about half of Xbox Series S customers are new to Xbox.

What are your thoughts? Agree or disagree? Let me know why.

