04.01.2022., 07:08 | #143
Premium
Registered: Apr 2004
Location: ZG - Dubrava
Posts: 2,177
NVIDIA has really dug deep into its coffers to secure its future and competitiveness with TSMC's 5nm node against the upcoming AMD cards, which will also be built on TSMC's 5nm lithography.
Quote:
NVIDIA is reportedly coughing up big money to TSMC (Taiwan Semiconductor Manufacturing Company) to ensure that it secures plenty of next-gen 5nm wafers for its upcoming Ada Lovelace GPUs that will power the next-gen GeForce RTX 40 series graphics cards.
But in new rumors from MyDrivers, NVIDIA is paying much more than usual to get 5nm wafers from TSMC, with the company reportedly paying TSMC a large $1.64 billion in Q3 2021... and that increases to $1.79 billion in Q1 2022 as it ramps up production of the Ada Lovelace GPUs that will power the next-gen GeForce RTX 40 series.
A larger long-term deal has also reportedly been struck, which would see NVIDIA coughing up $6.9 billion to TSMC in order to secure next-gen 5nm wafers for Ada Lovelace. We've heard before that NVIDIA's next-gen GeForce RTX 40 series GPUs would be made by TSMC -- and not Samsung -- and its upcoming 5nm node.
MyDrivers reports: "According to industry sources, TSMC's requirements for its three major customers Apple, MediaTek, and AMD are relatively low; they do not need to pay too much deposit in advance to stabilize production capacity. Customers like NVIDIA need to pay huge advance payments if they want to obtain 5nm production orders".
Samsung has been making Ampere GPUs for NVIDIA and its GeForce RTX 30 series, but we all know they aren't the most power-efficient GPUs when you get to the high-end. AMD has hit its stride with its RDNA 2 GPU architecture, with TSMC baking their Radeon RX 6000 series chips on the 7nm node -- the same node used to make the semi-custom chips inside of the PlayStation 5 and Xbox Series X consoles -- and they're fantastic with power.
NVIDIA is making a move that it needs to in order to... well, frankly survive. The company makes its bread and butter from GPUs and if it can't sell enough -- especially of the very best, which TSMC will be capable of providing with its new N5 node -- then well, that's not going to be good at all. NVIDIA can't have another misfire as things will be radically different for Team Green in 2022, 2023, and beyond.
AMD will have its next-gen RDNA 3 architecture here with an MCM-based design -- MCM being a multi-chip module, so multiple GPU chiplets on the same card, similar to how the Zen CPU chiplets are on the Ryzen CPUs -- and will offer some truly monster performance on the next-gen Navi 31-based Radeon RX 7900 XT.
Not only that, but the GPU market is going to be completely different this time next year with CPU giant Intel joining the GPU market with its Xe-based Arc Alchemist. Later in the year we'll see how Intel has gone, and what comes next with Intel's Arc GPUs, and then in 2023 we've been hearing that is when things will really ramp for Xe and Arc... and NVIDIA knows that.
It gets worse, as the ongoing pandemic has been wreaking havoc on every single industry, and the tech industry has been slammed into the ground. The entire supply chain is f***ed right now, and Taiwan is in the middle of a shit-flinging fight with China and Japan, with the threat of strikes from China on Taiwan if Japan intervenes in any form.
NVIDIA's next-gen Ada Lovelace GPU and the flagship GeForce RTX 4090 or GeForce RTX 4090 Ti should be capable of 4K 120FPS gaming without a problem, but I think next-gen 8K 60FPS and, better yet, 8K 120FPS gaming will be in NVIDIA's sights. Maybe not raw, but with a next-gen DLSS 3.0 up its sleeve... it's entirely possible, and that is very, very exciting.
AMD's next-gen Navi 31 GPU has recently taped out, the very first MCM-based GPU, which should form the basis of a new flagship Radeon RX 7000 series graphics card in the second half of 2022. NVIDIA will be fighting back with the GeForce RTX 40 series and the new flagship GeForce RTX 4090, which should offer 2x the performance and power consumption of the GeForce RTX 3090.
Read more: https://www.tweaktown.com/news/83634...022/index.html
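As a side note, the quarter-over-quarter jump in the reported prepayments is easy to sanity-check. These are just the rumored figures quoted above, not numbers confirmed by NVIDIA or TSMC:

```python
# Rumored TSMC 5nm prepayments from the MyDrivers report (USD)
q3_2021 = 1.64e9   # reported Q3 2021 prepayment
q1_2022 = 1.79e9   # reported Q1 2022 prepayment

# Relative quarter-over-quarter increase
increase = (q1_2022 - q3_2021) / q3_2021
print(f"Quarter-over-quarter increase: {increase:.1%}")  # about 9.1%
```

So even before the rumored $6.9 billion long-term deal, the reported prepayments were already climbing by roughly 9% between quarters.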