Old 19.03.2025, 20:42   #417
The Exiled
McG
Registration date: Feb 2014
Location: Varaždin
Posts: 8,224
And as the icing on the cake, the expert of all experts, the freshly former Intel CEO who, among other things, made a name for himself with his wonderful statements, promises, analyses and predictions, is once again repeating that nVidia simply "got lucky". Honestly, I don't know why they even invited him to GTC, because the old man just keeps banging the same drum, stubbornly repeating all sorts of alternative facts while thinking he's very clever, and nostalgically reminiscing about all his screw-ups at Intel. nVidia got lucky, AMD is in Intel's rear-view mirror, and ARM is irrelevant in the overall market: all plain statements of fact whenever engineer Gelsinger speaks up.

Quote:
Pat Gelsinger repeats observation that nVidia CEO "got lucky" with AI industry boom
Quote:
Pat Gelsinger has quite bravely stepped into the belly of the beast this week. The former Intel boss was an invited guest at nVidia's GTC 2025 conference, currently taking place in San Francisco, California. Technology news outlets have extracted key quotes from Gelsinger's musings during an in-person appearance on Acquired's "Live at GTC" video podcast. In the past, the ex-Team Blue chief held the belief that nVidia was "extraordinarily lucky" with a market leading position. Yesterday's panel discussion provided a repeat visit, where Gelsinger repeated his long-held opinion: "the CPU was the king of the hill, and I applaud Jensen for his tenacity in just saying, 'No, I am not trying to build one of those; I am trying to deliver against the workload starting in graphics.' You know, it became this broader view. And then he got lucky with AI, and one time I was debating with him, he said: 'No, I got really lucky with AI workload because it just demanded that type of architecture.' That is where the center of application development is (right now)."

The American businessman and electrical engineer reckons that AI hardware costs are climbing to unreasonable levels: "today, if we think about the training workload, okay, but you have to give away something much more optimized for inferencing. You know a GPU is way too expensive; I argue it is 10,000 times too expensive to fully realize what we want to do with the deployment of inferencing for AI and then, of course, what's beyond that." Despite the "failure" of a much older Intel design, Gelsinger delved into some rose-tinted nostalgia: "I had a project that was well known in the industry called Larrabee and which was trying to bridge the programmability of the CPU with a throughput oriented architecture (of a GPU), and I think had Intel stay on that path, you know, the future could have been different...I give Jensen a lot of credit (as) he just stayed true to that throughput computing or accelerated (vision)." With the semi-recent cancellation of the "Falcon Shores" chip design, Intel's AI GPU division is likely regrouping around its next-generation "Jaguar Shores" project; industry watchdogs reckon that this rack-scale platform will arrive in 2026.
Source: TechPowerUp
__________________
AMD Ryzen 9 9950X | Noctua NH-U12A chromax.black | MSI MAG B650 Tomahawk Wi-Fi | 128GB Kingston FURY Beast DDR5-5200 | 256GB AData SX8200 Pro NVMe | 2x4TB WD Red Plus | Fractal Define 7 Compact | Seasonic GX-750
AMD Ryzen 5 7600 | Noctua NH-U12A chromax.black | MSI MAG B650 Tomahawk Wi-Fi | 128GB Kingston FURY Beast DDR5-5200 | 256GB AData SX8200 Pro NVMe | 2x12TB WD Red Plus | Fractal Define 7 Compact | eVGA 650 B5