|
change my name posted:Business is so cool, you can just make up whatever numbers you feel like huh (unless some of that is already locked in for next year) It actually proves your point, but fiscal quarters aren't aligned with the calendar year and companies use all sorts of wild dates. Nvidia's Q3 FY24 already closed on October 29th, 2023.
|
# ? Nov 21, 2023 23:24 |
|
change my name posted:Business is so cool, you can just make up whatever numbers you feel like huh (unless some of that is already locked in for next year) Those are numbers they're reporting to have already happened.
|
# ? Nov 21, 2023 23:44 |
|
Cyrano4747 posted:If you've already got a 3090 I really don't see a compelling reason to upgrade this generation at all, unless you're one of those edge cases where you actually need the grunt that a 4090 has. I'm fortunate enough to have the money and I like being able to run high resolutions with all the bells and whistles on. But yeah, I'm thinking it might be pretty hard to justify. I guess we'll see!
|
# ? Nov 22, 2023 00:08 |
|
PCWorld interviews Alex Battaglia about rendering and GPU stuff https://www.youtube.com/watch?v=7TECZHoFBpE
|
# ? Nov 22, 2023 00:26 |
|
Lockback posted:Will depend on a lot. Yeah, the 4070 Ti Super might be the only truly compelling part; if it stays at or near the 4070 Ti's price and also sees the rumored bump in RAM up to 16 GB, then it all but eliminates any reason to go with a 4080, and maybe even a 4080 Super, given DLSS/etc. I'd expect it to perform really well at 4K/60 and probably even 4K/120 with DLSS. Edit: Cygni posted:gaming almost back up to crypto bubble levels but holy poo poo that data center... Won't somebody please think of poor Nvidia's revenue and profit? How can they afford to sell their cards so cheap? :qq:
|
# ? Nov 22, 2023 00:26 |
|
Cygni posted:gaming almost back up to crypto bubble levels but holy poo poo that data center... Data Center includes AI stuff correct?
|
# ? Nov 22, 2023 02:21 |
|
MH Knights posted:Data Center includes AI stuff correct? Yeah
|
# ? Nov 22, 2023 03:14 |
|
Internet Explorer posted:I'm fortunate enough to have the money and I like being able to run high resolutions with all the bells and whistles on. But yeah, I'm thinking it might be pretty hard to justify. I guess we'll see! I have started practicing my justification for the 5090. I think I’m going to be in good shape by the time it releases.
|
# ? Nov 22, 2023 03:47 |
|
See, that's why I stay out of this thread. Y'all are a bad influence. The monitor thread as well.
|
# ? Nov 22, 2023 03:48 |
|
Internet Explorer posted:See, that's why I stay out of this thread. Y'all are a bad influence. The monitor thread as well. Nah, once you get an OLED monitor, that's the end game.
|
# ? Nov 22, 2023 03:49 |
|
The reason why I came to ask about a new graphics card was because I was already pushing the 3090 and then I just picked up the G9 OLED. To go along with my two LG C2s in the house.
|
# ? Nov 22, 2023 03:51 |
|
Internet Explorer posted:The reason why I came to ask about a new graphics card was because I was already pushing the 3090 and then I just picked up the G9 OLED. To go along with my two LG C2s in the house. Realistically, unless you're worried about power consumption (which is totally valid), stick with your 3090 for another generation. The resale value is holding up because of the extra VRAM so I'm sure if you upgrade next gen you'll be able to offset the cost a bit.
|
# ? Nov 22, 2023 03:52 |
|
Internet Explorer posted:The reason why I came to ask about a new graphics card was because I was already pushing the 3090 and then I just picked up the G9 OLED. To go along with my two LG C2s in the house. 3090 + DLSS until the 5000-series is out, then splurge on the 5090, echoing this: change my name posted:Realistically, unless you're worried about power consumption (which is totally valid), stick with your 3090 for another generation. The resale value is holding up because of the extra VRAM so I'm sure if you upgrade next gen you'll be able to offset the cost a bit.
|
# ? Nov 22, 2023 04:14 |
|
Internet Explorer posted:Maybe it's still too early to tell, but how are we feeling about the potential SUPER releases? Going to be something worth buying, or are people who buy it going to be kicking themselves for not waiting for the 5000 series? At least according to the rumor mill, they don't seem like they are going to be released all that far apart. It's not just the Super releases themselves, but also the way the rest of the stack is shuffled around them; some SKUs are going to get price cuts.

4070 Ti Super is going to be the interesting one, as others note. There is a lot of margin in AD103, and the 4080 is the most overpriced card by far; they could absolutely do an $849/$899 4080 and put some marginal 4080 Super right above it at some silly price. The 4070 Ti Super probably takes the 4070 Ti MSRP ($799), but you finally get a real 16GB card.

They will not put AD102 into a 4080 Super, not with the AI market this hot, so if it's 20GB it has to be a new die. In which case maybe $1100 for a 4080 Super 20GB? But probably not. Unless there's some curveball like 20GB, the 4080 Super is probably going to be a massive wet fart.

The 4060 Ti Super also looks meh. AD104 is already way cut down for the 4070 and I just don't see them going much lower; they can't cut off 40-50% of the die for these Supers and still hit performance levels that "make sense". Maybe there's a refresh that moves 4060 Ti performance down to the $329/$379 price points, at most, which would push the 4060 down maybe $20 as well. Supposedly the 4060 Ti 16GB is selling OK for AI stuff too, which argues against big price cuts.

I don't know about having a 4070 Ti Super and also a 4070 Super; that rumor doesn't make sense. There's not enough daylight for a 4070, 4070 Super, 4070 Ti, and 4070 Ti Super, imo. Maybe each Super supersedes the old SKU and they just sell through the inventory to morons/prebuilts, but that's a lot of SKUs in a $200 space.
I agree that I think if you've got a 3090, just wait and buy a 5090, or wait and see what the market looks like then. Paul MaudDib fucked around with this message at 07:57 on Nov 22, 2023 |
# ? Nov 22, 2023 07:12 |
|
I'll just get whatever ends up being the most reasonable $/fps after the Supers drop (maybe even AMD) and that hopefully forces everything to get new pricing. But we'll see, I've been saying I'll get a new GPU since Turing but my 1070 keeps chugging along.
|
# ? Nov 22, 2023 10:56 |
|
Paul MaudDib posted:I don't know about having 4070 ti super and also 4070 super, that rumor doesn't make sense, there's not enough daylight there for 4070, 4070 super, 4070 ti, 4070 ti super imo. maybe it supersedes the old sku and they just sell through the inventory to morons/prebuilts, but, that's a lot of skus in a $200 space. Rumour is they've stopped manufacturing the regular 4070 and 4070 Ti, so it would be just the Super models once stocks run out. Might take a while for the latter unless there are some really good deals, so it still sounds pretty cramped.
|
# ? Nov 22, 2023 11:45 |
|
MarcusSA posted:Nah, once you get an OLED monitor, that's the end game. Until you come back complaining of degradation. We definitely don't have "perfect" display tech yet.
|
# ? Nov 22, 2023 12:18 |
|
HalloKitty posted:Until you come back complaining of degradation. We just need a monitor subscription where they'll automatically send you a new OLED monitor every 5,000 hours or whatever.
|
# ? Nov 22, 2023 12:53 |
|
Arzachel posted:Rumour is they've stopped manufacturing the regular 4070 and 4070ti so it would be just the super models once stocks run out. Might take a while for the latter unless there's some really good deals, so it still sounds pretty cramped Nope, the 4070 Ti is going to be stopped but the 4070 is staying. So the new 4070 line is: 4070 --- 4070 Super --- 4070 Ti Super https://www.dexerto.com/tech/nvidia-leak-says-rumored-4070-super-wont-fully-replace-base-model-2393954/ I would not expect the 4080 to be more than 16GB, but honestly I don't think that's a big deal. For its performance envelope 16GB seems fine. It's not like you get more performance from 20GB vs 16GB, and I would question whether the 4080 has enough oomph to stay relevant long enough for 16GB to be a limiter.
|
# ? Nov 22, 2023 15:34 |
|
I'm already maxing out my 12gb regularly so I don't think 16 is a lot of headroom.
|
# ? Nov 22, 2023 22:45 |
|
FuzzySlippers posted:I'm already maxing out my 12gb regularly so I don't think 16 is a lot of headroom. The question is "doing what?" 1440 vs 4k vs VR vs AI shenanigans etc. is all going to radically change how much vram is enough for someone's use case over whatever kind of timeline you think a video card should be good for.
|
# ? Nov 22, 2023 23:36 |
I would regularly run out of RAM with 16GB playing anything on the Frostbite engine 5+ years ago if I didn't remember to kill all extraneous programs ahead of time. 32 is a very reasonable target these days imo.
|
|
# ? Nov 23, 2023 00:13 |
|
This is about VRAM, not normal RAM. (Which you are right about w/r/t 32GB.) If you want 32 GB of VRAM on something, you'd better be ready to wait a few more years and also be ready to pay $2000.
|
# ? Nov 23, 2023 00:15 |
|
Y'all know this is VRAM, not System RAM, right?
|
# ? Nov 23, 2023 00:16 |
|
The existence of the Xbox Series S is the only reason 8GB GPUs are going to stay usable at all in AAA games. Otherwise the floor for VRAM in the consoles would be the 10GB of fast VRAM in the Series X, and the PS5 can have about 12GB of VRAM if you really squeeze hard. We really should have 10-12GB in GPUs at the low end, but memory controllers and memory chip size limitations put us into this awkward spot. Is anyone betting on the lowly, $329 MSRP 3060 aging weirdly well because they all came with 12GB of VRAM? Hell, what's the story with those 3080 20GB cards? I never saw benchmarks of them: https://www.tomshardware.com/news/geforce-rtx-3080-20gb-gpus-emerge-for-around-dollar575 I wonder what those 3080 20GB sold for new. VRAM is cheap.
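The "memory controller and chip size" constraint is easy to see with a little arithmetic. A rough sketch, assuming standard GDDR6 parts (one chip per 32-bit channel, 1 GB or 2 GB per chip, and optional clamshell mode putting two chips on a channel); `vram_options` is just an illustrative helper, not any real tool:

```python
def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    """Possible VRAM capacities (GB) for a given memory bus width."""
    channels = bus_width_bits // 32  # one GDDR6 chip per 32-bit channel
    opts = set()
    for size in chip_sizes_gb:
        opts.add(channels * size)      # normal: one chip per channel
        opts.add(channels * size * 2)  # clamshell: two chips per channel
    return sorted(opts)

# A 128-bit bus can only give you 4, 8, or 16 GB -- there is no way
# to build a 10 or 12 GB card on it. A 192-bit bus is how the 3060
# ended up with 12 GB.
print(vram_options(128))  # [4, 8, 16]
print(vram_options(192))  # [6, 12, 24]
```

Which is why the "natural" 10-12GB midpoint keeps getting skipped: you either narrow the bus and land on 8GB, or pay for clamshell and jump straight to 16GB.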
|
# ? Nov 23, 2023 00:37 |
|
Just running recent games on max details at ultrawide 1440p had me maxing 12GB of VRAM, and I just moved up to 1600p UW, which I assume is gonna be worse. Games aimed at only the current console gen are hungrier for both VRAM and system RAM, especially when running balls out with additional PC gaming fuckery on top of it, whether that's higher detail/resolution or just having to share memory with random other open poo poo. I've noticed browser tabs getting greedier with VRAM, though I don't know if that's a bug or on purpose.

Crazy game modding can also absolutely blow up your VRAM, since modders use it less efficiently. You can make Skyrim look pretty impressive these days, but doing it requires laughable specs.

Any kind of AI tinkering will blow past 16GB easily, but that's an entirely different thing, and who knows if locally run models are going to be anything that matters over the next decade for people not engaged in actual AI work. I was impressed with what I could do for funzies, but the responsiveness gain from running local is pretty minimal compared to losing all the easily accessible power available remotely. Maybe someone will come up with an interesting usage for local.
|
# ? Nov 23, 2023 00:52 |
SpaceDrake posted:This is about VRAM, not normal RAM. (Which you are right about w/r/t 32GB. ) omg lmao. I saw the 16 and instantly my brain swapped over to system ram not vram, woops. Yeah totally disregard my anecdote.
|
|
# ? Nov 23, 2023 00:53 |
|
FuzzySlippers posted:Just running recent games on max details at ultrawide 1440p had me maxing 12gb VRAM and I just moved up to 1600p UW which I assume is gonna be worse. Sure, but UW isn't exactly apples to apples with normal resolutions; it's significantly more pixels to push. I agree that 8GB is barely adequate today and manifestly inadequate for the future, but I don't think we're quite at 12GB being a hard floor either.
|
# ? Nov 23, 2023 01:02 |
|
resolution mostly affects VRAM usage because it involves higher-resolution textures, I thought, and not because the target framebuffers were larger. if that’s the case then I wouldn’t expect 1600 UW to use much more VRAM than 1440 UW, because they’ll use the same textures and shadowmap resolutions and such?
|
# ? Nov 23, 2023 01:08 |
|
there's no straightforward answer to that because it depends on the streaming/eviction strategy the engine uses, and the assets (how often are textures repeated)
|
# ? Nov 23, 2023 01:38 |
|
But it is safe to say the framebuffer itself in isolation isn't a significant driver of vram consumption anymore.
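A quick back-of-the-envelope supports that. A sketch, assuming half a dozen frame-sized 32-bit buffers (color, depth, G-buffer targets, etc.), which is only a rough stand-in for a real pipeline:

```python
def framebuffer_mb(width, height, targets=6, bytes_per_pixel=4):
    """Approximate total memory for all frame-sized buffers, in MiB."""
    return width * height * bytes_per_pixel * targets / 2**20

uw_1440 = framebuffer_mb(3440, 1440)  # ~113 MB
uw_1600 = framebuffer_mb(3840, 1600)  # ~141 MB
print(f"delta: {uw_1600 - uw_1440:.0f} MB")  # prints "delta: 27 MB"
```

Tens of megabytes of difference between 1440 UW and 1600 UW is noise against a 12GB card; whatever extra VRAM the higher resolution ends up eating mostly comes from texture streaming heuristics, not the buffers themselves.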
|
# ? Nov 23, 2023 02:14 |
|
i don't think framebuffer size was ever that big of a concern really, save for on the xbox 360 where it had to fit into the 10mb of fast memory
|
# ? Nov 23, 2023 02:36 |
|
repiv posted:i don't think framebuffer size was ever that big of a concern really, save for on the xbox 360 where it had to fit into the 10mb of fast memory The Xbox One was stuck at 900p in most games because its 32MB of fast memory couldn't fit all of the frame-sized buffers that a modern graphics pipeline needs. Even MGS5 is stuck at 900p for all eternity on today's vastly more powerful Xboxes.
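The arithmetic on that 32MB of ESRAM is brutal. A sketch, assuming a typical deferred setup of four 32-bit G-buffer targets plus a 32-bit depth buffer (real engine layouts vary, so treat the target count as an assumption):

```python
def gbuffer_mb(width, height, targets=5, bytes_per_pixel=4):
    """Approximate G-buffer + depth footprint, in MiB."""
    return width * height * bytes_per_pixel * targets / 2**20

print(gbuffer_mb(1920, 1080))  # ~39.6 MB -- doesn't fit in 32 MB of ESRAM
print(gbuffer_mb(1600, 900))   # ~27.5 MB -- fits, hence 900p
```

Dropping to 900p is roughly a 30% pixel reduction, which is just enough to squeeze a full deferred pipeline into the fast memory.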
|
# ? Nov 23, 2023 02:39 |
|
oh yeah that too, microsoft couldn't help but make the same mistake twice
|
# ? Nov 23, 2023 02:40 |
|
https://www.tomshardware.com/news/d...sanctioned-tech AMD won't be able to get a leg up in China either; Joe Biden banned the 7900XTX.
|
# ? Nov 23, 2023 03:59 |
|
7900 XT too, but not the 4080? Or we just haven't heard yet.
|
# ? Nov 23, 2023 04:03 |
|
They're measuring the raw computational power, not framerates in Cyberpunk. The 7900XT is a more powerful GPU than the 4080 and over the limit, but the 4080 sneaks in under the limit.
|
# ? Nov 23, 2023 04:07 |
|
Which is to say, that rule is going to get absolutely hilarious in 2025 and beyond unless it gets the poo poo lobbied out of it. Nvidia is not going to appreciate having the 5080 (and maybe even the 5070Ti?) sanctioned off, and AMD is going to be mad about losing upwards of half its catalogue for export.
|
# ? Nov 23, 2023 04:26 |
|
Twerk from Home posted:They're measuring the raw computational power, not framerates in Cyberpunk. The 7900XT is a more powerful GPU than the 4080 and over the limit, but the 4080 sneaks in under the limit. The 7900XTX isn't close to the TPP limit for the export restrictions, and the 4080 is actually higher TPP (though still also well under the TPP limit).

The formula is bits * TFLOPS, and the limit is a max of 4800 TPP in any precision the GPU supports. The 4090 exceeds it with 330 TFLOPS at FP16, for 5280 TPP. The 4080 has 195 TFLOPS at FP16, for 3120 TPP. As noted in the article, the 7900XTX only has a TPP of 1962 at FP16.

NVIDIA's tensor cores count for the bans, which is why their TPP is so much higher across the board. If they are banning the AMD GPUs, it's a special case, unrelated to the standard TPP bans.
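The TPP arithmetic can be checked directly. A sketch using the FP16 TFLOPS figures quoted in the post (the 7900 XTX's 122.6 is back-derived from the stated 1962 TPP, so treat it as approximate):

```python
TPP_LIMIT = 4800  # export-control threshold

def tpp(tflops, bits=16):
    """Total Processing Performance: bit width times TFLOPS at that precision."""
    return bits * tflops

cards = {"RTX 4090": 330, "RTX 4080": 195, "7900 XTX": 122.6}  # FP16 TFLOPS
for name, fp16 in cards.items():
    score = tpp(fp16)
    verdict = "over the limit" if score > TPP_LIMIT else "exportable"
    print(f"{name}: TPP {score:.0f} -> {verdict}")
```

Only the 4090 clears 4800 (5280 vs 3120 and 1962), which is why it alone got caught by the standard rule while both the 4080 and 7900 XTX slide under.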
|
# ? Nov 23, 2023 05:09 |
|
SpaceDrake posted:AMD is going to be mad about losing upwards of half its catalogue for export. Not if they just don't make the upper half of the catalogue anymore!
|
# ? Nov 23, 2023 05:47 |