|
Taima posted:It's dawning on me that if I want to have an appropriate cable ready for when this GPU comes I'm probably going to have to roll the dice on some bullshit and hope for the best. sigh. At least I can return it if it doesn't work I guess. I would be very surprised if this random company that primarily makes phone cases has managed to make an actually working HDMI 2.1 optical cable before any reputable brand.
|
# ? Sep 2, 2020 23:03 |
|
|
repiv posted:https://twitter.com/Dachsjaeger/status/1301234503386570757 Is the view distance still really limited compared to normal Minecraft? What's the bottleneck there anyway?
|
# ? Sep 2, 2020 23:03 |
|
Zero VGS posted:I mean that all sounds wishy-washy and I don't care what's going on in Cinebench or whatever. I just wanna see like, how many watts is each system pulling from the wall while running newest Call of Duty benchmark in 4K, and what are the average FPS or whatever. So I can get close to real life gaming. How many watts and what am I really getting out of each. What those graphs show is that, all else being equal, Intel guzzles way more power than AMD, and the gap widens as you overclock. It's not the exact use case of a game, it's actually a WAY tougher one for the CPU. But it proves out the basic principle that Intel burns more power to do the same work. People don't really test the exact case you're describing because extreme gamer websites don't care about differences in CPU power consumption when the GPU is the thing doing all the real work and guzzling the most power anyway. Like, if you're concerned with 4K game performance, you're stuck using the GPU that guzzles 320-350 watts. In that scenario the Intel won't guzzle 235 watts, but the AMD won't need to use 160 or whatever either - both will be less by some amount because it's less work for the CPU either way.
|
# ? Sep 2, 2020 23:03 |
|
sean10mm posted:Like, if you're concerned with 4K game performance, you're stuck using the GPU that guzzles 320-350 watts. In that scenario the intel won't guzzle 235 watts, but the AMD won't need to use 160 or whatever either - both will be less by some amount because it's less work for the CPU either way. Yeah, the end information you're looking for here is: A Ryzen 3600 system will eat about 150-175W gaming, plus whatever the hell the GPU is doing. If you're looking at a 3080 and it's 320W sticker, that means you'll need at least a 550W PSU to run everything. Might be able to get away with a 500W one if you're ok with undervolting to bring power use down a little. If you've got some 450W SFF PSU, then you should be looking at the 3070 and be ecstatic that it'll get you 2080Ti-level performance for $600.
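DrDork's back-of-the-envelope sizing works out like this (a quick sketch using the post's rough estimates, not measured figures):

```python
# Ballpark system power budget from the post above (assumed figures).
cpu_system_w = 175  # Ryzen 3600 system while gaming, upper end of the 150-175W estimate
gpu_w = 320         # RTX 3080 sticker power

total_w = cpu_system_w + gpu_w
print(total_w)  # 495 -> a 550W PSU leaves some headroom; a 450W SFF unit doesn't
```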
|
# ? Sep 2, 2020 23:07 |
|
Mercrom posted:Is the view distance still really limited compared to normal Minecraft? What's the bottleneck there anyway? It's still limited but raising the draw distance doesn't kill performance nearly as much as it used to. The max draw distance they allow with RT enabled is 24 chunks. I'm running around one of the sample worlds and getting 70-80fps at 1440p with 24 chunk draw distance and DLSS enabled, on a 2070 super
|
# ? Sep 2, 2020 23:17 |
|
Taima posted:It's dawning on me that if I want to have an appropriate cable ready for when this GPU comes I'm probably going to have to roll the dice on some bullshit and hope for the best. sigh. At least I can return it if it doesn't work I guess. If that's how you wanna do it you should stick with Amazon (who I believe are cutting some slack on return windows due to Covid, but you can ask in their chat to make sure) or Monoprice, who do honor their warranties with regards to performing to spec (but maybe wait a few weeks to see if they actually push out a 2.1). sean10mm posted:What those graphs show is that, all else being equal, Intel guzzles way more power than AMD, and as you overclock the gap widens. I hear ya, it's just that my total budget is 480 watts, and if the CPU spikes a few seconds longer or a few watts higher than it should, my PC will shut off from an overcurrent condition. Comedy option is that I could go all-out and run a second power supply in parallel so that I have 12v at 80 amps, assuming my silver-coated copper wires don't melt, lol. This is my 12v 40a fanless PSU, by the way: https://www.digikey.com/product-detail/en/mean-well-usa-inc/HEP-600-12/1866-2285-ND/7703841 DrDork posted:If you're looking at a 3080 and it's 320W sticker, that means you'll need at least a 550W PSU to run everything. Might be able to get away with a 500W one if you're ok with undervolting to bring power use down a little. If you've got some 450W SFF PSU, then you should be looking at the 3070 and be ecstatic that it'll get you 2080Ti-level performance for $600. Yeah, so looking at the whitepaper for my PSU, it is rated for 480w constant and claims to have a "105%-125% overcurrent limit". I take that range to mean it is dependent on its temperature, i.e. as long as it stays cool it might be able to get close to 600w. The other thing in my favor is that I believe normal PSUs are rated for how much wattage they can pull from the wall, i.e. a 600w ATX power supply at 90% efficiency can actually only supply 540w to the system, whereas my PSU, which is more of a bench power supply, is made to maintain the watts it is rated for.
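Reading that "105%-125%" range against the 480w continuous rating gives the window being described (a sketch of the arithmetic only; how the HEP-600 actually trips is whatever the datasheet says):

```python
# Overcurrent window implied by a 105%-125% limit on a 480W continuous rating.
rated_w = 480
low_w = rated_w * 105 / 100   # 105% of continuous rating
high_w = rated_w * 125 / 100  # 125% of continuous rating
print(low_w, high_w)  # 504.0 600.0 -> "close to 600w" is the best case
```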
|
# ? Sep 2, 2020 23:19 |
|
Gamers Nexus had exactly this kind of analysis just before the Ampere release actually. https://www.youtube.com/watch?v=X_wtoCBahhM&t=576s tl;dw: doesn't seem like there's much to it once the GPU gets involved.
|
# ? Sep 2, 2020 23:30 |
|
Zero VGS posted:Yeah so looking at the whitepaper for my PSU, it is rated for 480w constant, and claims to have a "105%-125% overcurrent limit". I take that range to mean it is dependent on it's temperature, like as long as it stays cool it might be able to get close to 600w. That overcurrent limit is supposed to be for transient loads. Like so that if your CPU spikes for 1/10th of a second to 490W and then goes back down below 480W it doesn't shut off. You shouldn't be running >480W for any extended time. You're welcome to try and find out, but also keep in mind that in the past Nvidia's TDP numbers have been, uh, something of a white lie. That "280W" 2080Ti would regularly run at 330W, for example. So you could probably keep a 3080 at 320W, but you'd almost certainly be leaving performance on the table by doing so. e; You're also incorrect about the power labeling. PC PSUs are labeled according to what they can supply to the computer, not by how much they draw from the wall. If you take a look at the label on a PC PSU, you'll see it has the various voltage rails and max load amps, and a combined wattage output section. That wattage output is what gets slapped on the side as a "550W" or whatever PSU. Input power is impossible for them to calculate since it depends on load and whether you're using 115 or 230v mains. A Bronze 80% efficient 500W PSU running full tilt would be pulling well over 600W from the wall. So I don't think you're getting any help there. DrDork fucked around with this message at 23:39 on Sep 2, 2020 |
# ? Sep 2, 2020 23:34 |
|
Zero VGS posted:The other thing in my favor is that I believe normal PSU are rated for how much wattage they can pull from the wall, i.e. a 600w ATX power supply at 90% efficiency can actually only supply 540w to the system
|
# ? Sep 2, 2020 23:38 |
|
Llamadeus posted:Pretty sure it works the opposite way, otherwise people would have to be looking up efficiency curves to figure out what PSU they actually needed. Correct. The PSU efficiency rating is how efficiently it converts wall power into the listed output at different load levels. So a 600w 90%-efficiency unit would actually be pulling around 666w at max load. E: Yea what DrDork said.
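The direction of the rating can be sketched like this (idealized single-point efficiency; real efficiency curves vary with load):

```python
def wall_draw(dc_output_w, efficiency):
    """AC power pulled from the wall to deliver dc_output_w to the system."""
    return dc_output_w / efficiency

print(round(wall_draw(600, 0.90)))  # 667, i.e. the ~666w figure for a 90%-efficient 600w unit
print(round(wall_draw(540, 0.90)))  # 600 -> pulling 600w from the wall only delivers ~540w DC
```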
|
# ? Sep 2, 2020 23:49 |
|
Llamadeus posted:Pretty sure it works the opposite way, otherwise people would have to be looking up efficiency curves to figure out what PSU they actually needed. Ah okay guess I was misremembering. At least I can run a parallel PSU if this doesn't cut it. Because the PSU puts out a very precise 12.0 volts I could even wire one PSU directly to the GPU pins with an on/off switch and the other to the PC.
|
# ? Sep 2, 2020 23:51 |
|
Cinara posted:666w at max load. THE WATTAGE OF THE BEAST
|
# ? Sep 2, 2020 23:52 |
|
Is there any appreciable difference between gaming in 1440p and 4K to justify jumping to 4K monitors? I thought 1440p was very impressive compared to 1080p and have been quite happy with it, but everyone seems to be talking up 4K now.
|
# ? Sep 2, 2020 23:55 |
|
repiv posted:who knows, it's a random unaccountable chinese brand that could be buying cables with hdmi 2.0 optical transceivers and selling them as hdmi 2.1 capable I’d try their 2.1 cables based on my experience so far.
|
# ? Sep 2, 2020 23:57 |
|
Kraftwerk posted:Is there any appreciable difference between gaming in 1440P and 4K to justify jumping to 4K monitors? I thought 1440P Was very impressive compared to 1080 and have been quite happy with it but everyone seems to be talking up 4k now. It's a giant difference if you will be gaming on a large format HDMI 2.1 display. Diminishing returns on small displays.
|
# ? Sep 2, 2020 23:58 |
|
Kraftwerk posted:Is there any appreciable difference between gaming in 1440P and 4K to justify jumping to 4K monitors? I thought 1440P Was very impressive compared to 1080 and have been quite happy with it but everyone seems to be talking up 4k now. DLSS 2.0 makes 4K very feasible in the like five games that support it, but 1440p is still the best high-end pick for both quality and performance. 4K people might also have Ultrawide and other weird setups beyond 27". Gaming on a TV is pretty cool, but it's kind of a mess to set up, and Steam Link has been pretty wonky when I tried it.
|
# ? Sep 2, 2020 23:59 |
|
4K high-Hz monitors are still meh because DisplayPort 1.4 doesn't have enough bandwidth to run them without compromising image quality. I'd wait for models with HDMI 2.1 to arrive before investing in one
|
# ? Sep 3, 2020 00:03 |
|
Zero VGS posted:Ah okay guess I was misremembering. At least I can run a parallel PSU if this doesn't cut it. Because the PSU puts out a very precise 12.0 volts I could even wire one PSU directly to the GPU pins with an on/off switch and the other to the PC. If you do this please post a picture. Mostly because I've never seen the style of PSU you linked before, and I would be intrigued to see what your actual setup is. I say this as someone who has so far missed the SFF train because I could never jam everything I wanted into the cases and power envelopes available. I'd love me a Corsair One style deal, but I need 10Gb networking, and their boards don't support TB3 or an additional PCIe card, so...
|
# ? Sep 3, 2020 00:03 |
|
repiv posted:4K high-hz monitors are still meh because DisplayPort 1.4 doesn't have enough bandwidth to run them without compromising image quality What cutoff do you feel is appropriate here? I'm only seeking 4k/120 because my display is 65 inches, where it makes absolute 100% sense (I have another display for work and desktop-specific games). I have to imagine that for a regular, say, 24-30 inch monitor, the diminishing returns would be horrifying. But I must confess that I have not gamed on a 4K monitor of that size, so that's just speculation.
|
# ? Sep 3, 2020 00:15 |
|
Taima posted:What cutoff do you feel is appropriate here? I'm only seeking 4k/120 because my display is 65 inches, where it makes absolute 100% sense (I have another display for work and desktop-specific games). The janky bandwidth limitations are automatically disqualifying for me so I haven't given it much thought. I'll decide whether 4K is worth it once there's 4K monitors without the jank
|
# ? Sep 3, 2020 00:20 |
|
It looks like the 3090 will be impossible to buy at launch - the reliable Twitter leakers have pretty much said it's a paper launch. Given the dumb 8K gaming tagline and how little emphasis they've placed on it, it makes you wonder why Nvidia allowed AIBs to sell 3090s. You have to recall that the Titan SKU has previously been an in-house thing.
|
# ? Sep 3, 2020 00:29 |
|
I'm very curious about reviews of the 3080, mostly for power consumption. I watched the GN video, but I'm still leery about it. My system can probably handle it; it's an 8086K/2070S running under a two-year-old Seasonic Focus Plus Gold 650W. My old Steam machine ran a Haswell i5 and a 780 with a 450W PSU, so that's helped my fears.
|
# ? Sep 3, 2020 00:29 |
|
Be prepared not to get your card till next year. quote:https://www.tweaktown.com/news/7491..._medium=twitter lol, going to be lots of pissed-off people trying to get it ahead of CP2077
|
# ? Sep 3, 2020 00:32 |
|
I mean, if you're someone that's been ignoring the ongoing global pandemic that caused Chinese and Taiwanese manufacturers to completely stop for a period of time, you deserve what you get.
|
# ? Sep 3, 2020 00:37 |
Didn't want that card anyway.
|
|
# ? Sep 3, 2020 00:40 |
|
i think the issue is more for people on Pascal or older cards who've been holding off a GPU purchase because of Ampere - buying a Turing card would have been an even worse choice given pricing this might actually drive console sales, lol. doing a paper launch right before CP2077
|
# ? Sep 3, 2020 00:41 |
|
shrike82 posted:Be prepared not to get your card till next year I loving knew it. I was hoping for the best but this pretty much seals my fate. I'll make a last ditch attempt to grab one on launch day and if I get out-sniped by the bots and other hardcorers then I'll just have to tough it out on the 1070.
|
# ? Sep 3, 2020 00:41 |
|
What's the point of having a big hoopla launch event if you won't have poo poo to sell? It's one thing to have problems with 3090 yields limiting it to a trickle at first, but if they can't even manufacture a 3070 it's really a joke. e: I mean it's just one dumb rumor so there's no point in getting worked up over it. sean10mm fucked around with this message at 00:50 on Sep 3, 2020 |
# ? Sep 3, 2020 00:43 |
|
sean10mm posted:What's the point of having a big hoopla launch event if you won't have poo poo to sell? If AMD is going to release cards that at least compete at the 20xx level, and they’re working with suppliers who can actually supply, then this means Nvidia can get out in front of them with a product that people will wait for.
|
# ? Sep 3, 2020 00:51 |
|
Space Gopher posted:If AMD is going to release cards that at least compete at the 20xx level, and they’re working with suppliers who can actually supply, uhhhh let me just stop you there
|
# ? Sep 3, 2020 00:52 |
|
if it's driven by concerns about yields on a "new" node, better to launch soon than later, and shake the bugs out
|
# ? Sep 3, 2020 00:53 |
|
sean10mm posted:What's the point of having a big hoopla launch event if you won't have poo poo to sell? Yeah, time to take a deep breath and realize with the absence of information from Nvidia people are trying to cash in on all the hype (And our worst case scenario assumptions) to drive as many clicks as possible and profit from the buzz.
|
# ? Sep 3, 2020 00:54 |
|
https://twitter.com/JayzTwoCents/status/1301271705503936515?s=20 lol
|
# ? Sep 3, 2020 01:07 |
|
MSI said similar in their stream earlier today.
|
# ? Sep 3, 2020 01:11 |
|
Are there services that loan you a 2080 until you can get the 3080?
|
# ? Sep 3, 2020 01:15 |
|
So in other words, if you ARE lucky enough to get two, congrats on your "free" Ampere card, as I'm sure the Saudi Princes and billionaire kids of the world will happily pay +200% to have the new hotness.
|
# ? Sep 3, 2020 01:16 |
|
I mean my 1080Ti is perfectly fine but still I wanted the new card lol
|
# ? Sep 3, 2020 01:16 |
|
I don't see what difference this makes literally at all. If there's only X units then there's only X units, it doesn't matter whether NVIDIA makes you pay for them on launch day or 2 weeks before. If they did preorders then people would be f5'ing the page to slam in a preorder at the first second too before they ran through their shipment allocation and sold out.
|
# ? Sep 3, 2020 01:17 |
|
i think it's more an indication of how limited launch quantities are going to be if they don't think allowing preorders makes sense
|
# ? Sep 3, 2020 01:18 |
|
|
shrike82 posted:i think it's more an indication of how limited launch quantities are going to be if they don't think allowing preorders makes sense I guess I don't understand why they wouldn't just backorder it, kind of how Apple does with their launches.
|
# ? Sep 3, 2020 01:19 |