|
redreader posted:Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock. I was weighing a 1660 v a 3070, but if this is true it will solve my cost:performance needs dilemma. https://www.google.com/amp/s/www.pcgamer.com/amp/nvidia-rtx-3060-rumours-suggest-2080-performance-for-2060-cash/
|
# ? Sep 14, 2020 02:09 |
|
|
|
Howard Phillips posted:What does this mean? Does Arm own the foundries or just design architecture and license it? ARM the company doesn't own any foundries. They own a buttload of IP and research experience, though, and licensing deals out the wazoo. Rinkles posted:is this likely to get approved w/o issue Probably. It's a UK company being bought by a US one, so none of the usual China-involved worries. It doesn't create a monopoly on anything, either. So unless Trump wants to stick his dick in it for some unknowable reason, yeah, it'll probably go through alright.
|
# ? Sep 14, 2020 02:17 |
|
Sphyre posted:I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to There needs to be some kind of disclosure law when filmmakers try to sneak in one or two scenes of double-framerate. There's been a couple movies like Gangster Squad and Rogue One, not good movies in the first place but anyway they are 24fps almost the whole time then BAM one scene they switch to 48fps and I'm actually getting ill. It's super jarring. Mercifully I haven't seen it lately so maybe they got the hint.
|
# ? Sep 14, 2020 02:18 |
|
Craptacular! posted:Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know. His relentless hate-on for AMD is hilarious. How many platforms has he been banned from at this point? CaptainSarcastic fucked around with this message at 02:22 on Sep 14, 2020 |
# ? Sep 14, 2020 02:19 |
|
Zero VGS posted:There needs to be some kind of disclosure law when filmmakers try to sneak in one or two scenes of double-framerate. There's been a couple movies like Gangster Squad and Rogue One, not good movies in the first place but anyway they are 24fps almost the whole time then BAM one scene they switch to 48fps and I'm actually getting ill. It's super jarring. Mercifully I haven't seen it lately so maybe they got the hint. sorry for the derail, but which scenes were that in Rogue One? I don't think I ever noticed.
|
# ? Sep 14, 2020 02:19 |
|
Craptacular! posted:Userbenchmark is bad, unless you’d rather buy an i3 over Threadripper. I wouldn’t use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don’t know. there's no spin with their GPU and (especially) their SSD benchmarks. If you want to see how a GTX 680 compares to a GT 1030 or how a GTX 750 compares to an Iris Pro, there's no actual reviewer who's ever going to run a benchmark of that, so you take what you can get. It's pretty ballpark accurate. (again, remember you were responding to a post using the GPU section, not the CPU section. Bit of a non sequitur.) also, I'm just going to say it, the 1950X loving sucked, it had even more problems than the other first-gen Ryzen poo poo, it looked great in Cinebench and it ran like poo poo in actual programs or (especially) gaming. It was NUMA on a package, it had all kinds of latency problems, it had half-rate AVX2, it was not a great processor. AMD's technique for gluing dies together got way better with Zen2. depending on what you were doing, an 8350K may well have come out on top in consumer applications. 1950X obviously did better in cinebench or other parallel poo poo that didn't care about latency/etc though. Paul MaudDib fucked around with this message at 02:29 on Sep 14, 2020 |
# ? Sep 14, 2020 02:25 |
|
gradenko_2000 posted:sorry for the derail, but which scenes were that in Rogue One? I don't think I ever noticed. I am 90% sure the CGI flyover of the lava planet (Mustafar) was double framerate, though I don't think you can see it in this YouTube clip: https://www.youtube.com/watch?v=smYFSWHTg8Y I felt the same way watching it as I did with the fist-fight scene at the end of Gangster Squad, and a lot of other people caught and commented on that one at the time.
|
# ? Sep 14, 2020 02:44 |
|
Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 Ti? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary, and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away.

Did a bit of googling and this seemed to be a problem with nvidia cards and 2+ monitors running at different resolutions/hz a couple years ago, but it's happening to me with two monitors that are the same. In any case I downloaded nvidia profile inspector, which has a multi display power saver feature that I've enabled, and it's working pretty great; I haven't noticed any problems at all. The card idles at 300/405MHz sometimes.

If I run a game and the card clocks up to gaming speeds, it doesn't drop back down to 300/405MHz anymore, instead sitting around 600/405 on the desktop until I shut a monitor off and turn it back on. Is there another solution to this or is this janky utility something I've just got to live with?
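If you want to watch the downclocking behavior in numbers instead of by fan noise, `nvidia-smi` can report the live clocks. Here's a small sketch that shells out to it and parses the CSV; the helper names are made up for illustration, and it assumes `nvidia-smi` is on your PATH:

```python
import csv
import io
import subprocess

# Query graphics clock, memory clock, and temperature as unit-less CSV.
QUERY = "clocks.gr,clocks.mem,temperature.gpu"

def parse_clock_report(csv_text):
    """Turn nvidia-smi CSV output (header row + one row per GPU)
    into a list of (graphics_mhz, memory_mhz, temp_c) tuples."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    return [tuple(int(v.strip()) for v in row) for row in rows[1:] if row]

def current_clocks():
    # nounits drops the " MHz" / " C" suffixes so the fields parse as ints.
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_clock_report(out)
```

Polling `current_clocks()` in a loop after closing a game would show whether the card really gets stuck around 600 MHz instead of dropping back to its 300/405 idle state.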
|
# ? Sep 14, 2020 02:49 |
|
redreader posted:Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock. Okay. For reference, the 980ti is equivalent to a 1070, and the 1660 is about 15% better than that, so it slots in just under the 1080, and comparisons between the 1080 and the non-ti 980 aren’t even in the same ballpark.
|
# ? Sep 14, 2020 02:51 |
|
my case can barely fit its 1080ti so the 3080 is right out, if i even wanted to drop the cash internet's being frustratingly vague about comparisons between the 1080ti and the 3070 as well
|
# ? Sep 14, 2020 02:54 |
|
Martian Manfucker posted:Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 TI? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away. Just as a test, drop the secondary monitor down to 60 Hz. Different resolutions and refresh rates haven't been a problem in forever, but multiple high refresh displays might be pushing the display controller hard enough that it doesn't want to clock down.
|
# ? Sep 14, 2020 02:54 |
|
Oxxidation posted:my case can barely fit its 1080ti so the 3080 is right out, if i even wanted to drop the cash They are claiming it to be equal to the 2080ti so there's your ballpark
|
# ? Sep 14, 2020 02:57 |
|
Indiana_Krom posted:Just as a test, drop the secondary monitor down to 60 Hz. Different resolutions and refresh rates hasn't been a problem in forever, but multiple high refresh displays might be pushing the display controller hard enough that it doesn't want to clock down. Just gave this a shot and there was no change. Thanks for the suggestion, though.
|
# ? Sep 14, 2020 03:09 |
|
Scarecow posted:They are claiming it to be equal to the 2080ti so there's your ballpark ha, looked into it further and found that my motherboard isn't new enough to handle the 3000 models either, all its connections are pci 3.0 so much for that, then
|
# ? Sep 14, 2020 03:21 |
|
PCI-E is backwards compatible. Intel doesn't even have PCI-E 4.0 support yet. Depending on the game, it's anywhere from no performance difference to a small one - something to mildly care about if you were doing a new build maybe, but it shouldn't be the reason you decide to buy a video card or not.
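To put rough numbers on that: the theoretical ceilings follow straight from the per-lane signaling rates (both generations use 128b/130b encoding), so a 4.0 x16 link is exactly double a 3.0 one. These are spec maxima, and a game has to actually push that much data over the bus before the gap matters. A quick back-of-the-envelope:

```python
# Theoretical x16 link bandwidth: per-lane transfer rate (GT/s) scaled by
# the line-code efficiency (PCIe 3.0 and 4.0 both use 128b/130b encoding).
def x16_bandwidth_gbps(gt_per_s, encoding=128 / 130):
    """Usable GB/s for a 16-lane link (1 GT/s ~ 1 Gbit/s per lane)."""
    lanes = 16
    return gt_per_s * encoding * lanes / 8  # bits -> bytes

gen3 = x16_bandwidth_gbps(8.0)    # PCIe 3.0: 8 GT/s per lane
gen4 = x16_bandwidth_gbps(16.0)   # PCIe 4.0: 16 GT/s per lane
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s")  # ~15.75 GB/s
print(f"PCIe 4.0 x16: {gen4:.2f} GB/s")  # ~31.51 GB/s
```

Either ceiling dwarfs what typical game traffic moves per frame, which is why benchmarks at the time showed only low-single-digit differences.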
|
# ? Sep 14, 2020 03:24 |
|
Martian Manfucker posted:Just gave this a shot and there was no change. Thanks for the suggestion, though. Use MSI Afterburner to make the fans wait till 50C to fire up? More of a bandaid than a fix tho.
|
# ? Sep 14, 2020 03:24 |
|
Craptacular! posted:Okay. For reference, the 980ti is equivalent to a 1070, and the 1660 is about 15% better than that, so it slots in just under the 1080, and comparisons between the 1080 and the non-ti 980 aren’t event in the same ballpark. Lol holy poo poo. Well, I can only imagine how much better it'll be with a 3070! Can't stress how much I'm looking forward to this.
|
# ? Sep 14, 2020 03:26 |
|
Oxxidation posted:ha, looked into it further and found that my motherboard isn't new enough to handle the 3000 models either, all its connections are pci 3.0 The cards support PCI-e 4.0, but work fine on PCI-e 3.0. Current expectations are that you’re probably looking at MAYBE a 5% drop in performance from 4.0 to 3.0, if there’s any at all. Also the 3080 FE is 285mm. Depending on your 1080Ti it might actually fit without issue.
|
# ? Sep 14, 2020 03:26 |
|
Kingnothing posted:The cards support PCI-e 4.0, but work fine on PCI-e 3.0. Current expectations are that you’re probably looking at MAYBE a 5% drop in performance from 4.0 to 3.0, if there’s any at all. nah, the 1080ti is 267 mm and i have maybe 3 or 4 mm in clearance between the card and the drive cages. i asked a family friend to assemble this thing, it must have been a pain in the rear end to fit good to know about the pci connectors, though
|
# ? Sep 14, 2020 03:31 |
|
Martian Manfucker posted:Just gave this a shot and there was no change. Thanks for the suggestion, though. you'll probably want "multi display power saver", it's a part of the nvidia inspector app a friend of mine runs a 144+60hz screen and without forcing the clocks manually with that app, it keeps at high frequency constantly
|
# ? Sep 14, 2020 03:34 |
|
Oxxidation posted:nah, the 1080ti is 267 mm and i have maybe 3 or 4 mm in clearance between the card and the drive cages. i asked a family friend to assemble this thing, it must have been a pain in the rear end to fit Do you know what case it is? Some of them are designed so the drive cage can be moved to another position.
|
# ? Sep 14, 2020 03:36 |
|
repiv posted:Do you know what case it is? Some of them are designed so the drive cage can be moved to another position. it's a be quiet! 800 silent base, this guy right here i really, really like this case e: oh hey i checked a youtube video and it looks like they are removable, this warrants further study Oxxidation fucked around with this message at 03:47 on Sep 14, 2020 |
# ? Sep 14, 2020 03:41 |
|
Yeah the drive cage is split into two sections, the top one comes out to make room for longer graphics cards Just about any card should fit with that removed, and you still get four HDD slots with just the lower section repiv fucked around with this message at 03:54 on Sep 14, 2020 |
# ? Sep 14, 2020 03:50 |
|
Oxxidation posted:it's a be quiet! 800 silent base, this guy right here You sure about the size of your GPU? Their documentation sucks or is difficult to find, but according to pretty much every site I looked at, GPU clearance WITH the drive cage installed is 290mm (400mm with it out) https://www.gamersnexus.net/hwreviews/1930-be-quiet-silent-base-800-review
|
# ? Sep 14, 2020 03:53 |
|
Oxxidation posted:it's a be quiet! 800 silent base, this guy right here Picture number 2 is literally a diagram of the cages being removed
|
# ? Sep 14, 2020 04:18 |
|
All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me. BIG HEADLINE posted:Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet. At least get a UPS just so you won't get turbofucked out of your work when your area suffers a massive power outage for the second time in a week due to a substation screwup.
|
# ? Sep 14, 2020 06:32 |
|
90s Solo Cup posted:All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me. It 100% is IMO.
|
# ? Sep 14, 2020 06:38 |
|
Martian Manfucker posted:Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 TI? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away. Are you sure the displays are both actually running at 144hz? A couple weeks ago I found that my 144hz monitor seemed to be capped at 60hz and I couldn't figure out why. Nvidia control panel showed the right refresh rate, Windows display settings showed the right refresh rate, games were set to the right refresh rate, but I was still capped at 60hz. I finally checked in Device Manager and it had decided I had a generic PnP monitor running at 60hz. Fixed the refresh rate there and everything started working normally again. It might be worth double-checking just to be sure.
|
# ? Sep 14, 2020 06:38 |
|
90s Solo Cup posted:All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me. 3070=2080ti basically. So yes.
|
# ? Sep 14, 2020 06:44 |
|
Looks like the Arm purchase is confirmed:quote:Update: SoftBank has agreed to sell Arm Holdings to US chip company Nvidia for $40 billion, ending four years of ownership as the Japanese technology group shifts towards becoming a global investment and asset management powerhouse. There might be some requirements from the UK side but it'll probably go through.
|
# ? Sep 14, 2020 08:49 |
|
Gamers Nexus upgraded their testing methodology for GPU tests, and it will be used with the 3080. Should be interesting to see! https://www.youtube.com/watch?v=-P7-ML-bPCE
|
# ? Sep 14, 2020 09:30 |
|
8-bit Miniboss posted:Gamers Nexus upgraded their testing methodology for GPU tests and will be used with the 3080. Should be interesting to see! Yeah this is a good video. Very informative.
|
# ? Sep 14, 2020 09:47 |
|
Do we have any reasonable guesses to go off as to when Nvidia will stop selling founder edition 3080s? I don't mind waiting, but I'd be sad if they spike interest with the FE versions and then never put them back in stock after the 17th, to make more profit off the third-party boards. The UK store still has 2000 series in stock, but that seems of little predictive value.
|
# ? Sep 14, 2020 11:26 |
|
My case is an NZXT H200 mini ITX, I'm guessing pretty much the only 3080 I'll be able to fit in there is the FE? And I have zero chance of getting a 3090 in there?
|
# ? Sep 14, 2020 11:48 |
|
Bad Parenting posted:My case is an NZXT H200 mini ITX, I'm guessing pretty much the only 3080 I'll be able to fit in there is the FE? And I have zero chance of getting a 3090 in there? The FE 3090 is 12.3 inches (313mm) and your case lists a max GPU length of 325mm. E: having said that the 3090 is kinda tall so I would definitely check more before buying one for a small case.
|
# ? Sep 14, 2020 11:52 |
|
AirRaid posted:The FE 3090 is 12.3 inches (313mm) and your case lists a max GPU length of 325. The length or the height aren't the main problems there, more that the H200 only has two PCIe slots and the FE is a 3 slot card
|
# ? Sep 14, 2020 11:57 |
|
Llamadeus posted:The length or the height aren't the main problems there, more that the H200 only has two PCIe slots and the FE is a 3 slot card Yeah, it's the 'thickness' of the card that's the issue; length is not too bad as stated above. It's just that there's only 2 slots on the board and not much of a gap between those slots and the power supply shroud, so a lot of the AIB cards that state their size as 2.5 slots are probably not gonna fit either, as far as I can tell
|
# ? Sep 14, 2020 12:05 |
|
Cefte posted:Do we have any reasonable guesses to go off as to when Nvidia will stop selling founder edition 3080s? I don't mind waiting, but I'd be sad if they spike interest with the FE versions and then never put them back in stock after the 17th, to make more profit off the third-party boards. The UK store still has 2000 series in stock, but that seems of little predictive value. Regions with direct NVIDIA sales should continue to sell founders editions. Regions where NVIDIA isn't selling direct and allocations are being bought by retailers are anywhere from uncertain to guaranteed to stop, depending on the retailer.
|
# ? Sep 14, 2020 12:07 |
Are third-party boards launching on the 17th too? No idea how that normally works.
|
|
# ? Sep 14, 2020 12:45 |
|
|
|
BurritoJustice posted:Regions with direct NVIDIA sales should continue to sell founders editions. Regions where NVIDIA isn't selling direct and allocations are being bought by retailers are uncertain to guaranteed to stop, depending.
|
# ? Sep 14, 2020 12:46 |