Look Sir Droids
Jan 27, 2015

The tracks go off in this direction.

redreader posted:

Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock.

I was weighing a 1660 vs. a 3070, but if this is true it will solve my cost/performance dilemma.


https://www.google.com/amp/s/www.pcgamer.com/amp/nvidia-rtx-3060-rumours-suggest-2080-performance-for-2060-cash/

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Howard Phillips posted:

What does this mean? Does Arm own the foundries or just design architecture and license it?

ARM the company doesn't own any foundries. They own a buttload of IP and research experience, though, and licensing deals out the wazoo.

Rinkles posted:

is this likely to get approved w/o issue

Probably. It's a UK company being bought by a US one, so none of the usual China-involved worries. It doesn't create a monopoly on anything, either. So unless Trump wants to stick his dick in it for some unknowable reason, yeah, it'll probably go through alright.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Sphyre posted:

I don’t see the problem here. For example as we’ve gone from 480p to 1080p to 4K, the frame rate of movies has also had to increase correspondingly, from 23fps to

There needs to be some kind of disclosure law when filmmakers try to sneak in one or two scenes of double-framerate. There's been a couple of movies like Gangster Squad and Rogue One, not good movies in the first place, but anyway they are 24fps almost the whole time, then BAM, one scene switches to 48fps and I'm actually getting ill. It's super jarring. Mercifully I haven't seen it lately, so maybe they got the hint.

CaptainSarcastic
Jul 6, 2013



Craptacular! posted:

Userbenchmark is bad, unless you'd rather buy an i3 over Threadripper. I wouldn't use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don't know.

His relentless hate-on for AMD is hilarious. How many platforms has he been banned from at this point?

CaptainSarcastic fucked around with this message at 02:22 on Sep 14, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Zero VGS posted:

There needs to be some kind of disclosure law when filmmakers try to sneak in one or two scenes of double-framerate. There's been a couple of movies like Gangster Squad and Rogue One, not good movies in the first place, but anyway they are 24fps almost the whole time, then BAM, one scene switches to 48fps and I'm actually getting ill. It's super jarring. Mercifully I haven't seen it lately, so maybe they got the hint.

sorry for the derail, but which scenes were that in Rogue One? I don't think I ever noticed.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

Userbenchmark is bad, unless you'd rather buy an i3 over Threadripper. I wouldn't use a card with 4GB memory now, even if a strong performer of its generation. I was just trying to solve your problem today, since you might not get a 30-series until next year depending on things we don't know.

there's no spin with their GPU and (especially) their SSD benchmarks. If you want to see how a GTX 680 compares to a GT 1030, or how a GTX 750 compares to an Iris Pro, there's no actual reviewer who's ever going to run a benchmark of that, so you take what you can get. It's pretty ballpark accurate.

(again, remember you were responding to a post using the GPU section, not the CPU section. Bit of a non sequitur.)

also, I'm just going to say it, the 1950X loving sucked, it had even more problems than the other first-gen Ryzen poo poo, it looked great in Cinebench and it ran like poo poo in actual programs or (especially) gaming. It was NUMA on a package, it had all kinds of latency problems, it had half-rate AVX2, it was not a great processor. AMD's technique for gluing dies together got way better with Zen2.

depending on what you were doing, an 8350K may well have come out on top in consumer applications. The 1950X obviously did better in Cinebench or other parallel poo poo that didn't care about latency/etc, though.

Paul MaudDib fucked around with this message at 02:29 on Sep 14, 2020

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

gradenko_2000 posted:

sorry for the derail, but which scenes were that in Rogue One? I don't think I ever noticed.

I am 90% sure the CGI flyover of the lava planet (Mustafar) was double framerate, though I don't think you can see it in this YouTube clip:

https://www.youtube.com/watch?v=smYFSWHTg8Y

I felt the same way watching it as I did with the fist-fight scene at the end of Gangster Squad, and a lot of other people caught and commented on that one at the time.

Martian Manfucker
Dec 27, 2012

misandry is real
Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 TI? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away.

Did a bit of googling and this seemed to be a problem with nvidia cards and 2+ monitors running at different resolutions/hz a couple years ago, but it's happening to me with two monitors that are the same. In any case I downloaded nvidia profile inspector which had a multi display power saver feature that I've enabled and it's working pretty great and I haven't noticed any problems at all. The card idles at 300/405MHz sometimes. If I run a game and the card clocks up to gaming speeds, it doesn't drop back down to 300/405MHz anymore, instead sitting around 600/405 on the desktop until I shut a monitor off and turn it back on.

Is there another solution to this or is this janky utility something I've just got to live with?
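
For anyone else poking at the same thing: a quick and dirty way to watch what the card is actually doing is to poll nvidia-smi from a terminal (it ships with the GeForce driver). This is just a rough sketch, not a fix, but it makes it obvious whether the clocks ever drop back to 300/405MHz after a game:

code:
import subprocess, time

# Poll the current graphics/memory clocks, temperature, and fan speed every
# couple of seconds, so you can watch whether the card ever settles at idle.
QUERY = "clocks.gr,clocks.mem,temperature.gpu,fan.speed"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "300 MHz, 405 MHz, 40, 0 %"
    time.sleep(2)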

Craptacular!
Jul 9, 2001

Fuck the DH

redreader posted:

Right, fair enough! Yeah I'll see how this pans out. If all goes well, I'll manage to get a 3070 from nowinstock.

Okay. For reference, the 980ti is equivalent to a 1070, and the 1660 is about 15% better than that, so it slots in just under the 1080, and comparisons between the 1080 and the non-ti 980 aren't even in the same ballpark.

Oxxidation
Jul 22, 2007
my case can barely fit its 1080ti so the 3080 is right out, if i even wanted to drop the cash

internet's being frustratingly vague about comparisons between the 1080ti and the 3070 as well

Indiana_Krom
Jun 18, 2007
Net Slacker

Martian Manfucker posted:

Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 TI? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away.

Did a bit of googling and this seemed to be a problem with nvidia cards and 2+ monitors running at different resolutions/hz a couple years ago, but it's happening to me with two monitors that are the same. In any case I downloaded nvidia profile inspector which had a multi display power saver feature that I've enabled and it's working pretty great and I haven't noticed any problems at all. The card idles at 300/405MHz sometimes. If I run a game and the card clocks up to gaming speeds, it doesn't drop back down to 300/405MHz anymore, instead sitting around 600/405 on the desktop until I shut a monitor off and turn it back on.

Is there another solution to this or is this janky utility something I've just got to live with?

Just as a test, drop the secondary monitor down to 60 Hz. Different resolutions and refresh rates haven't been a problem in forever, but multiple high refresh displays might be pushing the display controller hard enough that it doesn't want to clock down.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

Oxxidation posted:

my case can barely fit its 1080ti so the 3080 is right out, if i even wanted to drop the cash

internet's being frustratingly vague about comparisons between the 1080ti and the 3070 as well

They are claiming it to be equal to the 2080ti so there's your ballpark

Martian Manfucker
Dec 27, 2012

misandry is real

Indiana_Krom posted:

Just as a test, drop the secondary monitor down to 60 Hz. Different resolutions and refresh rates haven't been a problem in forever, but multiple high refresh displays might be pushing the display controller hard enough that it doesn't want to clock down.

Just gave this a shot and there was no change. Thanks for the suggestion, though.

Oxxidation
Jul 22, 2007

Scarecow posted:

They are claiming it to be equal to the 2080ti so there's your ballpark

ha, looked into it further and found that my motherboard isn't new enough to handle the 3000 models either, all its connections are pci 3.0

so much for that, then

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
PCI-E is backwards compatible. Intel doesn't even have PCI-E 4.0 support yet. Depending on the game, it's anywhere from no difference to a small one - something to mildly care about if you were doing a new build, maybe, but it shouldn't be the reason you decide whether or not to buy a video card.

spunkshui
Oct 5, 2011



Martian Manfucker posted:

Just gave this a shot and there was no change. Thanks for the suggestion, though.

Use MSI afterburner to make the fans wait till 50C to fire up?

More of a bandaid than a fix tho.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum

Craptacular! posted:

Okay. For reference, the 980ti is equivalent to a 1070, and the 1660 is about 15% better than that, so it slots in just under the 1080, and comparisons between the 1080 and the non-ti 980 aren't even in the same ballpark.

Lol holy poo poo. Well, I can only imagine how much better it'll be with a 3070! Can't stress how much I'm looking forward to this.

Pilfered Pallbearers
Aug 2, 2007

Oxxidation posted:

ha, looked into it further and found that my motherboard isn't new enough to handle the 3000 models either, all its connections are pci 3.0

so much for that, then

The cards support PCI-e 4.0, but work fine on PCI-e 3.0. Current expectations are that you’re probably looking at MAYBE a 5% drop in performance from 4.0 to 3.0, if there’s any at all.

Also the 3080 FE is 285mm. Depending on your 1080Ti it might actually fit without issue.

Oxxidation
Jul 22, 2007

Kingnothing posted:

The cards support PCI-e 4.0, but work fine on PCI-e 3.0. Current expectations are that you’re probably looking at MAYBE a 5% drop in performance from 4.0 to 3.0, if there’s any at all.

Also the 3080 FE is 285mm. Depending on your 1080Ti it might actually fit without issue.

nah, the 1080ti is 267 mm and i have maybe 3 or 4 mm in clearance between the card and the drive cages. i asked a family friend to assemble this thing, it must have been a pain in the rear end to fit

good to know about the pci connectors, though

Truga
May 4, 2014
Lipstick Apathy

Martian Manfucker posted:

Just gave this a shot and there was no change. Thanks for the suggestion, though.

you'll probably want "multi display power saver", it's a part of the nvidia inspector app

a friend of mine runs a 144hz+60hz setup, and without forcing the clocks manually with that app it stays at high frequency constantly

repiv
Aug 13, 2009

Oxxidation posted:

nah, the 1080ti is 267 mm and i have maybe 3 or 4 mm in clearance between the card and the drive cages. i asked a family friend to assemble this thing, it must have been a pain in the rear end to fit

good to know about the pci connectors, though

Do you know what case it is? Some of them are designed so the drive cage can be moved to another position.

Oxxidation
Jul 22, 2007

repiv posted:

Do you know what case it is? Some of them are designed so the drive cage can be moved to another position.

it's a be quiet! 800 silent base, this guy right here

i really, really like this case

e: oh hey i checked a youtube video and it looks like they are removable, this warrants further study

Oxxidation fucked around with this message at 03:47 on Sep 14, 2020

repiv
Aug 13, 2009

Yeah the drive cage is split into two sections, the top one comes out to make room for longer graphics cards

Just about any card should fit with that removed, and you still get four HDD slots with just the lower section

repiv fucked around with this message at 03:54 on Sep 14, 2020

Pilfered Pallbearers
Aug 2, 2007

Oxxidation posted:

it's a be quiet! 800 silent base, this guy right here

i really, really like this case

e: oh hey i checked a youtube video and it looks like they are removable, this warrants further study

You sure about the size of your GPU?

Their documentation sucks or is difficult to find, but according to pretty much every site I looked at, GPU clearance WITH the drive cage installed is 290mm (400mm with it out)

https://www.gamersnexus.net/hwreviews/1930-be-quiet-silent-base-800-review

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Oxxidation posted:

it's a be quiet! 800 silent base, this guy right here

i really, really like this case

e: oh hey i checked a youtube video and it looks like they are removable, this warrants further study

Picture number 2 is literally a diagram of the cages being removed

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me.

BIG HEADLINE posted:

Idiots can't be trusted to not plug a GPU wall wart into an un-UPSed or non-surge protected outlet.

People might spend $3000 on a ~bitchin' rig~, but they won't spend $150 on a decent 1000-1500VA UPS or even $50 on a pro-grade surge protector.

At least get a UPS just so you won't get turbofucked out of your work when your area suffers a massive power outage for the second time in a week due to a substation screwup.

MarcusSA
Sep 23, 2007

90s Solo Cup posted:

All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me.


It 100% is IMO.

CaptainSarcastic
Jul 6, 2013



Martian Manfucker posted:

Can someone explain the weirdness I'm seeing happen with regards to idle clock speed on my 1660 TI? I recently got two 1440p/144hz FreeSync displays, the LG 27GL83A-B main and a ViewSonic VX2758-2KP-MHD secondary and hooked them up with DP and HDMI respectively. Both are running at 1440p/144hz with FreeSync enabled. This is all great and the jump from 60 to 144hz was pretty incredible, but my GPU refused to downclock when idling and was sitting at 1400MHz/6000MHz~ just staring at my desktop wallpaper. Normally I wouldn't notice something like this, but the difference between 45c and 40c means that the fan isn't turning off and I can hear it whirring away.

Did a bit of googling and this seemed to be a problem with nvidia cards and 2+ monitors running at different resolutions/hz a couple years ago, but it's happening to me with two monitors that are the same. In any case I downloaded nvidia profile inspector which had a multi display power saver feature that I've enabled and it's working pretty great and I haven't noticed any problems at all. The card idles at 300/405MHz sometimes. If I run a game and the card clocks up to gaming speeds, it doesn't drop back down to 300/405MHz anymore, instead sitting around 600/405 on the desktop until I shut a monitor off and turn it back on.

Is there another solution to this or is this janky utility something I've just got to live with?

Are you sure the displays are both actually running at 144hz?

A couple weeks ago I found that my 144hz monitor seemed to be capped at 60hz and I couldn't figure out why. Nvidia control panel showed the right refresh rate, Windows display settings showed the right refresh rate, games were set to the right refresh rate, but I was still capped at 60hz. I finally checked in Device Manager and it had decided I had a generic PnP monitor running at 60hz. Fixed the refresh rate there and everything started working normally again. It might be worth double-checking just to be sure.
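
If you want a quick way to check outside of the various control panels, here's a rough sketch (Windows only, and it only reads the primary display) that asks GDI what refresh rate the desktop is actually being driven at:

code:
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

VREFRESH = 116  # GetDeviceCaps index for the current vertical refresh rate

hdc = user32.GetDC(0)                    # device context for the primary display
hz = gdi32.GetDeviceCaps(hdc, VREFRESH)  # what Windows is actually driving, in Hz
user32.ReleaseDC(0, hdc)

print(f"Primary display is running at {hz} Hz")

If that prints 60 while everything else says 144, something in the chain is lying to you.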

Pilfered Pallbearers
Aug 2, 2007

90s Solo Cup posted:

All I want to know is if it's worthwhile to jump from a 1070Ti to a 3070. Paying north of $600 for any vidya card is something I still don't feel comfortable with, which takes the 3080 right out of contention for me.


At least get a UPS just so you won't get turbofucked out of your work when your area suffers a massive power outage for the second time in a week due to a substation screwup.

3070=2080ti basically.

So yes.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Looks like the Arm purchase is confirmed:

quote:

Update: SoftBank has agreed to sell Arm Holdings to US chip company Nvidia for $40 billion, ending four years of ownership as the Japanese technology group shifts towards becoming a global investment and asset management powerhouse.

The UK chip designer is the latest large asset disposal orchestrated by SoftBank founder Masayoshi Son as his newly built war chest opens up options for the group including an expansion of trading into publicly listed technology stocks and a potential delisting of its own shares.

Under the deal, SoftBank will become the largest shareholder in Nvidia, which will pay the Japanese group $21.5 billion in common stock and $12 billion in cash. “We look forward to supporting the continued success of the combined business,” Mr Son said in a joint statement late on Sunday.
https://arstechnica.com/gadgets/2020/09/nvidia-reportedly-to-acquire-arm-holdings-from-softbank-for-40-billion/

There might be some requirements from the UK side but it'll probably go through.

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:
Gamers Nexus upgraded their testing methodology for GPU tests, and it will be used with the 3080. Should be interesting to see!

https://www.youtube.com/watch?v=-P7-ML-bPCE

MarcusSA
Sep 23, 2007

8-bit Miniboss posted:

Gamers Nexus upgraded their testing methodology for GPU tests, and it will be used with the 3080. Should be interesting to see!

https://www.youtube.com/watch?v=-P7-ML-bPCE

Yeah this is a good video.

Very informative.

Cefte
Sep 18, 2004

tranquil consciousness
Do we have any reasonable guesses to go off as to when Nvidia will stop selling founder edition 3080s? I don't mind waiting, but I'd be sad if they spike interest with the FE versions and then never put them back in stock after the 17th, to make more profit off the third-party boards. The UK store still has 2000 series in stock, but that seems of little predictive value.

Bad Parenting
Mar 26, 2007

This could get emotional...


My case is an NZXT H200 mini ITX, I'm guessing pretty much the only 3080 I'll be able to fit in there is the FE? And I have zero chance of getting a 3090 in there?

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Bad Parenting posted:

My case is an NZXT H200 mini ITX, I'm guessing pretty much the only 3080 I'll be able to fit in there is the FE? And I have zero chance of getting a 3090 in there?

The FE 3090 is 12.3 inches (313mm) and your case lists a max GPU length of 325mm.

E: having said that the 3090 is kinda tall so I would definitely check more before buying one for a small case.

Llamadeus
Dec 20, 2005

AirRaid posted:

The FE 3090 is 12.3 inches (313mm) and your case lists a max GPU length of 325mm.

E: having said that the 3090 is kinda tall so I would definitely check more before buying one for a small case.

The length or the height aren't the main problems there, more that the H200 only has two PCIe slots and the FE is a 3 slot card :v:

Bad Parenting
Mar 26, 2007

This could get emotional...


Llamadeus posted:

The length or the height aren't the main problems there, more that the H200 only has two PCIe slots and the FE is a 3 slot card :v:

Yeah, it's the 'thickness' of the card that is the issue. Length is not too bad, as stated above; it's just that there are only 2 slots on the board and there's not much of a gap between those slots and the power supply shroud, so a lot of the AIB cards that state their size as 2.5 slots are probably not gonna fit either, as far as I can tell

BurritoJustice
Oct 9, 2012

Cefte posted:

Do we have any reasonable guesses to go off as to when Nvidia will stop selling founder edition 3080s? I don't mind waiting, but I'd be sad if they spike interest with the FE versions and then never put them back in stock after the 17th, to make more profit off the third-party boards. The UK store still has 2000 series in stock, but that seems of little predictive value.

Regions with direct NVIDIA sales should continue to sell founders editions. Regions where NVIDIA isn't selling direct and allocations are being bought by retailers are anywhere from uncertain to guaranteed to stop, depending.

Drone
Aug 22, 2003

Incredible machine
:smug:


Are third-party boards launching on the 17th too? No idea how that normally works.

Cefte
Sep 18, 2004

tranquil consciousness

BurritoJustice posted:

Regions with direct NVIDIA sales should continue to sell founders editions. Regions where NVIDIA isn't selling direct and allocations are being bought by retailers are anywhere from uncertain to guaranteed to stop, depending.

Makes sense, thanks!
