Naffer
Oct 26, 2004

Not a good chemist

FaustianQ posted:

Never seen a low profile GTX 460?

Missed that bit.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Sorry for nearly spamming the thread at this point, but asking for a friend whose budget is really low and who's eBaying a card: low profile R5 240 vs HD7570? Replacing their current setup isn't an option, and they're running an HD5450 so literally anything is better (it's chugging on 1080p and flash games, let alone the handful of Steam games they play), but I'm iffy on recommending an R5 230 (or any VLIW card) or GT 720, though they'll get more oomph out of the 7570 compared to the R5 240. Budget is $40.

Does it need to be low-profile? Could they get by on integrated graphics instead?

The GT 720 is Kepler, which is almost 3 generations old at this point but it's still officially supported and has H264 decode and all that jazz. The 7570 is actually a VLIW chip too - only the 7700 series and above got GCN chips. I would NOT recommend VLIW at this point.

I would take a look at the GT 640 too. It's another Kepler like the 720, but it has twice as many cores, so it will be faster. Be careful what you buy - many of these low-end models are VERY confusing. For example, different models of the GT 730 range from a 96-core DDR3 Fermi to a 384-core GDDR5 Kepler, and that's a massive difference in performance.
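
For reference, the GT 730 variants floating around look roughly like this (a sketch from memory with approximate specs - double-check the box before buying):

code:
# Known GT 730 variants (approximate specs, from memory - verify before buying).
gt730_variants = [
    {"gpu": "GF108 (Fermi)",  "cores": 96,  "memory": "DDR3",  "bus_bits": 128},
    {"gpu": "GK208 (Kepler)", "cores": 384, "memory": "DDR3",  "bus_bits": 64},
    {"gpu": "GK208 (Kepler)", "cores": 384, "memory": "GDDR5", "bus_bits": 64},
]

for v in gt730_variants:
    print(f"GT 730: {v['cores']:>3} cores, {v['memory']:>5} on a "
          f"{v['bus_bits']}-bit bus ({v['gpu']})")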

I have a used Zotac Zone Edition 2GB DDR3 GT 640 (384 Kepler cores) that I need to get rid of. It's a passively-cooled 2-slot model that's slightly taller than normal. I'd do $40 + shipping if they're interested in that. I also have an IceQ X TurboX 7850 2GB that I would do for $80 + shipping.

Be aware that your friend is shopping in the extreme shallow end of the market. $40 is basically "literally anything that will fit in the slot and has active driver support" money. Every marginal dollar they are willing to spend will pay outsized returns in performance. A $60 card instead of a $40 card buys a lot more than 50% extra performance - that gets you up to a 7770, which is more than twice as fast as a GT 640. Going to $75 gets you into 7850 territory, which is more than half again as fast as a 7770, or about three times as fast as a GT 640. The 7850 or 750 Ti is really where they need to be for low-end 1080p gaming; everything below that will absolutely chug.
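
To put that scaling in numbers, here's a rough sketch - the relative performance multiples are just the ballpark figures from the paragraph above, not benchmarks:

code:
# Rough performance-per-dollar at the bottom of the market, using the ballpark
# multiples quoted above (GT 640 = 1.0x baseline). Illustrative only.
cards = {
    "GT 640":  {"price": 40, "relative_perf": 1.0},
    "HD 7770": {"price": 60, "relative_perf": 2.2},  # "more than twice as fast"
    "HD 7850": {"price": 75, "relative_perf": 3.0},  # "about three times as fast"
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"]
    print(f"{name}: {c['relative_perf']:.1f}x performance for ${c['price']}"
          f" -> {perf_per_dollar:.3f} perf per dollar")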

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Never seen a low profile GTX 460?

Yeah, this wasn't clear. The only recent low-profile cards on the market are Kepler GT cards, Maxwell 750/750 Ti cards, or the R5 240 and 7750/R7 250/7770/R7 250X. Maaaaybe a few specialty GTX 950s. Anything older than that is out of driver support.

It's unfortunately quite difficult to find a workable upgrade for beige-box desktop PCs. There's the low-profile problem, and some of them don't like multi-slot coolers or don't have enough power to drive much of a GPU either. Maybe sink some money into an FM2 motherboard or a new case instead. :shrug:

Paul MaudDib fucked around with this message at 17:44 on Apr 22, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
It's an old quadcore Phenom II 820 running on HD4200 integrated graphics, so dropping the HD5450 for the IGP would actually make things worse.

Oh, I know the HD 7570 is VLIW - that's why I cautioned against it. They're not particularly concerned with future support, since they want a new computer next year and just need something to hold them over until then; I think they're going low-profile mini ITX or just a Zbox. I'm cautioning against spending, but then I'm also telling someone to not enjoy their computer for little more than a year, so :shrug: :sigh:

wolrah
May 8, 2006
what?
e: inadvertently responded to a topic from pages ago, not worth bringing it back up.

Anime Schoolgirl
Nov 28, 2002

in december last year i picked up a refurb single slot low profile 7750 for $49 and it still plays everything at 1600x900, see if there are any more of those around I guess

Salt Fish
Sep 11, 2003

Cybernetic Crumb
Do those grounding cards actually work? If my 980ti is running even mediumish hard it will produce enough EMI to affect guitar signals from 4-5 feet away.

Gonkish
May 19, 2004

xthetenth posted:

Price/Performance is usually reasonably close. What's the FreeSync range on that screen?

I'm not entirely sure. It's an LG 29UM67P. All I've managed to garner in my googling is that you need to use DP for FreeSync to do its thing? I guess?

EDIT: After further googling, the range is 48-75fps.

Gonkish fucked around with this message at 18:41 on Apr 22, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

It's an old quadcore Phenom II 820 running on HD4200 integrated graphics, so dropping the HD5450 for the IGP would actually make things worse.

Oh, I know the HD 7570 is VLIW - that's why I cautioned against it. They're not particularly concerned with future support, since they want a new computer next year and just need something to hold them over until then; I think they're going low-profile mini ITX or just a Zbox. I'm cautioning against spending, but then I'm also telling someone to not enjoy their computer for little more than a year, so :shrug: :sigh:

Seconding the 7750 for $50 idea if you can get one that'll fit for that price. Be sure to check profile, slot thickness, and card height.

I know I'm preaching to the choir here, but the price scale for computer parts is not linear. It costs $40 just to get something that plugs into the slot and isn't end-of-life'd. Running his 5450 until he can replace it is the best idea; otherwise he'll be throwing away $50+ on a part he doesn't care about. Even if he resells it, he'll still be out at least $20-25 of that $50. He could just spend that $50 on a case that will actually fit a decent GPU instead, and then incrementally upgrade.

wargames
Mar 16, 2008

official yospos cat censor

Gonkish posted:

I'm not entirely sure. It's an LG 29UM67P. All I've managed to garner in my googling is that you need to use DP for FreeSync to do its thing? I guess?

EDIT: After further googling, the range is 48-75fps.

Could go bigger for cheaper

http://www.newegg.com/Product/Product.aspx?Item=0JC-000D-003Z3R

http://www.newegg.com/Product/Product.aspx?Item=0JC-000D-003Z3

Anime Schoolgirl
Nov 28, 2002

http://videocardz.com/59266/nvidia-pascal-gp104-gpu-pictured-up-close
over/under on a $400 SKU out of the gate?

note: firesale prices on the 980ti are related to this :toot:

Gonkish
May 19, 2004


I grabbed this monitor at the beginning of the month for $270 (newegg had a flash sale of sorts), but poo poo those are still good deals if anyone else is looking.

Meanwhile, I guess I'm waiting on the new AMD lineup to get pushed out so I can take advantage of FreeSync.

wargames
Mar 16, 2008

official yospos cat censor

Gonkish posted:

I grabbed this monitor at the beginning of the month for $270 (newegg had a flash sale of sorts), but poo poo those are still good deals if anyone else is looking.

Meanwhile, I guess I'm waiting on the new AMD lineup to get pushed out so I can take advantage of FreeSync.

What do you think of the 34" 1080p ultrawide?

Gonkish
May 19, 2004

wargames posted:

What do you think of the 34" 1080p ultrawide?

Well, mine is the 29" model (still 2560x1080), but it's beautiful. IPS is, of course, amazingly beautiful, and the response time (5ms) is low enough that I haven't noticed any ghosting. The biggest difference is the FOV in games, which is one of those things you don't think about until you see it. Most of the games I've come across handle it without any shenanigans, though some will require you to dick around with .ini files and such (FO4 does this). All in all, it's really loving pretty and I love it.

penus penus penus
Nov 9, 2014

by piss__donald

Anime Schoolgirl posted:

http://videocardz.com/59266/nvidia-pascal-gp104-gpu-pictured-up-close
over/under on a $400 SKU out of the gate?

note: firesale prices on the 980ti are related to this :toot:

nice

wargames
Mar 16, 2008

official yospos cat censor

Gonkish posted:

Well, mine is the 29" model (still 2560x1080), but it's beautiful. IPS is, of course, amazingly beautiful, and the response time (5ms) is low enough that I haven't noticed any ghosting. The biggest difference is the FOV in games, which is one of those things you don't think about until you see it. Most of the games I've come across handle it without any shenanigans, though some will require you to dick around with .ini files and such (FO4 does this). All in all, it's really loving pretty and I love it.

Those 34s are still 2560x1080; it's the UM97 models that are the cool 1440p ones.

wargames fucked around with this message at 20:11 on Apr 22, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
Jesus, AMD stock up 40% on a single day's news? :stare:

craig588
Nov 19, 2005

by Nyc_Tattoo

Salt Fish posted:

Do those grounding cards actually work? If my 980ti is running even mediumish hard it will produce enough EMI to affect guitar signals from 4-5 feet away.

Yes, they "work", but not nearly enough to have a meaningful impact. If you can get one for like 10 dollars or something insignificant, sure, throw it in and try it, but very likely you'd need a whole Faraday cage for a significant reduction. A case designed around EMI suppression would have a much greater impact.

Gonkish
May 19, 2004

wargames posted:

Those 34s are still 2560x1080; it's the UM97 models that are the cool 1440p ones.

Oh poo poo. Yeah I didn't notice that. Welp. It's really pretty and I like it a lot. Considering I got the 29" for $270 shipped (it had $0.99 shipping), it was a steal. Really beautiful monitor and the aspect ratio is really nice in a lot of games (I notice it especially in FPS, flight sims, and racing/driving games).

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

SwissArmyDruid posted:

Jesus, AMD stock up 40% on a single day's news? :stare:

I'm so pissed. I'd been meaning to get into it.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Same, but I don't see why you couldn't still get into it now. I'm eyeing my brokerage account and weighing the $3000 or so in the hand (metaphorically speaking). This would be a long-term investment.

repiv
Aug 13, 2009

Anime Schoolgirl posted:

http://videocardz.com/59266/nvidia-pascal-gp104-gpu-pictured-up-close
over/under on a $400 SKU out of the gate?

note: firesale prices on the 980ti are related to this :toot:

Yeah, this kills the 980ti. From the GP100 specs we know uncut Pascal is 18.6 SP-GFLOPS/mm2, so uncut GP104 is neck and neck with uncut GM200 at about 6200 SP-GFLOPS.
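
Back-of-the-envelope version of that, as a sketch: the 18.6 SP-GFLOPS/mm^2 density is the figure above, the ~330mm^2 GP104 die size is the rumored number (an assumption, not confirmed), and GM200 is taken at roughly reference Titan X clocks:

code:
# Back-of-the-envelope GP104 estimate from GP100's SP density (illustrative).
GP100_SP_DENSITY = 18.6   # SP-GFLOPS per mm^2, the figure quoted above
GP104_DIE_MM2 = 330       # rumored die size - an assumption, not confirmed

gp104_est_gflops = GP100_SP_DENSITY * GP104_DIE_MM2  # ~6,100 SP-GFLOPS

# Uncut GM200 (Titan X) for comparison: 3072 cores x 2 FLOPS/clock x ~1.0 GHz.
gm200_gflops = 3072 * 2 * 1.0  # clock in GHz gives GFLOPS directly

print(f"GP104 estimate: ~{gp104_est_gflops:.0f} SP-GFLOPS, "
      f"uncut GM200: ~{gm200_gflops:.0f} SP-GFLOPS")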

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Gonkish posted:

I grabbed this monitor at the beginning of the month for $270 (newegg had a flash sale of sorts), but poo poo those are still good deals if anyone else is looking.

Meanwhile, I guess I'm waiting on the new AMD lineup to get pushed out so I can take advantage of FreeSync.

I was going to say, I got it like 4 months ago for $300 new from eBay (I think it was being sold through PCRush). That $400+ price on Newegg right now is a ripoff. At $300 or less it's the best overall value in FreeSync monitors, which is why I got it.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

SwissArmyDruid posted:

Same, but I don't see why you couldn't still get into it now. I'm eyeing my brokerage account and weighing the $3000 or so in the hand (metaphorically speaking). This would be a long-term investment.

Yeah, I was looking at it as a bet on Polaris, Zen, consoles and Boltzmann being enough to get them back to profitable because it was priced low enough that winning that bet means I'd come out ahead, and I still think that's a valid analysis, but fuuuuuuuuck.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

Yeah, this kills the 980ti. From the GP100 specs we know uncut Pascal is 18.6 SP-GFLOPS/mm2, so uncut GP104 is neck and neck with uncut GM200 at about 6200 SP-GFLOPS.

All of the initial GP104 releases are reportedly going to be die harvests. The big question is whether GP104 has any DP capability, because leaving DP off could push the SP-GFLOPS/mm^2 rating above GP100's.

If the 1080 is only almost as fast as the 980 Ti, then I actually think that's not a particularly compelling product - that's basically just 980 performance. NVIDIA has to make some forward movement before I'll upgrade; if that's all they've got, I guess I'll keep saving for the 1080 Ti or Vega or something.

I guess the speculation that putting DP units back on the die would kill Pascal's performance was correct...

Paul MaudDib fucked around with this message at 21:35 on Apr 22, 2016

EdEddnEddy
Apr 5, 2012



Paul MaudDib posted:

All of the initial GP104 releases are reportedly going to be die harvests. The big question is whether GP104 has any DP capability, because leaving DP off could push the SP-GFLOPS/mm^2 rating above GP100's.

If the 1080 is only almost as fast as the 980 Ti, then I actually think that's not a particularly compelling product - that's basically just 980 performance. NVIDIA has to make some forward movement before I'll upgrade; if that's all they've got, I guess I'll keep saving for the 1080 Ti or Vega or something.

I guess the speculation that putting DP units back on the die would kill Pascal's performance was correct...

Considering the move to GDDR5X, I would imagine it's the 1070, rather than the 1080, that will land closer to 980 Ti speeds. However, until we actually see some numbers, we are all pulling hopes and dreams out of the Unicorn's rear end here.

If, after 2 years of Maxwell, Nvidia can't make a new generation of GPUs faster than the previous one, then something would be seriously wrong.

SlayVus
Jul 10, 2009
Grimey Drawer

EdEddnEddy posted:

Considering the move to GDDR5X, I would imagine it's the 1070, rather than the 1080, that will land closer to 980 Ti speeds. However, until we actually see some numbers, we are all pulling hopes and dreams out of the Unicorn's rear end here.

If, after 2 years of Maxwell, Nvidia can't make a new generation of GPUs faster than the previous one, then something would be seriously wrong.

I can't imagine that VRAM speeds have much to do with performance now. Sure, you can squeeze out 1-3 fps by increasing memory speeds, but it's not much when your core clock can give 5-15.

Anime Schoolgirl
Nov 28, 2002

memory bandwidth is actually a huge bottleneck for gm200 cards at 1440p and higher
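
for a rough sense of scale, the bandwidth arithmetic looks like this (quick sketch - the 384-bit / 7 GT/s figures are the commonly quoted reference specs for a 980 Ti, and the 8 GT/s number is just a hypothetical memory OC):

code:
# Rough GDDR5 bandwidth arithmetic for GM200 (980 Ti / Titan X class).
def bandwidth_gb_s(bus_width_bits, effective_rate_gt_s):
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_rate_gt_s

stock = bandwidth_gb_s(384, 7.0)        # ~336 GB/s at reference 980 Ti clocks
overclocked = bandwidth_gb_s(384, 8.0)  # hypothetical memory OC to 8 GT/s effective

print(f"stock: {stock:.0f} GB/s, OC: {overclocked:.0f} GB/s "
      f"(+{(overclocked / stock - 1) * 100:.0f}%)")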

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

EdEddnEddy posted:

Considering the move to GDDR5X, I would imagine it's the 1070, rather than the 1080, that will land closer to 980 Ti speeds. However, until we actually see some numbers, we are all pulling hopes and dreams out of the Unicorn's rear end here.

If, after 2 years of Maxwell, Nvidia can't make a new generation of GPUs faster than the previous one, then something would be seriously wrong.

Yeah, we'll just have to wait and see. I think they will have to get the 1070 at least close to 980 Ti speeds, and the 1080 somewhat faster than the 980 Ti, out of pure marketing necessity. Better compute performance and slightly cooler cards are not going to compel a lot of upgrades by themselves.

But this has always been the question - will they be putting DP back on-chip with consumer Pascal GPUs, and if so how much will it hurt them? Like I said three months ago:

Paul MaudDib posted:

Right, if we figure the standard generational improvement of 50% then 1.5^2 equals 2.25x, right in that ballpark.

Kaizinsal's right that the whole thing is a mess because it's been ages since we had a node shrink, interposer-based chips and HBM are coming into play, and the hand has been drastically overplayed on uarch improvements (Maxwell being terrible at scheduling and compute). You could make technically-feasible arguments for anything from zero performance improvement (NVIDIA puts out a 300mm^2 chip and spends the extra transistors on improving scheduling and DP) to 2.5x-3x the performance per die (NVIDIA keeps a gaming-specific chip and releases a 600mm^2 chip right away).

The thing to remember is that historically we don't get the GK100 (actually GK110)-sized chip on the first go-around. If the usual pattern is followed, then we'll be going from a 600mm^2 chip to a 300mm^2 chip. The 1080 will probably be GP104, not GP100.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I'm going to say they did put the DP on it, if only because we know AMD did. Nvidia really doesn't want AMD to start selling workstation cards in numbers, and if they leave out the DP it makes the choice one between Kepler and Polaris - not a good matchup. Nvidia is going to feed those stuck on Kepler this generation; that's what's important to them monetarily. And if AMD gets a consumer gaming win? So what, market small, Volta crush.

Anime Schoolgirl
Nov 28, 2002

even if they didn't put DP they'd still have the hardware scheduler

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Anime Schoolgirl posted:

even if they didn't put DP they'd still have the hardware scheduler

Was this the main reason Maxwell wasn't well received?

EmpyreanFlux fucked around with this message at 23:20 on Apr 22, 2016

penus penus penus
Nov 9, 2014

by piss__donald

Anime Schoolgirl posted:

memory bandwidth is actually a huge bottleneck for gm200 cards at 1440p and higher

It is? I'll have to focus more on my memory OC then, but I didn't notice much at +800 or whatever I left it at.

Anime Schoolgirl
Nov 28, 2002

THE DOG HOUSE posted:

It is? I'll have to focus more on my memory OC then, but I didn't notice much at +800 or whatever I left it at.
it's a problem with games like witcher 3 and shadow of mordor that homph homph on textures; for most games it'll do fine

FaustianQ posted:

Was this the main reason Maxwell wasn't well received?
that's pretty much why it's Bad in DX12

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

FaustianQ posted:

Was this the main reason Maxwell wasn't well received?
Maxwell has been a phenomenal setup so far, as long as you're not trying to do compute with it. Gaming-wise ripping out DP and the scheduler was absolutely the right decision, as it has allowed NVidia to crush AMD for going on two years now. With DX12 and VR looking like they'll benefit from the scheduler being put back in, they might not age as gracefully as some other architectures, but until that stuff becomes commonplace, Maxwell's been brilliant.

Of course if you're a compute customer you're SOL, but then again NVidia doesn't really want compute customers buying a $300 card when they can sell them a $3,000 card instead.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Anime Schoolgirl posted:

that's pretty much why it's Bad in DX12

I figured it was for a different reason that they weren't liked in workstation and HPC, is what I mean. Everyone's anxious for a Kepler replacement, and Nvidia really can't afford AMD sneaking in. I get why Maxwell is a good pure gaming card; I'm just wondering whether a hardware scheduler alone would be competitive vs something with hardware scheduling and DP, and thus whether DP-less Pascal could replace Kepler.

SlayVus
Jul 10, 2009
Grimey Drawer
So if I'm reading this GP100 whitepaper right and am looking at these pictures clearly, the total die size includes the HBM chips as well?

They list the total GP100 accelerator as 140mm x 78mm.

Edit: They list the GPU die size separately. So the GPU size is ~78mm x ~78mm.

SlayVus fucked around with this message at 05:54 on Apr 23, 2016

ItBurns
Jul 24, 2007

SwissArmyDruid posted:

Same, but I don't see why you couldn't still get into it now. I'm eyeing my brokerage account and weighing the $3000 or so in the hand (metaphorically speaking). This would be a long-term investment.

Same but a boat instead of a single stock the day after it goes up 40% on the basis of how good I think its video game toy is.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

FaustianQ posted:

I figured it was for a different reason that they weren't liked in workstation and HPC, is what I mean. Everyone's anxious for a Kepler replacement, and Nvidia really can't afford AMD sneaking in. I get why Maxwell is a good pure gaming card; I'm just wondering whether a hardware scheduler alone would be competitive vs something with hardware scheduling and DP, and thus whether DP-less Pascal could replace Kepler.

Depends on what you're using it for. For pure gaming, DP isn't really a selling point. For compute, though, it's a huge deal for a lot of people. A DP-less Pascal would probably not replace Kepler if it still used a Maxwell-like 1/32 FP64 rate vs Kepler's 1/3.
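
To put rough numbers on that (a sketch with approximate spec-sheet figures, assumed rather than taken from the thread):

code:
# Rough DP throughput from SP throughput and the FP64 ratio (approximate specs).
def dp_tflops(sp_tflops, fp64_ratio):
    """Double-precision throughput from single-precision throughput and FP64 ratio."""
    return sp_tflops * fp64_ratio

kepler_dp = dp_tflops(4.3, 1 / 3)    # GK110 (Tesla K40-class): ~1.4 TFLOPS DP
maxwell_dp = dp_tflops(6.6, 1 / 32)  # GM200 (Titan X-class):   ~0.2 TFLOPS DP

print(f"Kepler at 1/3 rate:   ~{kepler_dp:.1f} TFLOPS DP")
print(f"Maxwell at 1/32 rate: ~{maxwell_dp:.1f} TFLOPS DP")

That's roughly a 7x gap, which is why compute buyers have largely stayed on Kepler.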

unixbeard
Dec 29, 2004

I'm about to get an Nvidia 970. I want to play games (Windows) and do machine learning (Linux). I thought I would be able to spin up VMs depending on what I was doing, but apparently the GPU does not pass through with VMware, even using ESXi. Is this correct? Is dual booting my only option to be able to do both?

Also, when I look at the specs here http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications it says the 970 is only certified for Win 7 & 8. I was thinking of going with Win 10 - will that be OK?
