AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

weaaddar posted:

I'm somewhat space-constrained in my current build; is it a good idea to go for a mini-sized 970, or should I opt for a less powerful 960?
I'm using a ThinkServer TS140 with a Xeon E3-1225 v3, where I've replaced the PSU with a 500W unit from EVGA.

For the 970, I'm looking at this zotac card, the reviews are somewhat favorable:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814500362&cm_re=gtx_970_zotac-_-14-500-362-_-Product

For the 960, I'm looking at this evga card:
2gb
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487093&cm_re=evga_gtx_960-_-14-487-093-_-Product
4gb
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487154&cm_re=evga_gtx_960-_-14-487-093-_-Product

I guess my questions are: does the 4GB actually matter for the 960? And is the 970 worth the price premium?
I'd prefer it to be relatively quiet, but I figure it won't be as quiet as one of the larger cards.

I'm targeting 1080p in modern games.

The 970 is a much more powerful card; you should get it if you can afford it. The only compact 970s I have heard of people having problems with are the single-fan ones. As far as I know the Zotac one is fine; it might not overclock as high as the ones with a larger cooler, but other than that it should be just fine.

teagone
Jun 10, 2003

That was pretty intense, huh?

Nfcknblvbl posted:

Those slot CPUs look like a waste of material anyway.

It'd be a hell of a lot more convenient to upgrade from.

japtor
Oct 28, 2005

DuckConference posted:

Whatever happened to that Virtu MVP stuff that was big news around the Ivy Bridge launch and then went nowhere?

I think it was with Sandy Bridge, and it was basically so you could use the IGP features along with a normal GPU, right? I think Intel fixed the issue (where the IGP would be disabled without a display attached) with Ivy, so...

Gwaihir posted:

I think the answer was either "It was garbage with vastly different cards" or, more likely, it just tried to serve a market that doesn't exist.

It might make a (very minor) comeback though! People apparently use it with Thunderbolt for hacked-up external GPU setups, and with TB3 there's going to be official support for GPUs. Of course, Intel could just be doing their own software for the GPU switching (and/or AMD, who have demoed their GPUs with them).

SlayVus
Jul 10, 2009
Grimey Drawer
What is everyone's experience with GameStream? I found that every game I tried to play that had its own launcher either would not work past the launcher or just crashed. Is this normal or a fringe case?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

SlayVus posted:

What is everyone's experience with GameStream? I found that every game I tried to play that had its own launcher either would not work past the launcher or just crashed. Is this normal or a fringe case?

GameStream just got a beta update last night that should fix it crashing when the game ends unexpectedly... maybe that will solve it for you.

Anime Schoolgirl
Nov 28, 2002

teagone posted:

It'd be a hell of a lot more convenient to upgrade from.

Slot-era CPUs also didn't have pesky things like heat dissipation and 90°C temperatures to worry about.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I want them to go in the opposite direction and just design dual socket motherboards that take a CPU and a GPU chip.

Like, what would happen if Intel made a dual 2011 motherboard like you see for servers, then made one socket a Skylake with no IGP, and the other socket just a gigantic Iris-only chip?

Kazinsal
Dec 13, 2011



You'd be designing something unique and expensive that could be more efficiently implemented by slapping a big whopping Iris GPU on a PCIe card.

I can't really see any improvements you would get from having an Iris socket on your 2011 motherboard.

justdan
Mar 31, 2010

Rockybar posted:

I've been driving my 1440p monitor with my 2GB 560 Ti for too long now, and it's time to upgrade, mainly for Fallout 4, Battlefield, and then older stuff that didn't run too great anyway. Probably going to get a 970. Is EVGA still the manufacturer of choice (this one specifically)? It's either that or Gigabyte. I'm buying on Amazon and I've found their returns to be pretty good even with broken PC stuff.

I've had that exact card. It was loud and whiny and easily got to 70+ temps in moderate gaming.

I am SO glad I got rid of it and got the MSI Gaming version. Night and day: whisper quiet and cool as a cucumber.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Kazinsal posted:

You'd be designing something unique and expensive that could be more efficiently implemented by slapping a big whopping Iris GPU on a PCIe card.

I can't really see any improvements you would get from having an Iris socket on your 2011 motherboard.

Okay, well then imagine it was just a socket made to some new standard that Nvidia/AMD/Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. PCI-card graphics aren't going to be sustainable forever; with the next die shrink and all three companies having HBM, shouldn't that enable everything to fit handily on a mobo, even some of the smaller form factors?

Edit: Like, the Alienware Alpha solders a laptop 750 Ti onto the motherboard, but if they had designed a socket instead they could tout the upgradeability.

Rastor
Jun 2, 2001

Zero VGS posted:

Okay, well then imagine it was just a socket made to some new standard that Nvidia/AMD/Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. PCI-card graphics aren't going to be sustainable forever; with the next die shrink and all three companies having HBM, shouldn't that enable everything to fit handily on a mobo, even some of the smaller form factors?

Edit: Like, the Alienware Alpha solders a laptop 750 Ti onto the motherboard, but if they had designed a socket instead they could tout the upgradeability.

You're describing what sounds like nVidia's NVLink mezzanine connector, but so far nVidia only has IBM / POWER on board AFAIK.

LiquidRain
May 21, 2007

Watch the madness!

He is describing MXM.

SlayVus
Jul 10, 2009
Grimey Drawer
Adding expandability requires expanding the device's profile. For instance, the Alienware Alpha would require either MXM cards or a PCI-E slot to be upgradable. If you went with MXM, you're limited by the unit's already-installed thermal capacity. If you went with PCI-E, you'd have to increase both the length and the height of the unit. PCI-E would be cheaper, and you could set it up so that the only cards you can install are blower-style, thereby still avoiding the need for internal fans. Going MXM would require setting a max TDP limit on the cards the user can buy; you would probably want to set up your own web shop for customers to order upgraded MXM modules that you certify will work in the unit.

wolrah
May 8, 2006
what?

Zero VGS posted:

Okay, well then imagine it was just a socket made to some new standard that Nvidia/AMD/Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. PCI-card graphics aren't going to be sustainable forever; with the next die shrink and all three companies having HBM, shouldn't that enable everything to fit handily on a mobo, even some of the smaller form factors?

Edit: Like, the Alienware Alpha solders a laptop 750 Ti onto the motherboard, but if they had designed a socket instead they could tout the upgradeability.

Look into MXM. (edit: that's what I get for taking too long to respond) It's an nVidia-designed spec for replaceable graphics in laptops and other SFF/AiO PCs.

The tricky thing is that in these environments the cooling and power delivery are limited by the host system, whereas in the standard desktop PC card format it's pretty much a free-for-all as long as it fits within a relatively large area. Need extra power? Just add a 6/8-pin plug. MXM modules, on the other hand, have to place the parts requiring cooling in the same places so the chassis cooling system can be attached. If the chassis isn't designed with excess capacity, you can only upgrade within the same power/thermal "bin".

When it initially came out all the tech journalists were hyped up about finally having upgradeable graphics in laptops, but that really hasn't gone anywhere. Getting MXM cards isn't straightforward, and compatibility issues are common. It makes bumps to a product line easier for the OEM but does very little for the consumer.


I wouldn't hold my breath waiting for any truly common standard smaller than half-height cards. As far as thin gaming hardware goes, the popular option these days for retaining upgradeability seems to be using 90-degree PCIe risers or extension cables so that a standard dual-slot card can sit next to the motherboard instead of perpendicular to it.


Has anyone ever done a dual-slot half-height card? Taking a quick look at the OptiPlex 755s and 760s I have around, they have two slots to play with and are about the size of an original Xbox, which isn't too bad for HTPC/Steam Box type use.

Kazinsal
Dec 13, 2011



wolrah posted:

Has anyone ever done a dual-slot half-height card? Taking a quick look at the OptiPlex 755s and 760s I have around, they have two slots to play with and are about the size of an original Xbox, which isn't too bad for HTPC/Steam Box type use.

PowerColor did a dual-slot half-height Radeon 5750 that ran at reference speeds.

eggyolk
Nov 8, 2007


http://www.newegg.com/Product/Product.aspx?Item=N82E16814127836

Wouldn't something like this work? It includes the low profile bracket (but isn't in stock atm).

SwissArmyDruid
Feb 14, 2014

by sebmojo

wolrah posted:

Has anyone ever done a dual-slot half-height card? Taking a quick look at the OptiPlex 755s and 760s I have around, they have two slots to play with and are about the size of an original Xbox, which isn't too bad for HTPC/Steam Box type use.

Dual-slot as in it has two slots' worth of connectors, or as in it has a heatsink that occupies a second slot? My parts 750 Ti is the latter.

Germstore
Oct 17, 2012

A Serious Candidate For a Serious Time

Don Lapre posted:

There is no cooling benefit; if anything, cooling is worse, as you have less room.

I was thinking you could cool on both sides, but now that I think about it that's nonsense because the die's only on one side.

1gnoirents
Jun 28, 2014

hello :)

Germstore posted:

I was thinking you could cool on both sides, but now that I think about it that's nonsense because the die's only on one side.

This does bring up something I've wondered about in the past. The backside of chips, even though there is a PCB layer, is very hot. I always thought you could get effective cooling out of that, especially for GPUs, since there is space to do so (unlike a CPU, generally).

edit: I guess it'd just be too obnoxious to deal with the solder and whatnot. It'd have to be like a separate heat spreader of sorts built in, but the backside is extremely hot regardless, and it's how GPU temp is externally measured. There would definitely be a benefit in my mind if you could find a way.

1gnoirents fucked around with this message at 18:16 on Oct 28, 2015

PC LOAD LETTER
May 23, 2005
WTF?!
Passive airflow over the board due to heat-driven convection, plus heat dissipation through the copper in the mobo, is typically enough to cool most of the SMD chips short of the voltage regulation stuff, but even those tend to have some sort of chunk of aluminum on them as a heatspreader, if not an actual heatsink, these days on non-bottom-rung mobos.

For the GPU it could help, yeah, but getting something to work well without interfering with the solder on the back could be problematic. That, and it'd take up a fair amount of space, which could interfere with a CPU or another GPU. Easier to just beef up the main HSF.

PC LOAD LETTER fucked around with this message at 18:39 on Oct 28, 2015

Rockybar
Sep 3, 2008

justdan posted:

I've had that exact card. It was loud and whiny and easily got to 70+ temps in moderate gaming.

I am SO glad I got rid of it and got the MSI Gaming version. Night and day: whisper quiet and cool as a cucumber.

Just ordered the MSI version. It will be the best card I've owned (for its time of release), so I'm excited :pcgaming:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Zero VGS posted:

I want them to go in the opposite direction and just design dual socket motherboards that take a CPU and a GPU chip.

Like, what would happen if Intel made a dual 2011 motherboard like you see for servers, then made one socket a Skylake with no IGP, and the other socket just a gigantic Iris-only chip?

I don't know but I'd hang an NH-D15 or similar off it for sure.

SlayVus
Jul 10, 2009
Grimey Drawer
So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM?

http://lime-technology.com/

https://www.youtube.com/watch?v=LuJYMCbIbPk

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Because it virtualizes better, likely. Most VMs don't try very hard to hide the fact that they're present.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

SlayVus posted:

So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM?

http://lime-technology.com/

https://www.youtube.com/watch?v=LuJYMCbIbPk

does it have something to do with the 3 separate graphics cards being used?

SlayVus
Jul 10, 2009
Grimey Drawer

Fauxtool posted:

does it have something to do with the 3 separate graphics cards being used?

I think they used just what they had on hand. The only stipulation they put forth was that you can't use the exact same model of USB keyboard/mouse for each VM. They all had to be different, but other than that they didn't stipulate anything about the GPUs.

Fauxtool posted:

He said something about 1 lovely card to boot and 1 card per VM. Functionally it seems like 2 computers with only the CPU being shared. A lot of the functions have their own non-shared parts.

Said lovely card could also be the iGPU on the CPU.

\/\/

SlayVus fucked around with this message at 05:26 on Oct 29, 2015

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

SlayVus posted:

I think they used just what they had on hand. The only stipulation they put forth was that you can't use the exact same model of USB keyboard/mouse for each VM. They all had to be different, but other than that they didn't stipulate anything about the GPUs.

He said something about 1 lovely card to boot and 1 card per VM. Functionally it seems like 2 computers with only the CPU being shared. A lot of the functions have their own non-shared parts.

I don't know poo poo about poo poo when it comes to VMs, so I could be totally wrong.

Fauxtool fucked around with this message at 05:25 on Oct 29, 2015

Seamonster
Apr 30, 2007

IMMER SIEGREICH
It's WCCF, but some 380X details:

http://wccftech.com/amd-radeon-r9-380x-confirmed-feature-256bit-bus-4-gb-gddr5-vram-antigua-xt-arriving-consumers-november/

1gnoirents
Jun 28, 2014

hello :)
Wow I figured someone just forgot about that card lol

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

PC LOAD LETTER posted:

Passive airflow over the board due to heat-driven convection, plus heat dissipation through the copper in the mobo, is typically enough to cool most of the SMD chips short of the voltage regulation stuff, but even those tend to have some sort of chunk of aluminum on them as a heatspreader, if not an actual heatsink, these days on non-bottom-rung mobos.

For the GPU it could help, yeah, but getting something to work well without interfering with the solder on the back could be problematic. That, and it'd take up a fair amount of space, which could interfere with a CPU or another GPU. Easier to just beef up the main HSF.

I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate? Just a small heatsink, maybe 1/8" high, with a thermal pad between the backplate and PCB where the GPU is. Or just put small fins on the backplate itself with thermal pads underneath for all the hot components, so there's some passive cooling. Probably more cost/work than it would be worth, I'd guess.

1gnoirents
Jun 28, 2014

hello :)

Ozz81 posted:

I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate? Just a small heatsink, maybe 1/8" high, with a thermal pad between the backplate and PCB where the GPU is. Or just put small fins on the backplate itself with thermal pads underneath for all the hot components, so there's some passive cooling. Probably more cost/work than it would be worth, I'd guess.

I looked into it briefly because it's hard for me to imagine it not being effective, since with thermal cameras you can see how hot the board itself gets (and how cooling directly affects both sides). I found a pretty vague and clearly poorly done job on a 9800 that gave the guy an 11-degree drop. I'm going to guess it's just not worth the hassle, especially since a modern card isn't really thermally limited to begin with anymore. There were those who starkly opposed the concept altogether, which I believe is just incorrect, but I'm still leaning towards "not worth it at all".

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Since this seems to be the default Nvidia thread, a question about VIA's upcoming 28/16nm CPUs: could Nvidia inject cash into VIA and get a licensing deal where it's VIA-designed CPUs with an Nvidia iGPU for use in Tegra products? Would this trigger some bullshit with Intel? If not, couldn't Nvidia stealth-maneuver into the x86 market by holding VIA's purse strings? Wouldn't this be pretty much VIA's big break as well?

EmpyreanFlux fucked around with this message at 18:08 on Oct 30, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo
In theory, sure, provided that VIA could demonstrate the existence of a growth market segment in which Nvidia could reliably grow an IGP business.

Can VIA demonstrate the existence of said segment? Ehhhhhhhhhhh.....

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Well it would potentially benefit nv to get VIA's chip designers.

(What I'm saying is that Denver was bad.)

b0lt
Apr 29, 2005

SlayVus posted:

So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM?

http://lime-technology.com/

https://www.youtube.com/watch?v=LuJYMCbIbPk

Currently, NVIDIA is only using the CPUID hypervisor vendor ID and the paravirtual Hyper-V enlightenment vendor ID to detect virtualization, which can be set to whatever the hypervisor wants to set them to. If NVIDIA really wants to, there are lots of things they can look for that are hard to hide (especially timing stuff, since they have access to a device with a trusted clock).
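(Not from the thread, but for the curious: a minimal sketch of the CPUID side of that check, assuming GCC or Clang on x86. Leaf 1's ECX bit 31 is the "hypervisor present" flag, and leaf 0x40000000 returns a vendor signature that the hypervisor is free to report as anything it likes, which is the loophole being discussed.)

```c
/*
 * Minimal sketch (not from the thread, and not NVIDIA's actual code):
 * reading the CPUID leaves a driver could use to notice a hypervisor.
 * Assumes GCC or Clang on x86/x86-64.
 */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1: ECX bit 31 is the "hypervisor present" flag. */
    __cpuid(1, eax, ebx, ecx, edx);
    int hv_present = (ecx >> 31) & 1;
    printf("hypervisor bit: %d\n", hv_present);

    if (hv_present) {
        /* Leaf 0x40000000: the hypervisor vendor signature in EBX/ECX/EDX,
         * e.g. "KVMKVMKVM" or "Microsoft Hv". A hypervisor can report an
         * arbitrary string here, which is what makes the check spoofable. */
        char vendor[13] = {0};
        __cpuid(0x40000000, eax, ebx, ecx, edx);
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &ecx, 4);
        memcpy(vendor + 8, &edx, 4);
        printf("hypervisor vendor id: \"%s\"\n", vendor);
    }
    return 0;
}
```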

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
The funny thing is I've been doing that "headless" stuff without virtualization...

I ran two instances of Final Fantasy XIV, each on a different television, two different accounts, two Xbox controllers, one PC/GPU, and it worked perfectly fine. There was even an unofficial split-screen patch for Borderlands 2 and the Pre-Sequel which worked flawlessly.

Then along comes Resident Evil 5, a game that was specifically designed for co-op to such an extent that I would argue it's not even intended to be experienced single player, and they couldn't get that to work in the PC port.

I guess what I'm saying is there wouldn't be a need for local virtualization if PC developers were held to a modicum of work: enable split-screen when it's present on the console versions, or at the very least allow two instances of any game to run at once. I don't even think that requires much work beyond detecting the additional mouse/keyboard and binding it to the second instance.
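(A rough sketch, not from the thread: enumerating attached keyboards and mice with the Win32 Raw Input API, which is roughly the "detect the additional mouse/keyboard" step described above. Windows-only, linked against user32; the actual per-instance binding logic is left out.)

```c
/*
 * Rough sketch (not from the thread): list attached keyboards and mice via
 * the Win32 Raw Input API. Each device gets its own handle, so a second
 * keyboard/mouse is trivially distinguishable from the first.
 */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    UINT count = 0;

    /* First call with a NULL buffer just reports how many devices exist. */
    if (GetRawInputDeviceList(NULL, &count, sizeof(RAWINPUTDEVICELIST)) != 0)
        return 1;

    RAWINPUTDEVICELIST *list = malloc(count * sizeof(*list));
    if (!list)
        return 1;

    UINT got = GetRawInputDeviceList(list, &count, sizeof(RAWINPUTDEVICELIST));
    if (got == (UINT)-1) {
        free(list);
        return 1;
    }

    for (UINT i = 0; i < got; i++) {
        const char *type =
            list[i].dwType == RIM_TYPEKEYBOARD ? "keyboard" :
            list[i].dwType == RIM_TYPEMOUSE    ? "mouse"    : "other";
        printf("device %u: %s (handle %p)\n", i, type, (void *)list[i].hDevice);
    }

    free(list);
    return 0;
}
```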

PC LOAD LETTER
May 23, 2005
WTF?!

Ozz81 posted:

I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate?

It's sorta been done before; it looked and worked much as 1gnoirents described.

I haven't seen anyone try it in a long time with a GPU, but it usually involved taking an older HSF they had lying around and using a bunch of zip ties and thermal pads to get it to stay on the back of the card.

It's all the solder bumps n' stuff on the back that puts the kibosh on the effectiveness, I think. You just can't get really good HSF-to-card/GPU contact, and the amount of TIM you have to use to prevent a short is just plain silly.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

xthetenth posted:

Well it would potentially benefit nv to get VIA's chip designers.

(What I'm saying is that Denver was bad.)

Wut, where does this come from? Denver's performance is great unless you focus on a single micro-benchmark no one (except AnandTech) considers indicative of anything anymore. It's one of the fastest non-Apple ARM cores, and remember it has a process node disadvantage compared to those.

Hiowf fucked around with this message at 09:05 on Oct 30, 2015

Panty Saluter
Jan 17, 2004

Making learning fun!
https://www.youtube.com/watch?v=6smx6S2G-D0

Not to drag up old bullshit, but I just watched this video while trawling through Newegg, and did AMD switch back to the Socket A brand for Piledriver??? :psyduck:

(jump to 5:40, I can't remember how to set a start time with embedded video)

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

FaustianQ posted:

Since this seems to be the default Nvidia thread, a question about VIA's upcoming 28/16nm CPUs: could Nvidia inject cash into VIA and get a licensing deal where it's VIA-designed CPUs with an Nvidia iGPU for use in Tegra products? Would this trigger some bullshit with Intel? If not, couldn't Nvidia stealth-maneuver into the x86 market by holding VIA's purse strings? Wouldn't this be pretty much VIA's big break as well?

AFAIK Nvidia holds an x86 license which basically says "you can use our technology but never make a real CPU, only PCIe add-in cards".
