weaaddar posted:I'm somewhat space constrained in my current build, is it a good idea to go for a mini sized 970, or should I opt for a less powerful 960? The 970 is a much more powerful card; you should get it if you can afford it. The only compact 970s I've heard of people having problems with are the single-fan ones. As far as I know the Zotac one is fine; it might not overclock as high as the ones with a larger cooler, but other than that it should be just fine.
|
|
# ? Oct 28, 2015 02:58 |
|
Nfcknblvbl posted:Those slot-cpus look like a waste of material any way. It'd be a hell of a lot more convenient to upgrade from.
|
# ? Oct 28, 2015 07:24 |
|
DuckConference posted:Whatever happened to that virtu mvp stuff that was big news around the ivy bridge launch and then went nowhere Gwaihir posted:I think the answer was either "It was garbage with vastly different cards" or more likely it just tried to serve a market that doesn't exist.
|
# ? Oct 28, 2015 07:54 |
|
What is everyone's experience with GameStream? I found that every game I tried to play that had its own launcher would either not get past the launcher or just crash. Is this normal, or a fringe case?
|
# ? Oct 28, 2015 13:44 |
|
SlayVus posted:What is everyone's experience with game stream? I found that every game I tried to play that had its own launcher would not work past the launcher or just crash. Is this normal or fringe case? GameStream just got a beta update last night that's supposed to fix crashes when a game ends unexpectedly... Maybe that will solve it for you.
|
# ? Oct 28, 2015 14:52 |
|
teagone posted:It'd be a hell of a lot more convenient to upgrade from. Slot-era CPUs also didn't have pesky things like heat dissipation and 90°C temperatures to worry about.
|
# ? Oct 28, 2015 14:56 |
|
I want them to go in the opposite direction and just design dual-socket motherboards that take a CPU and a GPU chip. Like, what would happen if Intel made a dual LGA 2011 motherboard like you see for servers, then made one socket a Skylake with no IGP, and the other socket just a gigantic Iris-only chip?
|
# ? Oct 28, 2015 15:37 |
|
You'd be designing something unique and expensive that could be more efficiently implemented by slapping a big whopping Iris GPU on a PCIe card. I can't really see any improvements you would get from having an Iris socket on your 2011 motherboard.
|
# ? Oct 28, 2015 15:42 |
|
Rockybar posted:I've been driving my 1440p monitor with my 2gb 560ti for too long now, and it's time to upgrade, mainly for Fallout 4, Battlefield, and then older stuff that didn't run too great anyway. Probably going to get a 970. Is EVGA still manufacturer of choice (this one specifically)? It's either that or gigabyte. Buying on Amazon and I've found their returns to be pretty good even with broken PC stuff. I've had that exact card. It was loud and whiny and easily got to 70+ temps in moderate gaming. I am SO glad I got rid of it and got the MSI Gaming version. Night and day: whisper quiet and cool as a cucumber.
|
# ? Oct 28, 2015 15:47 |
|
Kazinsal posted:You'd be designing something unique and expensive that could be more efficiently implemented by slapping a big whopping Iris GPU on a PCIe card. Okay, well then imagine it was just a socket made to some new standard that Nvidia, AMD, and Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. PCIe card graphics aren't going to be sustainable forever; after the next die shrink, with all three companies using HBM, shouldn't everything fit handily on a mobo, even some of the smaller form factors? Edit: Like, the Alienware Alpha solders a laptop 750 Ti onto the motherboard, but if they had designed a socket instead they could tout the upgradeability.
|
# ? Oct 28, 2015 15:56 |
|
Zero VGS posted:Okay, well then imagining it was just a socket made with some new standard that both Nvidia/AMD/Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. The PCI card graphics aren't going to be sustainable forever; the next die shrink with all three companies having HBM, shouldn't that enable everything to handily fit on a mobo, even some of the smaller forms? You're describing what sounds like nVidia's NVLink mezzanine connector, but so far nVidia only has IBM / POWER on board AFAIK.
|
# ? Oct 28, 2015 16:06 |
|
He is describing MXM.
|
# ? Oct 28, 2015 16:25 |
|
To add expandability requires expanding the device's profile. For instance, the Alienware Alpha would require either MXM cards or a PCI-E slot to be upgradable. If you went with MXM, you're limited by the unit's already-installed thermal capacity. If you went with PCI-E, you have to increase both the length and height of the unit. PCI-E would be cheaper, and you could set it up so the only cards you can install are blower-style, thereby still avoiding the need for internal fans. Going MXM would require setting a max TDP limit on the cards the user can buy. You would probably want to set up your own web shop for customers to order upgraded MXM modules that you certify will work in the unit.
|
# ? Oct 28, 2015 16:43 |
|
Zero VGS posted:Okay, well then imagining it was just a socket made with some new standard that both Nvidia/AMD/Intel could all use to slot a GPU in. I mean, it would revolutionize small-form-factor gaming PCs. The PCI card graphics aren't going to be sustainable forever; the next die shrink with all three companies having HBM, shouldn't that enable everything to handily fit on a mobo, even some of the smaller forms? Look into MXM. (edit: that's what I get for taking too long to respond) It's an nVidia-designed spec for replaceable graphics in laptops and other SFF/AiO PCs. The tricky thing is that in these environments the cooling and power delivery are limited by the host system, whereas in a standard desktop PC card format it's pretty much a free-for-all as long as it fits within a relatively large area. Need extra power? Just add a 6/8-pin plug. MXM modules, on the other hand, have to place the parts requiring cooling in the same places so the chassis cooling system can be attached, and if the chassis isn't designed with excess capacity you can only upgrade within the same power/thermal "bin". When it initially came out all the tech journalists were hyped about finally having upgradeable graphics in laptops, but that really hasn't gone anywhere. Getting MXM cards isn't straightforward and compatibility issues are common. It makes bumps to a product line easier for the OEM but does very little for the consumer. I wouldn't hold my breath waiting for any truly common standard smaller than half-height cards. As far as thin gaming hardware goes, the popular option these days for retaining upgradeability seems to be using 90-degree PCIe risers or extension cables so a standard dual-slot card can sit next to the motherboard instead of perpendicular to it. Has anyone ever done a dual-slot half-height card?
Taking a quick look at the Optiplex 755s and 760s I have around, they have two slots to play with and are about the size of an original Xbox, which isn't too bad for HTPC/SteamBox-type use.
|
# ? Oct 28, 2015 16:55 |
|
wolrah posted:Has anyone ever done a dual-slot half height? Taking a quick look at the Optiplex 755s and 760s I have around they have two slots to play with and are about the size of an original Xbox which isn't too bad for HTPC/SteamBox type use. PowerColor did a dual-slot half-height Radeon 5750 that ran at reference speeds.
|
# ? Oct 28, 2015 17:00 |
|
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127836 Wouldn't something like this work? It includes the low-profile bracket (but isn't in stock atm).
|
# ? Oct 28, 2015 17:16 |
|
wolrah posted:Has anyone ever done a dual-slot half height? Taking a quick look at the Optiplex 755s and 760s I have around they have two slots to play with and are about the size of an original Xbox which isn't too bad for HTPC/SteamBox type use. Dual-slot as in it has two slots' worth of connectors, or as in it has a heatsink that occupies a second slot? My parts-bin 750 Ti is the latter.
|
# ? Oct 28, 2015 17:22 |
|
Don Lapre posted:There is no cooling benefit, if anything cooling is worse as you have less room I was thinking you could cool on both sides, but now that I think about it that's nonsense because the die's only on one side.
|
# ? Oct 28, 2015 18:05 |
|
Germstore posted:I was thinking you could cool on both sides, but now that I think about it that's nonsense because the die's only on one side. This does bring up something I've wondered about in the past. The backside of chips, even though there is a PCB layer, is very hot. I always thought you could get effective cooling out of that, especially for GPUs, since there is space to do so (unlike a CPU, generally). Edit: I guess it'd just be too obnoxious to do with the solder and whatnot. It'd have to be like a separate heat spreader of sorts built in, but the backside is extremely hot regardless, and it's where GPU temperature is externally measured. There would definitely be a benefit in my mind if you could find a way. 1gnoirents fucked around with this message at 18:16 on Oct 28, 2015 |
# ? Oct 28, 2015 18:12 |
|
Passive airflow over the board from heat-driven convection, plus heat dissipation through the copper in the mobo, is typically enough to cool most of the SMD chips short of the voltage-regulation stuff, and even those tend to have some chunk of aluminum on them as a heatspreader, if not an actual heatsink, on non-bottom-rung mobos these days. For the GPU it could help, yeah, but getting something to work well without interfering with the solder on the back could be problematic. That, and it'd take up a fair amount of space, which could interfere with a CPU or another GPU. Easier to just beef up the main HSF. PC LOAD LETTER fucked around with this message at 18:39 on Oct 28, 2015 |
# ? Oct 28, 2015 18:37 |
|
justdan posted:I've had that exact card. It was loud and whiney and easily got to 70+ temps in moderate gaming. Just ordered the MSI version. It will be the best card I've owned (for the time it was released), so I'm excited.
|
# ? Oct 29, 2015 00:00 |
|
Zero VGS posted:I want them to go in the opposite direction and just design dual socket motherboards that take a CPU and a GPU chip. I don't know but I'd hang an NH-D15 or similar off it for sure.
|
# ? Oct 29, 2015 00:47 |
|
So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM? http://lime-technology.com/ https://www.youtube.com/watch?v=LuJYMCbIbPk
|
# ? Oct 29, 2015 05:07 |
|
Because it virtualizes better, likely. Most VMs don't try very hard to hide the fact that they're present.
|
# ? Oct 29, 2015 05:09 |
|
SlayVus posted:So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM? Does it have something to do with the 3 separate graphics cards being used?
|
# ? Oct 29, 2015 05:10 |
|
Fauxtool posted:does it have something to do with the 3 separate graphics cards being used? I think they used just what they had on hand. The only stipulation they put forth was that you can't use the exact same model of USB keyboard/mouse for each VM. They all had to be different, but other than that they didn't stipulate anything about the GPUs. Fauxtool posted:he said something about 1 lovely card to boot and 1 card per VM. Functionally it seems like 2 computers with only the cpu being shared. A lot of the functions have their own non-shared parts Said lovely card could also be the iGPU on the CPU. \/\/ SlayVus fucked around with this message at 05:26 on Oct 29, 2015 |
# ? Oct 29, 2015 05:17 |
|
SlayVus posted:I think they used just what they had on hand. The only stipulations they put forth was that you can't use the exact same model USB keyboard/mouse for each VM. They all had to be different, but other than that they didn't stipulate anything on the GPUs. He said something about 1 lovely card to boot and 1 card per VM. Functionally it seems like 2 computers with only the CPU being shared; a lot of the functions have their own non-shared parts. I don't know poo poo about poo poo when it comes to VMs, so I could be totally wrong. Fauxtool fucked around with this message at 05:25 on Oct 29, 2015 |
# ? Oct 29, 2015 05:21 |
|
It's WCCF, but some 380X details: http://wccftech.com/amd-radeon-r9-380x-confirmed-feature-256bit-bus-4-gb-gddr5-vram-antigua-xt-arriving-consumers-november/
|
# ? Oct 29, 2015 13:59 |
|
Wow, I figured someone had just forgotten about that card, lol.
|
# ? Oct 29, 2015 16:05 |
|
PC LOAD LETTER posted:Passive airflow over the board due to heat driven convection + heat dissipation through the copper in the mobo is typically enough to cool most of the SMD chips short of the voltage regulated stuff but even those tend to have some sort of chunk of aluminum on them as a heatspreader if not actual heatsink these days on non-bottom rung mobos. I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate? Just a small heatsink, maybe 1/8" high, with a thermal pad between the backplate and PCB where the GPU is. Or just put small fins on the backplate itself with thermal pads underneath for all the hot components, so there's some passive cooling. Probably more cost/work than it would be worth, I'd guess.
|
# ? Oct 29, 2015 18:24 |
|
Ozz81 posted:I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate? Just a small heatsink maybe 1/8" high with a thermal pad between the backplate and PCB where the GPU is. Or just put small fins on the backplate itself with thermal pads underneath for all the hot components so there's some passive cooling. Probably more cost/work than it would be worth I'd guess. I looked into it briefly, because it's hard for me to imagine it not being effective when thermal cameras show how hot the board itself gets (and how cooling directly affects both sides), and I found a pretty vague and clearly poorly done job on a 9800 that gave him an 11-degree drop. I'm going to guess it's just not worth the hassle, especially since a modern card isn't really thermally limited to begin with anymore. There were those who starkly opposed the concept altogether, which I believe is just incorrect, but I'm still leaning towards "not worth it at all".
|
# ? Oct 29, 2015 18:31 |
|
Since this seems to be the default Nvidia thread, a question about VIA's upcoming 28/16nm CPUs: could Nvidia inject cash into VIA and get a licensing deal where it's VIA-designed CPUs with an Nvidia iGPU for use in Tegra products? Would this trigger some bullshit with Intel? If not, couldn't Nvidia stealth-maneuver into the x86 market by holding VIA's purse strings? Wouldn't this be pretty much VIA's big break as well?
EmpyreanFlux fucked around with this message at 18:08 on Oct 30, 2015 |
# ? Oct 29, 2015 22:22 |
|
In theory, sure, provided that VIA could demonstrate the existence of a growth market segment in which Nvidia could reliably grow an IGP business. Can VIA demonstrate the existence of said segment? Ehhhhhhhhhhh.....
|
# ? Oct 29, 2015 23:33 |
|
Well, it would potentially benefit nV to get VIA's chip designers. (What I'm saying is that Denver was bad.)
|
# ? Oct 30, 2015 02:22 |
|
SlayVus posted:So Linus Tech Tips did a video a couple days ago on a dual-head VM gaming machine. Why is it possible to get Nvidia cards working using this software when Nvidia specifically prohibits desktop gamer GPUs from being used in a VM? Currently, NVIDIA is only using the cpuid and paravirtual Hyper-V enlightenment vendor IDs to detect virtualization, and those can be set to whatever the hypervisor wants to set them to. If NVIDIA really wants to, there are lots of things they can look for that are hard to hide (especially timing stuff, since they have access to a device with a trusted clock).
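For reference, the usual KVM/libvirt workaround is exactly that: overriding those IDs so the stock signatures never reach the guest. A sketch of the relevant libvirt domain XML (the vendor_id value is an arbitrary placeholder, and recent libvirt/QEMU is assumed):

```xml
<features>
  <hyperv>
    <!-- Keep the Hyper-V enlightenments but report a non-stock
         vendor string; "whatever" is an arbitrary placeholder. -->
    <vendor_id state='on' value='whatever'/>
  </hyperv>
  <kvm>
    <!-- Hide the KVM CPUID signature from the guest. -->
    <hidden state='on'/>
  </kvm>
</features>
```

On a raw QEMU command line the same thing is roughly `-cpu host,kvm=off,hv_vendor_id=whatever`.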
|
# ? Oct 30, 2015 02:27 |
|
The funny thing is I've been doing that "headless" stuff without virtualization... I ran two instances of Final Fantasy XIV, each on a different television, two different accounts, two Xbox controllers, one PC/GPU, and it worked perfectly fine. There was even an unofficial split-screen patch for Borderlands 2 and the Pre-Sequel which worked flawlessly. Then along comes Resident Evil 5, a game so specifically designed for co-op that I would argue it's not even intended to be experienced single player, and they couldn't get that to work in the PC port. I guess what I'm saying is there wouldn't be a need for local virtualization if PC developers were held to doing a modicum of work to enable split-screen when it's present on the console versions, or at the very least allowed two instances of any game to run at once. I don't even think that requires any work beyond detecting the additional mouse/kb and binding it to the second instance.
|
# ? Oct 30, 2015 02:49 |
|
Ozz81 posted:I dunno, could it be theoretically possible to integrate a small heatsink onto a GPU backplate? I haven't seen anyone try it in a long time with a GPU, but it usually involved taking an older HSF they had lying around and using a bunch of zip ties and thermal pads to get it to stay on the back of the card. It's all the solder bumps n' stuff on the back that puts the kibosh on the effectiveness, I think. You just can't get really good HSF-to-card/GPU contact, and the amount of TIM you have to use to prevent a short is just plain silly.
|
# ? Oct 30, 2015 07:39 |
|
xthetenth posted:Well it would potentially benefit nv to get VIA's chip designers. Wut, where does this come from? Denver's performance is great unless you focus on a single micro-benchmark no one (except AnandTech) considers indicative of anything anymore. It's one of the fastest non-Apple ARM cores, and remember it has a process-node disadvantage compared to those. Hiowf fucked around with this message at 09:05 on Oct 30, 2015 |
# ? Oct 30, 2015 08:47 |
|
https://www.youtube.com/watch?v=6smx6S2G-D0 Not to drag up old bullshit, but I just watched this video while trawling through Newegg: did AMD switch back to the Socket A brand for Piledriver??? (jump to 5:40; I can't remember how to set a start time with embedded video)
|
# ? Oct 30, 2015 13:15 |
|
FaustianQ posted:Since this seems to be the default Nvidia thread, question about VIA's upcoming 28/16nm CPUs, could Nvidia inject cash into VIA and get a licensing deal where it's VIA-designed CPUs with Nvidia iGPU for use in Tegra products? Would this trigger some bullshit with Intel? If not couldn't Nvidia stealth maneuver into the x86 market by holding VIA's purse strings? Wouldn't this be pretty much VIA's big break as well? Afaik Nvidia holds an x86 license which basically says "you can use our technology but never make a real CPU, only PCIe add-in cards".
|
# ? Oct 30, 2015 13:27 |