|
Atomizer posted:Since there are some half-height 1050/Ti options coming out I was curious about the possibility of putting one of those into one of the business-class SFFs from HP, Dell, etc., something like a Lenovo Thinkcentre M9x or similar. While it can be hard to find details I read that some of the boards limit power to the PCIe 16x slots to 40 W. I know the 1050s can use 75 W max, from the slot with no external connectors, which is what makes them attractive for throwing in prebuilts. How would an AIB function if it could only draw 40 W? Would it work fine but with a handicap due to the power limitation, run unstably, or not run at all? Funnily enough I was thinking about this just a couple of days ago. The only confirmation I could find was here https://www.amazon.com/MSI-GTX-1050-TI-4GT/dp/B01N2W8MJ9/ where the second review has someone mention that it works in the Dell 9020 SFF, however it must be placed in the 4x PCIe slot because the 16x slot is right up against the PSU. Let me know if you end up doing this, I'd be interested in a second confirmation. Seems like a great way to make a budget rig since you can get used Dell 9020s for $200ish on eBay.
|
# ? Feb 15, 2017 01:28 |
|
|
|
When people say "half height", do they mean half the height of the cards that used to be called "double height", i.e. a return to how tall graphics cards used to be in the early/mid 00's? Or are they even slimmer than old GPUs?
|
# ? Feb 15, 2017 02:12 |
|
Which 1070 card is the goon recommended one to grab?
|
# ? Feb 15, 2017 02:16 |
|
Grundulum posted:When people say "half height", do they mean half the height of the cards that used to be called "double height", i.e. a return to how tall graphics cards used to be in the early/mid 00's? Or are they even slimmer than old GPUs? This is a half-height card. Sometimes called low-profile. They're for small form factor computers where a full-height slot doesn't work. They almost always come with both brackets as shown so you can use them in normal computers as well. I think you're thinking of dual-slot graphics cards, which take up two side-by-side slots. This is still the standard for high-end GPUs. There are even half-height dual-slot cards. There's also full length cards, which are the reason some cases have a slotted area at the front. Those basically don't exist in the consumer market. This is obviously the exact opposite of that, it's basically as short as a functioning PCIe card can possibly be. wolrah fucked around with this message at 02:52 on Feb 15, 2017 |
# ? Feb 15, 2017 02:50 |
|
mrpeaches posted:Funnily enough I was thinking about this just a couple of days ago. The only confirmation I could find was here https://www.amazon.com/MSI-GTX-1050-TI-4GT/dp/B01N2W8MJ9/ where the second review has someone mention that it works in the Dell 9020 SFF, however it must be placed in the 4x PCIe slot because the 16x slot is right up against the PSU. Yeah, some of the prebuilt SFF PCs I saw had the 16x slot all the way at the bottom with the PSU right next to it, so they can't fit a dual-width AIB. I'm kind of settling on the Lenovo ThinkCentre SFFs because not only do they come with decent specs and expandability (4x DIMM slots, at least 2x HDDs, up to 4 expansion slots) but they're actually made for dual GPUs: This board, again, is limited to 40 W per PCIe 16x slot though, and those Quadros in the photo are only rated for 32 W, in case you were curious. I'd be perfectly fine if the 1050 Ti (or even a base 1050) were capped due to the power limitation, because you'd still end up with a cheap, compact, moderately powerful system that was simple to upgrade. I just can't find any details about what would happen in such a power-limited situation. Anyone have any insight on this scenario? 89 posted:Which 1070 card is the goon recommended one to grab? For what it's worth, I'm happy with the Zotac Mini that I put in my main gaming desktop: https://www.amazon.com/gp/product/B01LLAJ8PU/ This particular system has size constraints that limit maximum length and height, and this was the only one I found at the time that could fit. It works as expected, though!
|
# ? Feb 15, 2017 08:38 |
|
Atomizer posted:I just can't find any details about what would happen in such a power-limited situation. For the most part, GPUs aren't smart enough to notice there's a power limitation and, when they go to draw whatever their max power is during use, will simply overload the PSU (which will then either shutdown to protect itself, or run higher than spec, likely killing it at some point). What you can do, though, is use one of the many overclocking programs available, like Afterburner, and underclock/undervolt the GPU. That will have the effect of reducing overall power use--people do this all the time with 480's for various reasons. Depending on the particular card and how much power you need to shave off you may lose some performance doing so, but it might let you slip inside the wattage envelope you need when you otherwise would be over budget.
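For what it's worth, the reason a modest undervolt/underclock buys disproportionate savings is that dynamic power in a chip scales roughly with voltage squared times frequency. A back-of-envelope sketch of that scaling (all numbers here are hypothetical illustrations, not measurements from any real card):

```python
# Rough sketch of why undervolting saves so much power: dynamic power in
# CMOS scales roughly with capacitance * voltage^2 * frequency (P ~ C*V^2*f).
# All figures below are illustrative, not measured values for any real GPU.

def dynamic_power(base_watts, v_scale, f_scale):
    """Estimate dynamic power after scaling voltage and clock.

    base_watts: baseline dynamic power draw at stock settings
    v_scale: new core voltage as a fraction of stock (e.g. 0.9 = -10%)
    f_scale: new core clock as a fraction of stock
    """
    return base_watts * (v_scale ** 2) * f_scale

baseline = 75.0  # hypothetical slot-powered card at full load
tuned = dynamic_power(baseline, v_scale=0.9, f_scale=0.9)
print(f"{tuned:.1f} W")  # ~54.7 W
```

By this rough model, a ~10% undervolt plus ~10% underclock trims about a quarter of the dynamic draw while costing only ~10% of the clock speed, which is why Afterburner-style tuning can pull a card a long way down toward a tight power budget.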
|
# ? Feb 15, 2017 13:44 |
|
https://twitter.com/nvidia/status/831908618232422401 1080ti incoming? Maybe? Possibly? Who knows.
|
# ? Feb 15, 2017 18:30 |
|
since thats actually coming from nvidia i can only assume so.
|
# ? Feb 15, 2017 19:37 |
|
That + Pascal hype event, yeah.
|
# ? Feb 15, 2017 19:48 |
|
repiv posted:https://twitter.com/nvidia/status/831908618232422401 Man, can they ever have these events not on a stupid work night? They're usually kinda fun, but making that trek from Sacramento is so much less appealing each time. Also, some loudmouth still owes me $20 from a bet that I couldn't beat him at a track/car of his choice in iRacing, set up on a Surround 3D racing rig at the last one (GTX 680 unveiling). I did. Guess that teaches me: make sure the money comes out and is in the hands of a trusted 3rd party so the loser can't just slip away.
|
# ? Feb 15, 2017 19:58 |
|
EdEddnEddy posted:Man can they ever have these events not on a stupid work night? always have a third party in all matters racing https://www.youtube.com/watch?v=CP-DEuIt_V8&t=271s
|
# ? Feb 15, 2017 22:37 |
|
"Capsaicin and Cream" event the same day as Nvidia's thing. Yawn. So we get Vega in May, right? How the hell can there be so much Zen leak news, but not a goddamn peep (relatively speaking) about Vega? SwissArmyDruid fucked around with this message at 00:39 on Feb 16, 2017 |
# ? Feb 16, 2017 00:36 |
|
Because the Zen product is interesting and competitive.
|
# ? Feb 16, 2017 00:50 |
Subjunctive posted:Because the Zen product is interesting and competitive. We hope.
|
|
# ? Feb 16, 2017 00:55 |
|
I actually caught myself thinking "wow, this Zen lineup looks very un-AMD-like and complete and competitive and -" wait a minute, it's February 2017. I'd loving hope so. I also hope so for Vega at this point, but frankly I don't have anything hopeful to go on other than "they said it would be competitive". Which by all accounts it is... if it had been released 10 months ago, anyway.
|
# ? Feb 16, 2017 01:20 |
|
Where is anime school girl? MSI just released a low profile single slot design! https://videocardz.com/65991/msi-launches-low-profile-radeon-rx-460 With a dual slot cooler attached, but hey can't have everything. Maybe you can find a Cape Verde or Bonaire HSF to replace it?
|
# ? Feb 16, 2017 05:19 |
|
FaustianQ posted:Where is anime school girl? MSI just released a low profile single slot design! Wish that HDMI was a DisplayPort for MST
|
# ? Feb 16, 2017 05:43 |
|
Intel GPUs get Vulkan support. https://arstechnica.com/gadgets/2017/02/intels-newest-gpu-driver-adds-vulkan-support-for-skylake-and-kaby-lake-gpus/ FaustianQ posted:Where is anime school girl? MSI just released a low profile single slot design! Just because it uses a single-slot bracket does not make it a single-slot card. I think Anime Schoolgirl genuinely wants a low-profile single-slot card, as in, an HD 7750: Otherwise, they might have already gotten like, one of the MSI 1050 Tis by now. (Get a new frikkin' case, AS!)
|
# ? Feb 16, 2017 08:06 |
|
I no longer expressly need such a card since the computer I'd like it for fried due to a 145-150v/10a power outlet. I'm going to settle for a regular ole boringnuts card with one of those xbone cases, as well as plug it, and the TV it's attached to, into an extension cord.
|
# ? Feb 16, 2017 08:53 |
|
SwissArmyDruid posted:Just because it uses a single-slot bracket does not make it a single-slot card. That was the joke I was making. AGS is (was) constantly tantalized by cards that almost or could meet the requirement but were designed in just a stupid enough way to be useless to him.
|
# ? Feb 16, 2017 09:25 |
|
Anime Schoolgirl posted:I no longer expressly need such a card since the computer I'd like it for fried due to a 145-150v/10a power outlet I'm so happy for you, lol. My RVZ02B is like 3 inches wide (or tall) or something and the competitors are the same. So it's very likely that wherever you had your other one, this is going to fit as well. And you don't have to have a crappy gimped single-slot nonexistent card either
|
# ? Feb 16, 2017 15:46 |
|
I've noticed my computer is a bit choppy with things like window animations ever since getting my 3rd monitor I have running in 4K. The other 2 monitors are running in 1080p. It's fine when I turn off and disable the 4K display. Is this just my GTX 970 chugging? Would upgrading make that go away? Windows 10. 16GB of ram. i7 4770k EDIT: I'm an idiot. The 4K display had reset to 30Hz for some reason. It's fine now at 60Hz 89 fucked around with this message at 19:32 on Feb 16, 2017 |
# ? Feb 16, 2017 19:21 |
|
I am trying to play Rise of the Tomb Raider at 2880x1800 (roughly equivalent to UW1440p) with the High Preset except for Textures set to Very High on my 1060-6GB. It looks great. Framerates (~40 FPS) are acceptable to me because I'm using it for steam in home streaming. The only problem is that the game keeps crashing with a message that it ran out of RAM. My OSD sometimes shows frame buffer usage up to 5.8GB. Is 6GB really not enough at this resolution?
|
# ? Feb 16, 2017 23:52 |
|
RAM is your local memory, so either it has some sort of memory leak and it's eating up your full RAM amount + swap file, or your swap file is off and you still don't have enough RAM for all it's trying to load and use at the time. Might be worth googling, as threads like these talk about how using Very High textures most definitely goes over the 6GB VRAM limit and may be eating into your physical RAM as well, causing your issue. What are you running for RAM again? And is your swap file on? The only two games I've had issues like this with before were Sid Meier's Railroads and Crysis Warhead running the Living Legends Mechwarrior mod. You could watch the memory usage increase nonstop until it crashed.
|
# ? Feb 17, 2017 00:21 |
|
Are you using DX11 or DX12 mode? When I played it back on launch DX12 was notably less stable for me and didn't appear to have any dramatic image quality improvements.
|
# ? Feb 17, 2017 00:34 |
|
FaustianQ posted:Where is anime school girl? MSI just released a low profile single slot design! That would be neat to put in a case like this: https://www.aliexpress.com/item/Des.../753785717.html or this: https://www.aliexpress.com/item/CEMO-1001-HTPC-ITX-Mini-case-USB3-0-3-5-HDD-3-PCI-slots-all-aluminum/32340430970.html KingEup fucked around with this message at 03:44 on Feb 17, 2017 |
# ? Feb 17, 2017 03:27 |
|
craig588 posted:Are you using DX11 or DX12 mode? When I played it back on launch DX12 was notably less stable for me and didn't appear to have any dramatic image quality improvements. DX11 for the same reasons you mentioned. DX12 also seems to double the steam streaming latency. EdEddnEddy posted:RAM is your local memory so either it has some sort of memory leak and its eating up your full Ram amount + swap file, or your swap file is off and you still don't have enough ram for all it's trying to load and use at the time. Swap file is on & automatic, it's running in a VM and I had 8GB assigned to it. I bumped it up to 12GB RAM and the crashes stopped. Looks like the "Very High" texture setting really does use more than the 6GB VRAM of my card and starts swapping into RAM (failing that perhaps even swapping textures onto the SSD). I've set textures to "high" now and things have settled at ~4GB VRAM usage. Note to self: Buy 8GB+ card next generation
|
# ? Feb 17, 2017 08:02 |
|
eames posted:DX11 for the same reasons you mentioned. DX12 also seems to double the steam streaming latency. Is there some cool reason you're running games in a VM?
|
# ? Feb 17, 2017 08:40 |
|
DrDork posted:For the most part, GPUs aren't smart enough to notice there's a power limitation and, when they go to draw whatever their max power is during use, will simply overload the PSU (which will then either shutdown to protect itself, or run higher than spec, likely killing it at some point). The thing is that this would be one of the 1050s, though, and it would be drawing all of its power from the PCIe slot. While some aftermarket 1050s do come with a 6-pin connector, they're not supposed to need one, and indeed that's one of the reasons I like this series: they're very easy to drop into any prebuilt and turn it into a decent gaming system. As I mentioned, this would also be one of the half-height 1050s, none of which (there are three that I know of: MSI, Gigabyte, and Palit) have the 6-pin connectors. Ironically, if there were a half-height card with a power connector, that would obviate the need for this whole exercise. So that's my main concern in this situation: overloading a 25 (or 40?) W slot with a 75 W card à la RX 480.
|
# ? Feb 17, 2017 08:51 |
|
Phosphine posted:Is there some cool reason you're running games in a VM? Not really, I'm just using my always on headless linux NAS to stream games to my rMBP at full resolution. It only cost me the price of the GPU (~$300) instead of the ~$1500 for a 1060 based notebook, plus the laptop stays cool when gaming and I prefer macOS without rebooting.
|
# ? Feb 17, 2017 08:52 |
|
Since I got my 3440x1440 100Hz monitor, I've noticed that videos played on one of the other monitors (1920x1200@60Hz and an old Dell 1680x1050@60Hz) have very noticeable tearing while I'm playing a game on the main screen. It doesn't matter if the game is demanding or not; even playing fairly simple games like, say, Pixeljunk Shooter in a 1920x1080 window on the big screen results in tearing of video content, be it in Media Player Classic or YouTube in a browser. I'm currently still using a 970 from MSI (come oooooon Nvidia, 1080ti!). Is this a known issue?
|
# ? Feb 17, 2017 09:52 |
|
Video + game + different refresh rates (and especially gsync I hear) cause all sorts of issues, yes. People have reported anything from stuttery video to stuttery game.
|
# ? Feb 17, 2017 09:54 |
|
repiv posted:https://twitter.com/nvidia/status/831908618232422401 My X34 hungers for this 1080Ti. Apparently they will launch late March. I don't really need to upgrade but I want to dump my 1070 into an external eGPU for CUDA related work. Rabid Snake fucked around with this message at 10:45 on Feb 17, 2017 |
# ? Feb 17, 2017 10:38 |
|
mcbexx posted:Since I got my 3440x1440 100hz monitor, I noticed that videos played on one of the other monitors (1920x1200@60hz and an old Dell 1680x1050@60Hz) has very noticable tearing when I am playing a game on the main screen. It doesn't matter if the game is demanding or not, even playing fairly simple games like, say Pixeljunk Shooter in a 1920x1080 window on the big screen results in tearing of video content, be it in Media Player Classic or Youtube in a browser. I haven't noticed any tearing. Using a 1070; Acer X34 (3440x1440p @ 100hz) playing CIV 6 with my Acer XB271HU (1440p @ 165HZ) playing Netflix or using VLC with movie files. I tried to recreate it again with a secondary non gsync monitor instead of my Acer XB271HU (Dell UltraSharp U2715H 1440p @ 60Hz) and I still see no tearing. Rabid Snake fucked around with this message at 10:46 on Feb 17, 2017 |
# ? Feb 17, 2017 10:42 |
|
Yeah I game all the time on a 1440p 144Hz gsync monitor while watching video (streaming and local playback both) on a secondary 1440p 60Hz, and have never had any problems at all. This is with a 1080 card, primary monitor is display port and secondary is dual link dvi.
|
# ? Feb 17, 2017 15:01 |
|
Rabid Snake posted:I haven't noticed any tearing. I have. 1080 + X34 + 2x generic 60Hz 1440p's, when I game on the X34 and have Plex running on one of the side monitors, there's noticeable tearing. Not enough to make me stop watching, but it's certainly there. I've yet to see if turning GSync off or futzing between fullscreen and windowed modes makes a difference.
|
# ? Feb 17, 2017 15:03 |
|
Why does it appear that the 970 is at fault here as I have seen that card brought up with this issue more than once?
|
# ? Feb 17, 2017 17:10 |
|
This seems relevant: https://www.humblebundle.com/pc-lovers-software-bundle Humble Bundle with the GPU thread's favorite videogame: 3DMark. You can give all the money to charity instead of 3DMark if it makes you feel better.
|
# ? Feb 17, 2017 18:29 |
|
Don't buy benchmarks, benchmark the games you want to run. Chasing numbers just leads to chasing more numbers.
|
# ? Feb 17, 2017 18:36 |
|
|
|
craig588 posted:Don't buy benchmarks, benchmark the games you want to run. Chasing numbers just leads to chasing more numbers. 3DMark is excellent for verifying card (and system) functionality and is one of the easiest forms of diagnostics if you suspect issues. Whether that's worth money to someone is another question (the demo essentially does the same thing but forces you to watch the demo), but I don't think anybody here is chasing numbers in 3DMark. I've used it quite a bit and I couldn't even guess what my scores are.
|
# ? Feb 17, 2017 19:58 |