|
beejay posted:The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle. Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance.
|
# ? Jan 19, 2014 21:43 |
|
|
Jan posted:Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance. They're using a blur algorithm that normally needs to be rendered on a scene by scene basis, in realtime. True motion blur is an amazing effect. Crysis did motion blur really well. It'll basically make 20FPS feel smoother than 60 when implemented properly.
|
# ? Jan 19, 2014 23:21 |
|
DiggityDoink posted:I like how this has pretty much combined with the OC thread since Haswell has its low headroom. I have a few pics up of the rig, I'll have new pics up later. I ended up moving the power supply to the other side of the machine and rewiring a lot of poo poo for better hiding. And now for story time. No matter how loving 100% confident you are about your tubing: ALWAYS USE PROPER LEAK TESTING. In the past I have been known to fire my rig up without proper testing. Last night I put the third card in and plugged everything in. Then I realized: bad idea. So I unplugged everything and did a proper leak test. Apparently I forgot to tighten down the plug on the new card. As soon as the cards got pressure, water went errywhere. So I pulled everything apart and let it dry out overnight. While watching the Broncos romp over Brady I figured out where I went wrong and redid everything. A couple hours of leak testing again and all is good. But let this be a lesson: no matter how sure you are of your skills, take the time to properly double and triple check your poo poo. Anyhoo. Only pictures I apparently took. I'll get some more pics of my proper clean wiring later. My next stupid project is going to be rigid tubing because I'm a god drat glutton for punishment. Also, let me know what kinds of free benchmarks you'd like to see run on this. ATM the processor is only running at 4300 because I haven't bothered to push it due to, well... the lack of need. But I haven't run any benches since back when Vantage was the big thing. veedubfreak fucked around with this message at 04:26 on Jan 20, 2014 |
# ? Jan 20, 2014 04:23 |
|
I have a question, do you like it? Does your setup play games well? That'd be cool.
|
# ? Jan 20, 2014 09:55 |
Stanley Pain posted:They're using a blur algorithm that normally needs to be rendered on a scene by scene basis, in realtime. True motion blur is an amazing effect. Crysis did motion blur really well. It'll basically make 20FPS feel smoother than 60 when implemented properly.
|
|
# ? Jan 20, 2014 11:10 |
|
I'll agree that anything resembling a full-screen shader version of quincunx antialiasing looks like garbage, but I've also got to agree with Jan: I can't see any possible way that a motion blur implementation could take 100ms and then, with an expected theoretical improvement ceiling of 10%-20% or so, magically work fine. It just doesn't make sense. 100ms is an unbelievably huge render time cost; it's a shitload of frames, a full tenth of a second. Even if something improved performance by the full 20%, that's still 80ms, which is also unbelievably high, and something really oddball is going on there if our read on what they're actually claiming is accurate. I mean, when it comes to rendering, we're talking about squeezing every possible angle to stay within the frame budget. It's a big deal to go beyond 1.5-2ms for any particular pass. Positing an implementation that throws all logic out the window, but then brings it back in time for a de facto proprietary graphics API to enable it, just does not make sense.
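To put numbers on the budget argument, here's a quick back-of-the-envelope sketch (the 100ms cost, 20% ceiling, and ~2ms per-pass budget are the figures from this discussion, not measurements):

```python
# Rough frame-budget arithmetic for the claimed 100ms motion blur cost.
# All numbers are illustrative, taken from the discussion above.

def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available for an entire frame at a given FPS target."""
    return 1000.0 / target_fps

budget = frame_budget_ms(60)          # ~16.7 ms per frame at 60 FPS
blur_cost_ms = 100.0                  # the apparent motion blur cost
improved = blur_cost_ms * (1 - 0.20)  # even a full 20% gain leaves 80 ms

# A 100 ms pass would eat six entire 60 FPS frames by itself,
# versus the ~1.5-2 ms you'd normally budget for a single pass.
frames_consumed = blur_cost_ms / budget
print(budget, improved, frames_consumed)
```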
|
# ? Jan 20, 2014 11:17 |
|
Agreed posted:I have a question, do you like it? Does your setup play games well? That'd be cool. veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in. I also play this game
|
# ? Jan 20, 2014 19:22 |
|
EoRaptor posted:veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in. Did they ever get DX11 support out for full release yet? If not (of course not), when?
|
# ? Jan 20, 2014 19:30 |
|
Sidesaddle Cavalry posted:Did they ever get DX11 support out for full release yet? If not (of course not), when? In GPU news, there was an Anandtech article a few days ago about the AMD Kaveri APU's documentation showing that AMD was considering a GDDR5 option. Given that the PS4 uses GDDR5 memory, and AMD is experimenting with it in their PC APUs, could we see laptops with dual DDR3/GDDR5 memory in the future? Would it even be worth it for a low-end gaming solution?
|
# ? Jan 20, 2014 20:03 |
|
One thing I'm surprised hasn't really filtered into the motion blur discussion is non-post-processed blur. Way back in 2010, LucasArts of all studios demo'd an interpolation add-on to The Force Unleashed 2 that rendered 30 real FPS but interpolated up to 60 FPS. The shipping game was 30 FPS + motion blur, on consoles at least. Artifacting from the interpolation was kept to a minimum because you already knew where the geometry would be in the next frame, so the interframe was calculated from the previous full frame using that data (as both a depth map and a velocity map). The method even slightly reduced lag, because the interframe was more up to date than the real frame it was based off of. Only catch was that some shader effects would have to be rewritten (so, e.g., the tech demo didn't have lightsaber effects). Did this technique just get dropped or what? Because it seems like it'd have a good application on, e.g., an AMD APU or an Xbone to preserve higher detail per frame but also get higher motion fidelity. Or use it to upscale 60 FPS to 120 FPS.
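A minimal sketch of that velocity-map reprojection idea (my own illustration in NumPy, not the LucasArts implementation; a real renderer would do this in a shader and use the depth map to resolve which pixel wins when two land on the same spot):

```python
import numpy as np

def synthesize_interframe(frame, velocity, t=0.5):
    """Forward-reproject a rendered frame along per-pixel screen-space
    velocities to fake the frame partway toward the next one.

    frame:    (H, W, 3) float color buffer
    velocity: (H, W, 2) per-pixel motion in pixels per frame (x, y)
    t:        how far toward the next frame to advect (0.5 = midpoint)
    """
    h, w = frame.shape[:2]
    out = frame.copy()  # holes fall back to the source frame (ghosting)
    ys, xs = np.mgrid[0:h, 0:w]
    dx = np.clip(np.round(xs + velocity[..., 0] * t).astype(int), 0, w - 1)
    dy = np.clip(np.round(ys + velocity[..., 1] * t).astype(int), 0, h - 1)
    # Scatter each pixel to its advected position. Last write wins here;
    # a real implementation would depth-test the writes instead.
    out[dy, dx] = frame[ys, xs]
    return out
```

With zero velocity this reproduces the input frame; with real velocity data the cost is a cheap reprojection instead of a full render, which is where the "free" second frame comes from.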
|
# ? Jan 20, 2014 20:04 |
|
Nalin posted:In GPU news, there was an Anandtech article a few days ago about the AMD Kaveri APU's documentation showing that AMD was considering a GDDR5 option. Given that the PS4 uses GDDR5 memory, and AMD is experimenting with it in their PC APUs, could we see laptops with dual DDR3/GDDR5 memory in the future? Would it even be worth it for a low end gaming solution? Edit: Here's the post from Professor Science I was talking about, now that I'm home and had time to find it. Alereon fucked around with this message at 00:54 on Jan 21, 2014 |
# ? Jan 20, 2014 23:16 |
|
Factory Factory posted:Only catch was that some shader effects would have to be rewritten (so, e.g., the tech demo didn't have lightsaber effects). edit: LucasArts used to have a pretty serious engine department, Marco Salvi (I think he was either graphics or engine lead on Heavenly Sword back in the proverbial day) worked there around that time. he's now one of the advanced rendering folks at Intel, publishing lots of papers about cool tricks you can do with Gen. (and before you poo-poo that it's Intel and they don't care about graphics, Tomas Akenine-Moller is one of the guys in that department, and he wrote Real-Time Rendering, a book that literally everyone has read.) Professor Science fucked around with this message at 00:53 on Jan 21, 2014 |
# ? Jan 21, 2014 00:45 |
|
Phuzun posted:
Yes, I never use that option; I've had it break audio in some poorly coded games too.
|
# ? Jan 21, 2014 12:46 |
|
GX is their EAX emulation, which is pretty pointless these days.
|
# ? Jan 21, 2014 15:54 |
|
EoRaptor posted:veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in. To be fair I have played through about half of the BF4 campaign. It's a gorgeous game.
|
# ? Jan 21, 2014 20:16 |
|
Ok guys, an update on my NCIX debacle with the R9 270X. I wanted the MSI Hawk edition, but they don't think they're going to get any more. They offered me a choice between a Powercolor model, an Asus model, and a Sapphire model. I think there might be a few different Powercolor and Sapphire models available, but only that particular Asus model. Powercolor and Sapphire both come with BF4, but the Asus model apparently doesn't. I can't seem to find any decent benchmarks on these, so I dunno what to choose. I was leaning towards the Asus model, but I was hoping to get BF4 with my card as well. Which should I choose?
|
# ? Jan 21, 2014 22:14 |
|
Powercolor doesn't look like a bad brand, but ASUS is more of a known entity (at least for me). Sapphire is kind of a bad brand with an unreliability streak going on right now. Performance-wise, there should be less than 1% between them. Anyway, the options are the ASUS R9 270X DirectCU II TOP and the Powercolor R9 270X PCS+. If BF4 is important to you and you don't want to wait for the inevitable Steam deal, the Powercolor looks to be a decent deal.
|
# ? Jan 21, 2014 22:31 |
|
They also have this slightly more expensive Powercolor model. Will there be much of a difference between the two? I never knew Powercolor was so trusted; I would have thought the opposite between Powercolor and Sapphire, but I'm only basing that on the very limited knowledge I've accumulated over the years.
|
# ? Jan 21, 2014 22:42 |
|
I haven't run into many people who've complained about their PowerColor products, but I can't say I've met many people who have even heard of PowerColor. They seem to be a budget OEM, so there is a risk that they are using cheap parts, slightly mitigated by them using their PCS+ and Devil brand names on these specific cards. While not the greatest resource, this French retailer is one of the few who keep track of returns for specific products and publish them. Sapphire does not look good for their 7850/7870 cards. It could easily have been a bad batch, but between that and the horror stories I have heard, why take the chance? ASUS would be the most reputable brand to buy. edit: The Devil doesn't have many reviews worth checking out, but it doesn't look like much of a step up from the PCS+ card in terms of performance or acoustics.
|
# ? Jan 21, 2014 23:14 |
|
I got the Dual-X Sapphire card, and it's working great so far.
|
# ? Jan 22, 2014 02:23 |
PowerColor has been around for ages; I'm not sure why they're so small/unheard of. I think my GF4 Ti 4400 was a PowerColor, even.
|
|
# ? Jan 22, 2014 02:35 |
|
So the PSU I got is faulty, and is being sent back. I'm back to the 750W PSU for now. For kicks, I got a Kill-A-Watt, and during 3dmark extreme, with 1.38125v through the 780Ti, the system was drawing peaks at around 948W from the wall. Assuming about 80% efficiency, that is right around where the overcurrent protection would trip. So yeah, no poo poo, overvolting these cards draws massive amounts of power.
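Sanity-checking that arithmetic as a quick sketch (the flat 80% efficiency figure is the poster's assumption; real PSU efficiency varies with load):

```python
def dc_load_watts(wall_watts, efficiency=0.80):
    """Estimate the DC-side load a PSU is delivering from its wall draw:
    wall draw * efficiency = power actually delivered to the components."""
    return wall_watts * efficiency

load = dc_load_watts(948)  # ~758 W delivered to the system
# That is right at (slightly past) a 750 W unit's rating, so tripping
# the overcurrent protection under an overvolted benchmark is plausible.
print(load)
```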
|
# ? Jan 22, 2014 03:03 |
|
I'm building a new computer at home to let me do more 3D and video production outside of my work office, and I'd love to stick an nVidia Quadro in it. Since it's going to be my home computer, though, I'd still like to be able to run games decently on it. I've heard that I should be able to also put a GTX in there since the Quadro won't really handle games, but I'm finding mixed messages as to how easy it is to do this. Any advice? And if I can, are there any limitations on what nVidia cards will run well with it? (Posted this over in the PC building thread and got directed over here for it.) V No problem, thanks! Bat Ham fucked around with this message at 06:11 on Jan 22, 2014 |
# ? Jan 22, 2014 05:54 |
|
The guy that told you to post here was incorrect, this thread isn't for parts picking. I think the reason you didn't get any good answers there is that your post is a bit confusing. I'm not sure what you're asking. If you are doing 3D stuff that will benefit from a Quadro then you probably should have a different computer for gaming. I am just not sure what you're asking. Try clarifying your question and posting again in the parts picking thread, sorry to bounce you around. Edit: Are you asking if you can have a Quadro and a gaming GTX card in your system at the same time? I don't think that is going to work and if it would, it's probably not worth the trouble. You will just have to decide whether your work can be done on a gaming card, or whether your gaming can be done on a Quadro. beejay fucked around with this message at 06:11 on Jan 22, 2014 |
# ? Jan 22, 2014 06:04 |
|
Both cards can be in a system at the same time just fine, if you have the money to want to do that. If you don't believe me, Linus does it in his video editing/workstation rig here.
|
# ? Jan 22, 2014 07:17 |
|
He's not picking parts, he's asking if he can use a Quadro and a GTX in the same machine, and I think the answer is yes. You'll probably want to get a higher-end motherboard made for dual GPUs so you can have both cards at x16. You might even be able to do a setup which uses the Quadro as a PhysX processor too, since those are math beasts, right? edit: You might need to get a KVM switch or something if you're going to be outputting to the same monitor, but I think there might be better solutions now. Fallows fucked around with this message at 15:27 on Jan 22, 2014 |
# ? Jan 22, 2014 15:25 |
|
I'd just spend the money on a better Quadro card and game on it.
|
# ? Jan 22, 2014 15:53 |
|
Anyone see this yet? 750ti rumor was further substantiated with a product listing: http://www.tomshardware.com/news/nvidia-maxwell-geforce-gtx-750-ti-listing,25813.html I'd totally be down for SLI with two of those, sounds like it could be the new performance-per-dollar king on the NVidia side if it gets a sub-$200 MSRP.
|
# ? Jan 22, 2014 15:59 |
|
Don Lapre posted:Id just spend the money on a better quadro card and game on it. Unless I am misunderstanding the naming format for Quadro cards, you go from $1,799 for the 1536-core K5000 to $4,999 for the 2880-core K6000. You could easily afford to get the K5000 for specialized tasks plus a GTX 780 Ti for gaming, instead of getting an equivalent Quadro to game on.
|
# ? Jan 22, 2014 16:09 |
|
Is it possible to use the bottom card as primary in an SLI setup? Makes no sense to me that the top card - which runs warm due to restricted airflow - has to be the one to run in single GPU workloads.
|
# ? Jan 22, 2014 16:19 |
|
I just recently got into cryptocurrency mining. My MSI 660Ti (the 2gb Power Edition with the Twin Frozr cooler) is chugging away using cudaminer at 260-270 khash/s. The problem is that if I leave my computer idle for just a few minutes, the core and load throttle down, really slowing down the number of khash/s I can get. I'm on Windows 7 Pro. I don't want my GPU to throttle down when I'm idle. The second I come back and wake the monitors up (the monitors are set to turn off after 5 mins of inactivity), it throttles back up. Do you guys know what setting causes this? I looked in the NVIDIA control panel and didn't see anything obvious. [edit] Well, I thought it was inactivity. I just left the computer idle for about 10 mins, and it's still going full blast. I'm leaving GPU-Z's logging going, so that'll hopefully help me determine when it throttles. [edit2] It just happened. Here are some relevant lines from GPU-Z's log file: code:
code:
I'm going to leave GPU-Z logging all day to see what happens while I'm at work. Fangs404 fucked around with this message at 17:59 on Jan 22, 2014 |
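Once a day's worth of log exists, a short script can flag exactly when the clocks drop. This is a generic sketch, not GPU-Z-specific: the column name is a placeholder you'd replace after checking the actual headers in the log file.

```python
import csv

def find_throttle_events(log_path, clock_column, threshold_mhz):
    """Scan a CSV hardware log and return (row_index, clock) pairs
    where the core clock falls below a threshold. The column name is
    whatever the logging tool writes; check the file and pass it in.
    """
    events = []
    with open(log_path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            clock = float(row[clock_column].strip())
            if clock < threshold_mhz:
                events.append((i, clock))
    return events
```

For example, something like `find_throttle_events("gpuz_log.txt", "GPU Core Clock [MHz]", 900)` — adjusting the filename and column header to match what GPU-Z actually produces — would list every logged sample where the card dropped out of its full 3D clocks.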
# ? Jan 22, 2014 17:09 |
|
Unless your power is literally free, I don't think it would be worth mining on an NVIDIA GPU, honestly. vv Are there any settings in cudaminer? I know in cgminer it can manage clock rates, fan speeds, throttle based on temp etc. all from inside the miner. I would guess that there's something deep in the driver that pushes the thing into a lower power mode when the monitor output is turned off. I would definitely leave the monitors turned on in software, and just hit their power buttons instead, for now at least. I run cgminer on a 280X and leave the monitor output on, because even though I never tried, I figured it would do something bad if the monitor output was off. I just don't have a monitor plugged in at all HalloKitty fucked around with this message at 18:05 on Jan 22, 2014 |
# ? Jan 22, 2014 17:59 |
|
HalloKitty posted:Unless your power is literally free, I don't think it would be worth mining on an NVIDIA GPU, honestly. I know
|
# ? Jan 22, 2014 18:00 |
|
You can use the multi-monitor power saver settings in nvinspector to manually set power-state thresholds.
|
# ? Jan 22, 2014 18:08 |
|
Well Litecoin miners have broke me, couldn't wait for an R9 280X any longer. Ended up getting a Zotac GTX 770 that was on sale for 299.99 off Amazon today, this will be the first nVidia card I've owned since a Geforce 2 MX PCI card. From what I've read here about Zotac being a decent brand and nVidia's Greenlight program, I should feel confident in this choice.
|
# ? Jan 22, 2014 20:24 |
|
Beautiful Ninja posted:Well Litecoin miners have broke me, couldn't wait for an R9 280X any longer. The funny thing is, in the UK, a lot of major sites are out of stock, but oddly, Overclockers has quite a few in stock at the right price - that's almost exactly 300 USD + 20% tax, totally fair pricing. I bought one of these for £30 more in November. Obviously that doesn't help you at all. HalloKitty fucked around with this message at 20:34 on Jan 22, 2014 |
# ? Jan 22, 2014 20:30 |
|
Beautiful Ninja posted:Well Litecoin miners have broke me, couldn't wait for an R9 280X any longer. Ended up getting a Zotac GTX 770 that was on sale for 299.99 off Amazon today, this will be the first nVidia card I've owned since a Geforce 2 MX PCI card. From what I've read here about Zotac being a decent brand and nVidia's Greenlight program, I should feel confident in this choice. I've been debating doing the same thing, but is there any info on the performance gains of the 800 series that reportedly could come out in the next few months? I'm running a 560ti-448 now, and while it's not bad, it's showing his age a bit. If there will most likely be an 860/860ti at the 250 price point, I have no issue waiting.
|
# ? Jan 22, 2014 21:14 |
|
Yeah, and NewEgg keeps doing poo poo like this:
|
# ? Jan 22, 2014 21:21 |
|
I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought.
|
# ? Jan 22, 2014 21:31 |
|
|
Zero VGS posted:I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought. MSRP at launch for the 290X was 550 dollars. I've heard rumors that AMD has actually increased the MSRP by 100 dollars recently, in part because of miners buying everything and in part because their parts were underpriced in comparison to nVidia's performance-equivalent parts at the high end.
|
# ? Jan 22, 2014 21:43 |