Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

beejay posted:

The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle.

Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Jan posted:

Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance.

They're using a blur algorithm that normally needs to be rendered on a scene by scene basis, in realtime. True motion blur is an amazing effect. Crysis did motion blur really well. It'll basically make 20FPS feel smoother than 60 when implemented properly.

veedubfreak
Apr 2, 2005

by Smythe

DiggityDoink posted:

I like how this has pretty much combined with the OC thread since Haswell has its low headroom.

Seriously, less threads to check.

Speaking of, I'd still like to see your full build when you're done veedub, I don't think you put up pics of it yet.

I have a few pics up of the rig, I'll have new pics up later. I ended up moving the power supply to the other side of the machine and rewiring a lot of poo poo for better hiding. And now for story time.

No matter how loving 100% confident you are about your tubing, ALWAYS USE PROPER LEAK TESTING. In the past I have been known to fire my rig up without proper testing. Last night I put the third card in and plugged everything in. Then I realized that was a bad idea, so I unplugged everything and did a proper leak test. Apparently I forgot to tighten down the plug on the new card. As soon as the cards got pressure, water went errywhere. So I pulled everything apart and let it dry out overnight. While watching the Broncos romp over Brady I figured out where I went wrong and redid everything. A couple hours of leak testing again and all is good.

But let this be a lesson, no matter how sure you are of your skills, take the time to properly double and triple check your poo poo. Anyhoo.

Only pictures I apparently took. I'll get some more pics of my proper clean wiring later.




My next stupid project is going to be rigid tubing because I'm a god drat glutton for punishment.

Also, let me know what kinds of free benchmarks you'd like to see run on this. ATM the processor is only running at 4300 because I haven't bothered to push it due to, well... the lack of need. But I haven't run any benches since the days when Vantage was the big thing.

veedubfreak fucked around with this message at 04:26 on Jan 20, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I have a question, do you like it? Does your setup play games well? That'd be cool.

Straker
Nov 10, 2005

Stanley Pain posted:

They're using a blur algorithm that normally needs to be rendered on a scene by scene basis, in realtime. True motion blur is an amazing effect. Crysis did motion blur really well. It'll basically make 20FPS feel smoother than 60 when implemented properly.
Yeah, I immediately disable motion blur because most games don't implement it very nicely at all. By definition, you basically need to know the current frame and the next frame to properly implement motion blur, which is why it looks acceptable on TV/movies. You could fake it if you could do it really loving quickly with low overhead (or I guess with good "prediction", since the engine obviously has a pretty good idea of what's going to happen in the next few ms), and it seems like a lot of people forget that. I'm honestly kind of sad motion blur is such a thing in video games; the way most games do it, it's like implementing antialiasing by just blurring everything.
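
As a rough illustration of the velocity-buffer approach described above (not the implementation of any particular game or engine), a post-process blur boils down to averaging each pixel along a per-pixel motion vector the renderer already has. The function and array names below are made up for the sketch, written in Python/NumPy for readability:

code:
import numpy as np

def motion_blur(frame, velocity, samples=8):
    """Post-process motion blur sketch: average each pixel with samples
    taken along its screen-space velocity vector (velocity-buffer blur).

    frame    -- H x W x 3 float array, the rendered colour buffer
    velocity -- H x W x 2 float array, per-pixel motion in pixels/frame
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(frame)
    for i in range(samples):
        t = i / (samples - 1) - 0.5        # sample from -0.5 to +0.5 along the vector
        sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
        out += frame[sy, sx]
    return out / samples

The real thing runs as a shader with a handful of texture fetches per pixel, which is why the per-frame cost is normally expected to be on the order of a millisecond or two rather than anything like 100ms.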

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I'll agree that anything resembling a full-screen shader version of quincunx antialiasing looks like garbage, but I've also got to agree with Jan: I can't see any possible way that a motion blur implementation could take 100ms and then, with an expected theoretical improvement ceiling of 10%-20% or so, magically work fine. It just doesn't make sense. 100ms is an unbelievably huge render time cost; it's a shitload of frames, a full tenth of a second. Even if something improved performance by the full 20%, that's still 80ms, which is also unbelievably high. Something really oddball is going on there if our read on what they're actually claiming is accurate.

I mean, when it comes to rendering, we're talking about squeezing every possible angle to stay within the frame budget. It's a big deal for any particular pass to go beyond 1.5-2ms. Shipping an implementation that throws all logic out the window, but then brings it back in time for a de facto proprietary graphics API to enable it, just does not make sense.
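
For reference, the frame-budget arithmetic behind that argument, using only the round numbers already quoted in the thread (a quick Python back-of-the-envelope, nothing more):

code:
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):5.2f} ms per frame")

claimed_cost_ms = 100.0                    # apparent hit attributed to motion blur
after_full_gain = claimed_cost_ms * 0.8    # even with the full ~20% improvement
print(f"100 ms is {claimed_cost_ms / frame_budget_ms(60):.0f} whole 60 FPS frames;")
print(f"a 20% gain still leaves {after_full_gain:.0f} ms, against a typical")
print("per-pass budget of roughly 1.5-2 ms.")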

EoRaptor
Sep 13, 2003

by Fluffdaddy

Agreed posted:

I have a question, do you like it? Does your setup play games well? That'd be cool.

veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in.


I also play this game

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

EoRaptor posted:

veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in.

I also play this game

Did :pgi: ever get DX11 support out for full release yet? If not (of course not), when? :haw:

Nalin
Sep 29, 2007

Hair Elf

Sidesaddle Cavalry posted:

Did :pgi: ever get DX11 support out for full release yet? If not (of course not), when? :haw:
They claim DX11 is done, but they won't release it until after UI 2.0 is done, much like every other feature they have promised.

In GPU news, there was an Anandtech article a few days ago about the AMD Kaveri APU's documentation showing that AMD was considering a GDDR5 option. Given that the PS4 uses GDDR5 memory, and AMD is experimenting with it in their PC APUs, could we see laptops with dual DDR3/GDDR5 memory in the future? Would it even be worth it for a low end gaming solution?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
One thing I'm surprised hasn't really filtered into the motion blur discussion is non-post-processed blur. Way back in 2010, LucasArts of all studios demo'd an interpolation add-on to The Force Unleashed 2 that rendered 30 real FPS but interpolated up to 60 FPS. The shipping game was 30 FPS + motion blur, on consoles at least.

Artifacting from the interpolation was kept to a minimum because the engine already knew where the geometry would be in the next frame, so the interframe was calculated from the previous full frame using that data (as both a depth map and a velocity map). The method even slightly reduced lag, because the interframe was more up to date than the real frame it was based on. The only catch was that some shader effects would have to be rewritten (so, e.g., the tech demo didn't have lightsaber effects).

Did this technique just get dropped or what? Because it seems like it'd have a good application on, e.g., an AMD APU or an Xbone to preserve higher details per frame but also get higher motion fidelity. Or use it to upscale 60 FPS to 120 FPS.
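
As a rough illustration of the reprojection idea described above (not the actual Force Unleashed 2 implementation), the core trick is to push the pixels of the last real frame part-way along their known velocity vectors to synthesize the in-between frame. The names and the hole-handling shortcut below are simplifications:

code:
import numpy as np

def extrapolate_interframe(frame, velocity, fraction=0.5):
    """Forward-reproject the last real frame into a synthetic in-between frame.

    frame    -- H x W x 3 float array, last fully rendered frame
    velocity -- H x W x 2 float array, per-pixel motion in pixels/frame
    fraction -- how far towards the next real frame to push the pixels
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    tx = np.clip(xs + velocity[..., 0] * fraction, 0, w - 1).astype(int)
    ty = np.clip(ys + velocity[..., 1] * fraction, 0, h - 1).astype(int)
    out = np.zeros_like(frame)
    out[ty, tx] = frame[ys, xs]   # scatter pixels to their predicted positions
    return out                    # disocclusion holes are ignored in this sketch

Displaying a frame like this half-way between two real 30 FPS frames is what gives the 60 FPS motion fidelity, and because it is extrapolated forward it can be slightly fresher than the frame it came from, which matches the lag reduction described above.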

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Nalin posted:

In GPU news, there was an Anandtech article a few days ago about the AMD Kaveri APU's documentation showing that AMD was considering a GDDR5 option. Given that the PS4 uses GDDR5 memory, and AMD is experimenting with it in their PC APUs, could we see laptops with dual DDR3/GDDR5 memory in the future? Would it even be worth it for a low end gaming solution?
This was discussed in the AMD thread: there was a standard from Elpida called "GDDR5M" that allowed SO-DIMMs containing GDDR5 to be used in DDR3 slots on supported CPUs. Elpida went bankrupt, so this product will never see the light of day, and it's likely that the PS4 will be the only AMD APU with GDDR5 memory. I wouldn't rule out mobile products with a 256-bit DDR3 memory interface, but that wouldn't work on existing sockets and motherboards.
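
For a sense of why the memory type and bus width matter so much for an APU's integrated graphics, here is a rough peak-bandwidth comparison. The bus widths and data rates below are illustrative examples only, not specs for any announced part:

code:
# Peak theoretical bandwidth = (bus width in bits / 8) * effective data rate.
configs = {
    "128-bit DDR3-2133 (typical dual-channel APU)":  (128, 2.133e9),
    "256-bit DDR3-2133 (hypothetical mobile part)":  (256, 2.133e9),
    "256-bit GDDR5 at 5.5 Gbps (PS4-class setup)":   (256, 5.5e9),
}
for name, (bus_bits, data_rate) in configs.items():
    gb_per_s = bus_bits / 8 * data_rate / 1e9
    print(f"{name}: ~{gb_per_s:.0f} GB/s")

That works out to roughly 34 GB/s versus 68 GB/s versus 176 GB/s, which is the gap a GDDR5 or wider-bus option would be trying to close for low-end gaming.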

Edit: Here's the post from Professor Science I was talking about, now that I'm home and had time to find it.

Alereon fucked around with this message at 00:54 on Jan 21, 2014

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Factory Factory posted:

Only catch was that some shader effects would have to be rewritten (so, e.g., the tech demo didn't have lightsaber effects).
that's a pretty significant caveat--anything that makes the asset creation process more taxing is going to be viewed with huge skepticism because of the potential to inflate production costs.

edit: LucasArts used to have a pretty serious engine department, Marco Salvi (I think he was either graphics or engine lead on Heavenly Sword back in the proverbial day) worked there around that time. he's now one of the advanced rendering folks at Intel, publishing lots of papers about cool tricks you can do with Gen. (and before you poo-poo that it's Intel and they don't care about graphics, Tomas Akenine-Moller is one of the guys in that department, and he wrote Real-Time Rendering, a book that literally everyone has read.)

Professor Science fucked around with this message at 00:53 on Jan 21, 2014

Supradog
Sep 1, 2004

A POOOST!?!??! YEEAAAAHHHH

Phuzun posted:


On the topic of Shadowplay: has anyone encountered an issue where the audio is not captured, or if it is captured, it's only the rear/surround audio? Figured this out: the Asus Xonar DSX has an option called GX, and that's what was causing audio to go missing in Shadowplay.

Yes, I never use that option, I've had that break audio in some poorly coded games too.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
GX is their EAX emulation which is pretty pointless these days.

veedubfreak
Apr 2, 2005

by Smythe

EoRaptor posted:

veedubfreak's secret shame is that he plays MWO, a game so terribly developed that it doesn't support SLI. This rig might be some sort of huge e-peen, but the reality is he can *only* get just the tip in.


I also play this game

To be fair I have played through about half of the BF4 campaign. It's a gorgeous game.

Duro
May 1, 2013

by Lowtax
Ok guys, an update on my NCIX debacle with the R9 270X

I wanted the MSI Hawk edition, but they don't think they're going to get anymore. They offered me a choice between a Powercolor model, an Asus model and a Sapphire model

I think there might be a few different Powercolor and Sapphire models available, but only that particular Asus model

Powercolor and Sapphire both come with BF4 but the Asus model apparently doesn't.

I can't seem to find any decent benchmarks on these, so I dunno what to choose. I was leaning towards the Asus model, but I was hoping to get BF4 with my card as well

Which should I choose?

GrizzlyCow
May 30, 2011
Powercolor doesn't look like a bad brand, but ASUS is more of a known entity (at least for me). Sapphire is kind of a bad brand with an unreliability streak going on right now. Performance-wise, there should be less than 1% between them. Anyway, the two in question are the ASUS R9 270X DirectCU II TOP and the Powercolor R9 270X PCS+.

If BF4 is important to you and you don't want to wait for the inevitable Steam deal, the Powercolor looks to be a decent deal.

Duro
May 1, 2013

by Lowtax
They also have this slightly more expensive powercolor model

Will there be much of a difference between the two?

I never knew Powercolor was so trusted; I would have thought the opposite between Powercolor and Sapphire, but I'm only basing that on the very limited knowledge I've accumulated over the years.

GrizzlyCow
May 30, 2011
I haven't run into many people who've complained about their PowerColor products, but I can't say I've met many people who have even heard of PowerColor. They seem to be a budget OEM, so there is a risk that they are using cheap parts, though that's slightly mitigated by them using their PCS+ and Devil brand names on these specific cards.

While not the greatest resource, this French retailer is one of the few who keep track of returns for specific products and publish the numbers. Sapphire does not look good on their 7850/7870 cards. It could easily have been a bad batch, but between that and the horror stories I have heard, why take the chance?

ASUS would be the most reputable brand to buy.

edit: The Devil doesn't have many reviews worth checking out, but it doesn't look like much of a step up from the PCS+ card in terms of performance or acoustics.

ninjagrips
Mar 19, 2007
I got the Dual-X Sapphire card, and it's working great so far.

Straker
Nov 10, 2005
PowerColor has been around for ages; I'm not sure why they're so small/unheard of. I think even my GF4 Ti 4400 was a PowerColor.

forbidden dialectics
Jul 26, 2005





So the PSU I got is faulty, and is being sent back. I'm back to the 750W PSU for now. For kicks, I got a Kill-A-Watt, and during 3DMark Extreme, with 1.38125V through the 780 Ti, the system was drawing peaks of around 948W at the wall. Assuming about 80% efficiency, that is right around where the overcurrent protection would trip. So yeah, no poo poo, overvolting these cards draws massive amounts of power.
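
The arithmetic behind that estimate, for anyone following along (the 80% figure is the poster's assumption, not a measured efficiency curve):

code:
wall_draw_w = 948       # peak reading from the Kill-A-Watt
efficiency = 0.80       # assumed PSU efficiency at that load
psu_rating_w = 750      # continuous DC rating of the fallback PSU

dc_load_w = wall_draw_w * efficiency   # power actually delivered to the components
print(f"Estimated DC load: {dc_load_w:.0f} W against a {psu_rating_w} W rating")
# ~758 W on a 750 W unit: right where overcurrent/overpower protection
# on a unit that isn't overbuilt could be expected to trip.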

Bat Ham
Apr 22, 2008

Bat Nan
I'm building a new computer at home to let me do more 3D and video production outside of my work office and I'd love to stick an nVidia Quadro in it. Since it's going to be my home computer, though, I'd still like to be able to run games decently on it.
I've heard that I should be able to also put a GTX in there, since the Quadro won't really handle games, but I'm finding mixed messages as to how easy it is to do this. Any advice? And if I can, are there any limitations on what nVidia cards will run well with it?

(Posted this over in the PC building thread and got directed over here for it.)

V No problem, thanks!

Bat Ham fucked around with this message at 06:11 on Jan 22, 2014

beejay
Apr 7, 2002

The guy that told you to post here was incorrect; this thread isn't for parts picking. I think the reason you didn't get any good answers there is that your post is a bit confusing: I'm just not sure what you're asking. If you are doing 3D stuff that will benefit from a Quadro, then you probably should have a different computer for gaming. Try clarifying your question and posting again in the parts picking thread; sorry to bounce you around.

Edit: Are you asking if you can have a Quadro and a gaming GTX card in your system at the same time? I don't think that is going to work and if it would, it's probably not worth the trouble. You will just have to decide whether your work can be done on a gaming card, or whether your gaming can be done on a Quadro.

beejay fucked around with this message at 06:11 on Jan 22, 2014

BurritoJustice
Oct 9, 2012

Both cards can be in a system at the same time just fine, if you have the money to want to do that. If you don't believe me, Linus does it in his video editing/workstation rig here.

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.
He's not picking parts; he's asking if he can use a Quadro and a GTX in the same machine, and I think the answer is yes. You'll probably want to get a higher-end motherboard made for dual GPUs so you can have both cards at x16. You might even be able to do a setup which uses the Quadro as a PhysX processor too, since those are math beasts, right?

edit: You might need to get a KVM switch or something if you're going to be outputting to the same monitor, but I think there might be better solutions now.

Fallows fucked around with this message at 15:27 on Jan 22, 2014

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
I'd just spend the money on a better Quadro card and game on it.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Anyone see this yet? 750ti rumor was further substantiated with a product listing:

http://www.tomshardware.com/news/nvidia-maxwell-geforce-gtx-750-ti-listing,25813.html

I'd totally be down for SLI with two of those, sounds like it could be the new performance-per-dollar king on the NVidia side if it gets a sub-$200 MSRP.

Phuzun
Jul 4, 2007

Don Lapre posted:

Id just spend the money on a better quadro card and game on it.

Unless I am misunderstanding the naming format for Quadro cards, you go from $1,799 for a 1,536-core K5000 to $4,999 for a 2,880-core K6000. You could easily afford to get the K5000 for the specialized tasks plus a GTX 780 Ti for gaming, instead of paying for a Quadro fast enough to game on.

Animal
Apr 8, 2003

Is it possible to use the bottom card as primary in an SLI setup? Makes no sense to me that the top card - which runs warm due to restricted airflow - has to be the one to run in single GPU workloads.

Fangs404
Dec 20, 2004

I time bomb.
I just recently got into cryptocurrency mining. My MSI 660Ti (the 2gb Power Edition with the Twin Frozr cooler) is chugging away using cudaminer at 260-270 khash/s. The problem is that if I leave my computer idle for just a few minutes, the core and load throttle down, really slowing down the number of khash/s I can get. I'm on Windows 7 Pro. I don't want my GPU to throttle down when I'm idle. The second I come back and wake the monitors up (the monitors are set to turn off after 5 mins of inactivity), it throttles back up. Do you guys know what setting causes this? I looked in the NVIDIA control panel and didn't see anything obvious.

[edit]
Well, I thought it was inactivity. I just left the computer idle for about 10 mins, and it's still going full blast. I'm leaving GPU-Z's logging going, so that'll hopefully help me determine when it throttles.

[edit2]
It just happened. Here are some relevant lines from GPU-Z's log file:

code:
Date,GPU Core Clock [MHz],GPU Memory Clock [MHz],GPU Temperature [°C],Fan Speed (%) [%],Fan Speed (RPM) [RPM],Memory Used [MB],GPU Load [%],Memory Controller Load [%],Video Engine Load [%],Power Consumption [% TDP],VDDC [V]
1/22/2014 9:35,1136.6,1502.3,80,48,2280,616,92,61,0,91.1,1.15
1/22/2014 9:35,1149.7,1502.3,80,47,2280,616,89,60,0,90.1,1.162
1/22/2014 9:35,1149.7,1502.3,81,48,2250,618,89,60,0,90.2,1.162
1/22/2014 9:35,1149.7,1502.3,75,48,2250,559,18,13,0,43.1,1.162
1/22/2014 9:35,1149.7,1502.3,74,47,2280,559,23,16,0,45.8,1.162
1/22/2014 9:35,1149.7,1502.3,74,47,2340,554,23,16,0,43.7,1.162
1/22/2014 9:35,1019,1502.3,72,46,2220,554,23,14,0,36.1,1.05
1/22/2014 9:35,1019,1502.3,72,46,2220,553,25,15,0,35.3,1.05
1/22/2014 9:35,1019,1502.3,71,45,2160,553,23,14,0,36.9,1.05
1/22/2014 9:35,1019,1502.3,71,45,2160,553,23,14,0,34.8,1.05
1/22/2014 9:35,1019,1502.3,70,44,2100,554,23,14,0,36.2,1.05
And then it picked up again randomly:

code:
1/22/2014 9:38,1019,1502.3,59,39,1620,660,24,15,0,36.1,1.062
1/22/2014 9:38,1019,1502.3,59,39,1590,688,25,16,0,36.6,1.062
1/22/2014 9:38,1019,1502.3,59,39,1620,673,26,17,0,34.6,1.062
1/22/2014 9:38,1019,1502.3,59,38,1590,643,29,18,0,34.9,1.062
1/22/2014 9:38,1019,1502.3,59,39,1590,649,22,14,0,36.8,1.062
1/22/2014 9:38,1019,1502.3,59,38,1560,648,30,19,0,35.3,1.062
1/22/2014 9:38,1162.7,1502.3,64,40,1590,661,68,46,0,73.7,1.175
1/22/2014 9:38,1162.7,1502.3,65,41,1560,655,67,46,0,73.5,1.175
1/22/2014 9:38,1162.7,1502.3,65,41,1620,655,70,48,0,74.6,1.175
1/22/2014 9:38,1162.7,1502.3,65,41,1710,654,66,45,0,73.5,1.175
1/22/2014 9:38,1162.7,1502.3,65,41,1770,654,70,49,0,74.7,1.175
I'm having a hard time figuring out what's causing it to throttle. It looks like when the temp hits 81, the load drops, which causes the clock to drop. But I'm not sure why it picks back up again - maybe because the load increases? Could this be cudaminer throttling itself based on temp, or is this an NVIDIA or Windows setting maybe?

I'm going to leave GPU-Z logging all day to see what happens while I'm at work.

Fangs404 fucked around with this message at 17:59 on Jan 22, 2014
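
If it helps with that kind of digging, here is a small sketch that reads a GPU-Z sensor log like the ones above and flags the samples where the core clock or GPU load falls off a cliff. The file name and thresholds are placeholders, and the column names are copied from the log posted above (real GPU-Z exports sometimes pad the headers with spaces, hence the skipinitialspace):

code:
import csv

LOG_FILE = "gpuz_sensor_log.txt"   # placeholder path to the exported log
CLOCK_DROP_MHZ = 100               # flag a core clock fall bigger than this
LOAD_DROP_PCT = 40                 # or a GPU load fall bigger than this

prev = None
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f, skipinitialspace=True):
        clock = float(row["GPU Core Clock [MHz]"])
        load = float(row["GPU Load [%]"])
        temp = float(row["GPU Temperature [°C]"])
        if prev and (prev["clock"] - clock > CLOCK_DROP_MHZ
                     or prev["load"] - load > LOAD_DROP_PCT):
            print(f"{row['Date']}: clock {prev['clock']:.0f} -> {clock:.0f} MHz, "
                  f"load {prev['load']:.0f} -> {load:.0f}%, temp {temp:.0f} C")
        prev = {"clock": clock, "load": load}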

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Unless your power is literally free, I don't think it would be worth mining on an NVIDIA GPU, honestly.

vv Are there any settings in cudaminer? I know in cgminer it can manage clock rates, fan speeds, throttle based on temp etc. all from inside the miner.

I would guess that there's something deep in the driver that pushes the thing into a lower power mode when the monitor output is turned off. I would definitely leave the monitors turned on in software, and just hit their power buttons instead, for now at least.

I run cgminer on a 280X and leave the monitor output on, because even though I never tried, I figured it would do something bad if the monitor output was off. I just don't have a monitor plugged in at all :v:

HalloKitty fucked around with this message at 18:05 on Jan 22, 2014
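
To put rough numbers on the "unless your power is literally free" point: the electricity side of the equation is easy to estimate, while the revenue side depends on difficulty and exchange rates. The wattage below is an assumed ballpark for a 660 Ti under mining load and the electricity price is a placeholder; only the hash rate comes from the posts above:

code:
card_watts = 150          # assumed load draw of a 660 Ti while mining
price_per_kwh = 0.12      # placeholder electricity price, USD per kWh
hash_rate_khs = 265       # ~260-270 khash/s reported above

kwh_per_day = card_watts / 1000 * 24
cost_per_day = kwh_per_day * price_per_kwh
print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day in electricity")
# Whatever 265 khash/s earns per day (per a mining calculator, given current
# difficulty and prices) has to beat this figure before the card makes anything.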

Fangs404
Dec 20, 2004

I time bomb.

HalloKitty posted:

Unless your power is literally free, I don't think it would be worth mining on an NVIDIA GPU, honestly.

I know AMD cards are better, but I'm just playing around with cryptocurrency. I just want to solve this throttling problem.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
You can use the multi monitor power saver settings in nvinspector to manually set power state thresholds

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Well, Litecoin miners have broken me; I couldn't wait for an R9 280X any longer. I ended up getting a Zotac GTX 770 that was on sale for $299.99 on Amazon today; this will be the first nVidia card I've owned since a GeForce 2 MX PCI card. From what I've read here about Zotac being a decent brand and nVidia's Greenlight program, I should feel confident in this choice.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Beautiful Ninja posted:

Well, Litecoin miners have broken me; I couldn't wait for an R9 280X any longer.

The funny thing is, in the UK, a lot of major sites are out of stock, but oddly, Overclockers has quite a few in stock at the right price - that's almost exactly 300 USD + 20% tax, totally fair pricing. I bought one of these for £30 more in November.

Obviously that doesn't help you at all.

HalloKitty fucked around with this message at 20:34 on Jan 22, 2014

Schiavona
Oct 8, 2008

Beautiful Ninja posted:

Well, Litecoin miners have broken me; I couldn't wait for an R9 280X any longer. I ended up getting a Zotac GTX 770 that was on sale for $299.99 on Amazon today; this will be the first nVidia card I've owned since a GeForce 2 MX PCI card. From what I've read here about Zotac being a decent brand and nVidia's Greenlight program, I should feel confident in this choice.

I've been debating doing the same thing, but is there any info on the performance gains of the 800 series that reportedly could come out in the next few months? I'm running a 560 Ti-448 now, and while it's not bad, it's showing its age a bit. If there's likely to be an 860/860 Ti at the $250 price point, I have no issue waiting.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
Yeah, and NewEgg keeps doing poo poo like this:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Zero VGS posted:

I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought.

MSRP at launch for the 290X was $550. I've heard rumors that AMD has actually increased the MSRP by $100 recently, in part because of miners buying everything and in part because their parts were underpriced in comparison to nVidia's performance-equivalent parts at the high end.
