Wozbo
Jul 5, 2010

Agreed posted:

Jesus, they are stingy with buses these days. I remember when all the top end cards had at least a 512-bit bus, granted much slower RAM, but we already know Kepler is shackled to the bus from overclocking it. Cripes.

I never understood this. It's like saying an engine is 3.3 liters: wouldn't you also want to consider things like frequency and word length? Heck, I thought a 256-bit bus could saturate PCIe 2.0 x16 in SLI mode.

E: With the latest gen of GDDR, that is. Sorry if I'm being obtuse.
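For a rough sense of scale: bandwidth is basically bus width times the effective per-pin data rate, so a narrow bus with fast GDDR5 can still out-bandwidth an old 512-bit card with slow GDDR3. A quick sketch with illustrative round numbers (not exact card specs):

code:

# Rough GDDR bandwidth math: bus width is only half the story.
# bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (Gbps per pin)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(512, 2.2))  # old wide bus, slow GDDR3: ~140.8 GB/s
print(bandwidth_gbs(256, 6.0))  # 256-bit bus, fast GDDR5:  ~192.0 GB/s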

Wozbo fucked around with this message at 14:19 on Jul 28, 2012

Wozbo
Jul 5, 2010

Tunga posted:

As long as you are okay with having to run everything full screen, and there are potential microlag and other incompatibilities to consider.

Not with NVIDIA; they briefly hold each finished frame based on a rolling average of recent frame times to smooth out the pacing. ATI, though, is painful.
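Conceptually it's something like this (a toy sketch of a rolling-average frame pacer, not NVIDIA's actual driver code):

code:

import time
from collections import deque

# Toy frame pacer: hold a frame that finished early so the interval
# between presents tracks a rolling average of recent frame times.
class FramePacer:
    def __init__(self, window=30):
        self.times = deque(maxlen=window)           # recent frame times (seconds)
        self.last_present = time.perf_counter()

    def present(self, frame_time):
        self.times.append(frame_time)
        target = sum(self.times) / len(self.times)  # rolling-average interval
        wait = target - (time.perf_counter() - self.last_present)
        if wait > 0:
            time.sleep(wait)                        # frame came in early: hold it briefly
        self.last_present = time.perf_counter()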

Wozbo
Jul 5, 2010
It looks like OpenGL and Direct3D are running side by side, not being translated.

quote:

The second category would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.

Or am I reading that wrong? If that's the case, then hot drat, that's some sick performance.
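The "side by side" reading would mean something like this, structurally (a made-up sketch of an abstraction layer, as opposed to translating one API's calls into the other's):

code:

# Toy renderer abstraction: one interface, native backends underneath,
# rather than translating D3D calls into GL calls (or vice versa).
class Renderer:
    def draw_mesh(self, mesh):
        raise NotImplementedError

class GLRenderer(Renderer):
    def draw_mesh(self, mesh):
        print(f"native OpenGL path for {mesh}")    # glDrawElements etc. would live here

class D3DRenderer(Renderer):
    def draw_mesh(self, mesh):
        print(f"native Direct3D path for {mesh}")  # DrawIndexed etc. would live here

def pick_renderer(api):
    return GLRenderer() if api == "opengl" else D3DRenderer()

pick_renderer("opengl").draw_mesh("crate")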

Wozbo
Jul 5, 2010

Anti-Hero posted:

Are any of the Kepler offerings, when SLI'd, tempting enough to consider an upgrade from my 580 SLI rig? I game at 2560x1600 on a Nehalem i5.

My thinking is this: unless you're moving to something crazy like an extreme resolution change or adding two monitors, wait at least 2 to 3 card generations. I had SLI 8800 GTS cards, and they lasted me fine until I gifted the computer to family and migrated to 2 x 6870. I'm waiting for the inevitable 8xx GeForce series / 9xxx Radeon before I even think about upgrading. Not to mention, the new consoles are coming out in, what, one or two years? Very few developers will put out programs with the real eye candy, because the thing that pays the bills is COD Black Mops with the Floor Buffer mode for your ps360. That's my thinking, and it's how I'm saving my bank account from yearly upgrades :( I will say, though, the 670 really, really tempted me.

Wozbo
Jul 5, 2010
The 690 doesn't require an SLI-compliant motherboard/controller, as the SLI logic is on the card itself, if I recall correctly (please double-check this).

I'm thinking of going to team green too, but I'm trying as hard as I can to wait for Big Kepler. So tempting to get a 670 though.

Wozbo
Jul 5, 2010
Question: for these 120Hz monitors, is there a difference between running at 120Hz and running in 3D mode (aka 60Hz per eye)? Like, is it less of a burden to run 3D mode since the scene is already prepped and all you have to do is shift the view, or what? I'm really planning on mixing and matching here.
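For scale, the frame budgets involved (simple arithmetic, assuming 3D mode just alternates eyes on the same 120Hz panel):

code:

# Frame-time budgets: 120Hz flat vs 3D at 60Hz per eye.
panel_hz = 120
flat_ms = 1000 / panel_hz        # ~8.33 ms per frame at 120Hz
stereo_pair_ms = 2 * flat_ms     # ~16.67 ms to render both eye views of one scene
print(f"120Hz 2D: {flat_ms:.2f} ms per frame")
print(f"3D, 60Hz/eye: {stereo_pair_ms:.2f} ms per left+right pair (two similar views)")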

I'm thinking about moving to a nice 120Hz monitor and going to team green. I currently have 2 x 6870 and I'm considering 2 x 670. Yes, I know, waste of money, blah blah; I'm expecting crackbone to call me an idiot (hi crackbone!). Also, I really want adaptive vsync so I'm not constantly stuck at 30fps :(

Anyone want to chime in at all? I'm really only waiting for a deal to pop up before I push buy.

Wozbo
Jul 5, 2010

HalloKitty posted:

I love how they sold adaptive vsync as something new - surely it's nothing more than a framerate cap, unless I'm misunderstanding it..

Supposedly there's no tearing at all (that's the tech part of it), where other methods MIGHT tear and usually do.
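The difference from a plain framerate cap, as I understand it (a pseudocode-ish sketch of the idea, not NVIDIA's actual driver logic):

code:

# Adaptive vsync in a nutshell: sync when you can hold the refresh rate
# (no tearing, unlike a bare cap), drop sync when you fall below it
# (no hard snap down to refresh/2 like plain double-buffered vsync).
def choose_sync(current_fps, refresh_hz):
    if current_fps >= refresh_hz:
        return "vsync on"    # lock to refresh; eliminates tearing
    return "vsync off"       # run free instead of halving to 30fps on a 60Hz panel

for fps in (40, 59, 60, 144):
    print(fps, "->", choose_sync(fps, 60))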

Wozbo
Jul 5, 2010
You aren't plugging it into the super small slot up top, above the huge x16 one, right? The light brown one?

The other thing it could be: you didn't seat the card right. You might not have it fully inserted, so it's running at x2 because that's all it has full contact for.

I'd suggest unseating fully and reseating the card.

Wozbo
Jul 5, 2010

Whale Cancer posted:

We both just ran the test once, and we used the exact same settings in the test. As far as PCI bandwidth goes, I haven't changed anything; it runs at whatever the stock setting is. I know we are both running the same Nvidia driver as well.

Some thoughts:

Antivirus being a dick (check it with AV off), or any random other program asking for processing time. I'd average like 3 or 4 runs.
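Something like this (trivial, with made-up scores, but it keeps one background hiccup from skewing the comparison):

code:

from statistics import mean, stdev

# Average a few benchmark runs instead of trusting a single one.
runs = [6120, 6050, 6185, 5990]   # made-up scores from 4 runs
print(f"avg {mean(runs):.0f}, spread +/- {stdev(runs):.0f}")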

Wozbo
Jul 5, 2010
Isn't that not true for at least the latest Nvidia generation, since they all had to follow a certain baseline spec to the letter? I'm getting a Gigabyte 670 (the one with the 3 fans) because it looks to perform pretty awesomely.

Wozbo
Jul 5, 2010
That's not what I'm getting at. I'm getting at the fact that you always have to meet at LEAST <bar>, and then the card can boost up based on whatever else the vendor put in there (better cooling, etc., but it always has to be at least as good as reference). For example, I'm fairly sure the Gigabyte 670 is actually based on a reference 680 board, and it meets that baseline (meaning a pretty good boost).

At least, that's my understanding. I guess I'll make an E/N post when I get mine and it runs like dog poo poo.
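Conceptually it's something like this (a toy model of the "guaranteed baseline plus headroom-based boost" idea, with made-up numbers, not the actual GPU Boost algorithm):

code:

# Toy model: every card must hit base_mhz; how far it boosts depends on
# thermal/power headroom, which better vendor coolers and boards provide.
def effective_clock(base_mhz, max_boost_mhz, temp_c, temp_limit_c=80):
    headroom = max(0.0, (temp_limit_c - temp_c) / temp_limit_c)
    return base_mhz + (max_boost_mhz - base_mhz) * headroom

print(effective_clock(915, 1100, temp_c=60))  # cooler-running custom board: boosts past base
print(effective_clock(915, 1100, temp_c=79))  # near the thermal limit: barely above base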

Wozbo
Jul 5, 2010

Glen Goobersmooches posted:

I got the Gigabyte 670 Windforce 3, and it's been performing fantastically. I've been running it at 1060MHz core (it can go higher, but I don't have the need to push it harder atm), and GPU-Z tells me the boost ratchets up to 1250MHz during the most demanding games, so it ain't no gimp boost.

Not to mention that the VRAM on my card clocks up like nuts. I've been able to max out what the sliders allow (1502MHz -> 1915MHz). In addition to all this, it and the Asus DirectCU seem to have the most effective and quietest cooling setups in all the 670 showdowns I've read. Go figure.

Incidentally, Borderlands 2 has been running super nicely with PhysX on Medium. I'm gonna test out some gameplay on High; I'm still in the intro areas, but I was dreading major slowdown with it enabled and am pleasantly surprised so far. BL2 has no in-game MSAA options, though, just post-AA.

This is what I like to hear. If I don't dork around with the settings, will it still boost that high when it's able to?

Wozbo
Jul 5, 2010
I'm wondering if Nvidia finally stopped juggling resources for PhysX and just dedicates a flat amount to it, plus some leeway for larger workloads. Seems like that would make the most sense for now.

Wozbo
Jul 5, 2010
Heh, maybe they found something akin to the magic number for inverse square roots.
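(For anyone who missed the reference, that's the 0x5f3759df trick from Quake III's fast inverse square root. A rough Python port:)

code:

import struct

# Fast inverse square root: reinterpret the float's bits as an int,
# fudge the exponent with the magic constant, then refine with one
# Newton-Raphson step.
def fast_inv_sqrt(x):
    i = struct.unpack('>I', struct.pack('>f', x))[0]   # float bits -> int
    i = 0x5f3759df - (i >> 1)                          # the magic number
    y = struct.unpack('>f', struct.pack('>I', i))[0]   # int bits -> float
    return y * (1.5 - 0.5 * x * y * y)                 # one Newton step

print(fast_inv_sqrt(4.0))   # ~0.4992 vs the exact 0.5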

Wozbo
Jul 5, 2010
The Gigabyte GTX 670 is pretty nice. If you don't have restricted airflow, it will only ever hit 65C max and stay at about 20% fan. If you do, though, prep for 90% fan at 90C stable (FurMark tested). Also, closely spaced slots for SLI are a pretty big pita, and it looks like the cards might grind up against one another (lifting one slightly via some zip ties on the power cables, instead of letting gravity drop it a little, does the trick). I wish my mobo had PCIe slots spaced farther apart :(

Speaking of extravagant expenses, I got the 144Hz monitor that's out on Newegg, and I can just say goddayum. This + adaptive vsync is the freaking bee's knees.

(Side note: if you have NVIDIA glasses + the 3D emitter, don't use a USB 3.0 port; it fucks up something fierce.)

Wozbo fucked around with this message at 15:39 on Sep 21, 2012

Wozbo
Jul 5, 2010
The card takes up 2.00001 slots. The pins from the power connector stick out a good 3 to 4 mm farther than they should (this will cause a bzzzzzzzzz->SNAP as the fan from the other card runs against them). I really wish I could mount something to suspend the cards perfectly, because it's literally just the smallest bit that's causing it. I'm not going to dork with sanding it down; I just have a zip tie on the cables forcing a perfect suspension (no bending), so nothing (not even vibration) will cause any touching.

I just forced adaptive because that way I don't have to worry about legacy games (things like TF2 come to mind). I've had a few instances where some of the games I play a) don't ask what to use and b) will take 100% power to run at an fps higher than the refresh rate of my monitor. They are few and far between, but considering the near-zero cost of forcing adaptive, I see no reason not to.

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236293

This is the monitor. Funny story: the 3D glasses actually do work at 144Hz as well; it just took the slightest bit of finagling.

E: I apparently still have a VGA bracket in my case box. I'm going to go get that and see if it helps.

E2: It's too big to install and keep the side fan, so no go.

Wozbo fucked around with this message at 16:26 on Sep 21, 2012

Wozbo
Jul 5, 2010
No tearing from adaptive and no vsync aftereffects @ 144Hz; there's pretty much no reason not to turn it on.

Wozbo fucked around with this message at 17:01 on Sep 21, 2012

Wozbo
Jul 5, 2010
FYI guys, Gigabyte's OC Guru fucks with the Blizzard launcher something fierce and causes it to crash.

Wozbo
Jul 5, 2010
Also, I believe the 8xx series from Nvidia will be the first to have its own dedicated mini CPU for talking to the main CPU (if I'm not mistaken, this is to make things like context switching easier and help with the whole GPGPU thing).

Wozbo
Jul 5, 2010
http://www.xbitlabs.com/news/cpu/display/20110119204601_Nvidia_Maxwell_Graphics_Processors_to_Have_Integrated_ARM_General_Purpose_Cores.html

This is the first article that came up (there are many more; look up the Maxwell architecture), but basically it's automating away all the stuff you currently need the CPU for when driving a GPU, plus some nice things like preemption. I think they are also going to skip a fab step and go smaller, but I'm a bit too lazy to look right now. If this pans out the way they say it will, it's going to make something like 8K UHD resolution viable with Crysis <foo> going full blast. More likely it will be a 30-50% boost in the first generation, with the "tock" generation right after adding another 20-30% on top of normal gains as they figure out what to optimize. If I remember correctly, it's currently on track to be something like 8x the power of the current 6xx series, but not out till late 2014-2015. It would be cool to see prototypes in the new consoles, so we all get some awesome graphics for the next xxx years.

Wozbo
Jul 5, 2010
It's two separate address spaces.

E: And the video card's memory controller might have a separate addressing scheme, so the CPU literally doesn't know/care beyond feeding/fetching data. I know Maxwell almost certainly will; I'm just not 100% sure about current tech.
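Roughly what I mean, as a toy model (pure illustration, not any real driver API): the CPU only stages data and kicks off copies; device-side addresses mean nothing to it.

code:

# Toy model of split address spaces: host memory and device memory live in
# different worlds; the CPU just feeds/fetches bytes via explicit copies.
host_mem = {}
device_mem = {}

def device_alloc(handle, nbytes):
    device_mem[handle] = bytearray(nbytes)    # only meaningful on the "device" side

def copy_to_device(handle, data):
    device_mem[handle][:len(data)] = data     # DMA-style transfer: host just hands bytes off

def copy_from_device(handle):
    return bytes(device_mem[handle])          # host fetches results, never dereferences device addresses

host_mem["verts"] = b"\x00\x01\x02\x03"
device_alloc("d_verts", 4)
copy_to_device("d_verts", host_mem["verts"])
print(copy_from_device("d_verts"))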

Wozbo
Jul 5, 2010
Agreeing with dekks on the PSU issue. Sounds like you may have fried something. Or maybe it wasn't powerful enough to drive the two (I think 700W)?

Wozbo
Jul 5, 2010
Support != recommended?

Wozbo
Jul 5, 2010

Alereon posted:

There's basically no way it's a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch.

There's going to be some difference due to all the DMA stuff.

Wozbo
Jul 5, 2010
I think it's really going to be "Jaguar + <foo>", where foo might be instruction sets and such that are more tailored to the logic we see in games, which would be a lot easier to do than "3570K + <foo>". Just a hunch.

Wozbo
Jul 5, 2010
Walp guys, the "leaked" info was fake:

http://x-surface.tumblr.com/post/41282771026/x-surface-dont-believe-everything-you-read

Wozbo
Jul 5, 2010

El Scotch posted:

I'm very curious what AMD is working on in response.

Do they have to be? At a certain point I would just clap my hands, implement the API, and move on. There's no reason to R&D another one, just like there's no reason to R&D another Mantle.

Wozbo
Jul 5, 2010

HalloKitty posted:

It seems ridiculous that on the PS4, all the RAM isn't available. It has a secondary processor for background tasks with 256MiB RAM attached to it - the same amount as all of the PS3 system RAM - and yet it's still reserving an enormous amount of the main RAM too.

Giving the machines more resources has made the OS bloated as hell, as far as I can tell. I'd love to know what that background chip is responsible for, and why on earth all the RAM is not available. If it's because of some bullshit feature like the streaming and sharing, to hell with that.

Yeah, this goes with a popular console concept of "don't gently caress with the baseline." Unless you are braindead and change up the hardware *cough*, you have to be very conservative at the start; otherwise you might find out that you needed 512 MB when you only had 256, and welp, can't take it back now. Right now, what, 2.5 GB is on lockdown? Devs aren't being too badly constrained by 5.5 GB, so we'll likely be seeing small bumps here and there. My estimate is that probably 1-1.5 GB will be on lockdown by the end of the lifespan.

Wozbo
Jul 5, 2010

Arzachel posted:

Someone asked on the Beyond3D forums a good while ago whether the 290X is fully enabled, to which Dave Baumann answered with a straight yes.

Sounds like they were selling faulty 295Xs, then, while binning the 100% perfect ones?

Wozbo
Jul 5, 2010

1gnoirents posted:

I was considering the H55 for my 780 Ti, but the performance didn't seem all that great, especially when they were $70. I didn't know GPUs made more heat, though. I would have thought they made at least slightly less heat, just based on the cooler sizes vs CPU coolers.

I have the kraken bracket just sitting on my desk with no AIO for it I really should pick one up.

I don't think you can just slap any of those CPU coolers onto a GPU; I thought the RAM gets super hot as well?

Wozbo
Jul 5, 2010
So, honest question: do we eventually get to a resolution point where there's a slight perceived boost in perf because AA/AF and the other little resolution-based tweaks no longer need to be run at the same mathematical detail / number of times per frame?
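The raw numbers behind that hunch (just arithmetic; the 4x MSAA line is only to show how sample counts scale with resolution, not a measured cost):

code:

# Pixels shaded per frame at common resolutions, and what 4x MSAA multiplies
# the coverage samples to. At some point the raw pixel density starts doing
# the smoothing work AA used to do.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP/frame, 4x MSAA samples: {4 * pixels / 1e6:.1f} M")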

Wozbo
Jul 5, 2010
So I have an i7 970 (yeesh, it's 6 years old now) and 2x 670s. I'm basically waiting to see what shakes out of Kaby Lake for Intel, and now it's looking like my splurge for a new desktop PC is gonna be either 2x 1080s or one 1080 Ti when it comes out. Just playing the waiting game now, and it's killing me slowly.

Wozbo
Jul 5, 2010
So I grabbed an EVGA 1080 Ti off of Amazon a week ago because I did a new system build, and I'm going to be watching for the Step-Up program. I'm really hoping reviewers focus on some apples-to-apples comparisons with the 1080 Ti, because the ray tracing bit really seems like something that won't be mature until an RTX 23XX-series card.

Not to mention, I'm 99% sure the implementation and integration of these features is going to have to mature a helluva lot before anyone who's not a AAA developer will use them.

Wozbo
Jul 5, 2010
So this honestly seems like a "wait for the 2280 RTX before upgrading" scenario. Glad I got my 1080 Ti in the Step-Up window, but sad to see that it's not worth it.
