cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
Sorry if this has been asked a lot, but I've had a 560ti for a while now, and I'm wondering if I should move on to something new. It's not slow, as such, but I could do with more.

I've been thinking about getting a 770, and the reason I'm asking here is: is the real-world difference enough that I should bother? I only game at 1080p and there's not much out there that kicks my computer's rear end, but I don't get the opportunity to upgrade often and this is one of those times where I have the chance.


cat doter
Is there going to be a newer variant to the 760 coming soon, like a ti model or whatever? Those performance numbers look good and I guess you're right about the price of the 770, it's $150 more than the 760 at the least here.

cat doter

Agreed posted:

There is no indication of such, and we don't have a good reason to think it'll happen; the performance gap is already remarkably small, with the 760 performing more like a 670 than a 660Ti. But they have really liked the Ti branding in the past, so if they do end up doing it, it would be because of price pressure from AMD, and I'd speculate they'd do it by pulling a 560Ti-448 and just tossing the mid-range price:performance curve totally out of order (the 560Ti-448 was really another step down from the GF110-based GTX 570, which was, performance-wise, basically a GTX 480... Fermi minus).

So if you were in my position, would you buy one? I'm pretty close to pulling the trigger. If I were to sell the 560ti, bearing in mind the 650ti goes for about $170 here, what price should I ask? It might mitigate the costs somewhat.

Oh and I guess this question doesn't matter as such, but is there a model that's quieter than the stock reference card and doesn't do any gimmicky overclocking? I need a quiet computer since it doubles as a DAW work machine.

Factory Factory posted:

Doesn't look like there's a thread in Games or IYG. It's a bit of a weird device; just a small Android tablet (well, closer in size to a phablet), nearly stock, with a controller stuck on it. Nvidia wrote a pair of apps to enable PC game streaming, which is only unique as far as the hardware goes and how well it works.

I'm not even sure where talk would go. It's got SH/SC because of the Tegra 4 and streaming stuff, IYG because it's an Android device, and Games because games.

Theoretically this streaming tech could work on anything, right? Even to another computer? Why lock it down to some lovely tegra powered handheld thing? I'd love to be able to stream games to my laptop from my gaming PC while in another room of the house. That would rule.

cat doter fucked around with this message at 22:50 on Aug 8, 2013

cat doter
I'd just like something that pushes out video over a network, rather than a service or whatever. I've often wanted to hang out in my lounge room while playing a PC game but I'd never want to buy 2 PCs or a gaming laptop to achieve that. Just a simple local network thing similar to how the Wii U handles its tablet but over a router.

cat doter

canyoneer posted:

So let's talk about the Jimi Hendrix Geforce Experience. Useful tool or nice placebo?
I notice that games themselves are sometimes terrible at predicting what are "optimal" settings for your current hardware configuration. I used it when I reformatted a couple weeks ago and Shogun 2 appears to look nice and run well :shobon:

I dunno whether it's the geforce experience or the nvidia drivers themselves, but every time I open the geforce experience window it fucks with my TV's display drivers and the image goes all weird. I have to turn off the TV, uninstall the drivers, and rescan for plug and play, then it works fine. It's weird as hell and kind of annoying.

cat doter
Not sure if this is the best thread to ask this, but does anyone have a shield yet? I'm kind of interested in the streaming tech. Apparently it's a bit rough at the moment, limited to 30fps, and often spits out duplicate frames leading to a juddery appearance. The compression affects texture quality in a minor way too.

I don't even own a smartphone and the shield looks sort of interesting even apart from the streaming tech. 720p games on a 5" screen would look pretty nice too. I'm still a little pissed that nvidia is restricting this tech to the shield, there's no reason to. Maybe it'll bomb hard and it'll become a standard geforce feature.

cat doter
This R9 280X is confusing. I looked up some benchmarks and it seems to either get beaten by the 760, or is pretty much on par, or has a pretty huge advantage. Is it worth getting one over a 760? I can get a 760 here for $279 or the 280X for $369. Looking at the benchmarks was confusing me so I thought I'd ask you guys about it. What's the dealio?

cat doter

Zero VGS posted:

If those are the prices in dingoland then get the 760. But it sounds like your 280X is probably being gouged by retailers because of the initial lack of supply. The 7970 GHz Edition is the same card, so compare that too.

You're probably right, the 7970 seems to hover around $300ish at the moment so that price seems a bit out of whack. I'll probably just grab the 760. Thanks.

cat doter
Yeah, Pccasegear is usually where I go to check out what's new. I thought the 280X would be much cheaper than the 7970 but I was completely wrong. If that was the case it'd be, what, $250? I'd buy one at that price.

cat doter
Welp, either my GPU is dying or the factory overclock is causing issues, because I've been getting driver resets/freezes/crashes, but if I underclock the card to stock it seems stable (it's a 560ti @ 900MHz; stock is 822MHz). How do I force the card back down to 822MHz and keep it there? I've been using Nvidia inspector to underclock it, but if I forget to do it after booting and then play a game, it doesn't take long for things to go badly.

I'd like to permanently bump it back down to stock, whatever performance boost it gets is not worth it.
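One low-effort way to make the underclock stick is a batch file in the Startup folder, since NVIDIA Inspector can be driven from the command line. The flag names, P-state index, and clocks below are assumptions from memory, not a verified recipe — check your Inspector version's own help output before relying on them:

```bat
:: startup-underclock.bat - put a shortcut to this in shell:startup
:: so stock clocks are applied on every boot.
:: Flag syntax is assumed; confirm with "nvidiaInspector.exe -help".
"C:\Tools\nvidiaInspector.exe" -setGpuClock:0,2,822 -setMemoryClock:0,2,2004
```

If the flags differ on your version, the same idea still applies: any command that sets the clocks can be dropped into Startup or Task Scheduler so you can't forget it.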

Anyway, if I do have to get a new card, how does the R9 290 compare to the 4GB 770? The 290 is a tad more expensive here but if it's faster than the 4GB 770 I might get it instead.

cat doter
I'm looking for a decent 4GB card because I want one that'll last as long as possible, since I can't be buying video cards once a year.

Is it really just the driver? I feel like the card has been doing this since I got it but I guess it has been a little more severe lately. Dropping it down to reference clocks seems to help unless it's just a placebo/coincidence.

cat doter
I'm not sure I follow, do you mean by the time games actually start using more than 3GB I'll likely need a new video card anyway? Or are the 4GB versions of cards bad bandwidth wise?

I'm gonna test Crysis 3 at reference clocks with my 560ti for a while and see if it crashes, I'll go back down to 314 if it does.

cat doter

Sir Unimaginative posted:

It might not be just the driver, but that's probably the cheapest/least-effort thing to check. Reports pretty consistently point out that dropping to reference clocks doesn't help much (although you might be lucky - give it a week of uptime on 314 and see).

So yeah, I set my card to its reference clock speed and played a bunch of Crysis 3 at very high 1600x900 and Bioshock Infinite at ultra 1920x1080 and neither of them crashed. Maybe I'm one of the lucky ones? If anyone has a way to set it to reference speeds permanently that'd be awesome because I have to do it every time I boot.

cat doter
So I finally got my new video card to replace the dying 560ti, after hemming and hawing over the 770 and the 280x. I got a 290 instead! Haven't installed it yet since it's far too hot in my room, but from the looks of it, it's quite a significant upgrade. And now I see something about some 290s being 290xs with a different bios or some poo poo? That's nuts. Is it like the old days when some nvidia cards had locked shaders or some poo poo? I have no idea. It's an XFX model if that matters.

cat doter
Huh, so the 290 has a 6 pin and an 8 pin connector, but it looks like I don't have an 8 pin connector on my power supply. It came with a 2x 6 pin to 8 pin converter but the quick start guide that came with it says it "doesn't support" that under the "bad power connection" section. Are they just covering their asses or do I have to buy a new loving power supply or something?

Edit: oh wait, I found the other 2 PCI power connectors in my mess of a case, they're 6 pins but there's like a 2 pin extension thing that you can attach to it to make an 8 pin, I just use that right?

cat doter fucked around with this message at 09:34 on Nov 26, 2013

cat doter
Yeah I figured it was fine I just get anxious about dumb poo poo with computers. Power supply fuckery is easily the highest on my list when it comes to computer anxiety. I'd hate to plug in a new GPU then immediately kill it.

I've got it plugged in now but windows is doing some update poo poo so I got out of my stinking hot room. How does the 290 handle heat? I've got summer coming up and I'm not looking forward to the 40+ C temps.

Speaking of computer anxiety, does anyone else have that moment of mild panic when plugging in a new GPU and waiting for your monitor to get a signal? It's always the worst part of upgrading. I've only ever had one DOA GPU too.

cat doter

td4guy posted:

You needn't worry about this. They actually shape the plastic stuff around the pins uniquely. It's done on purpose to prevent you from accidentally plugging your CPU 8-pin power cord into your GPU. Or vice-versa.

I actually noticed that when plugging it in, which made me a little less unsure.


td4guy posted:

Yeah, oh man, yeah.

Really glad to know I'm not the only one.

My DVI to VGA converter doesn't fit in my new card! I run my TV for big screen gaming over VGA because it's a cheap poo poo TV that has overscan baked into the HDMI ports with no 1:1 pixel mode, so PC games look like poo poo. I'm not sure I have any adapters that do fit it either. There's always one unpleasant surprise.

cat doter
I think a 3 monitor setup for black flag would require a pretty insane setup. I get about 30fps on my new r9 290 at full details and 8x MSAA and 1080p. I can get roughly 60fps locked with just SMAA. Black flag pushes hardware way more than you'd think, I could barely push 30fps at 900p with my 560ti.

cat doter

veedubfreak posted:

Black flag is a horrible pc port, so don't expect 60 fps with any set up. Read some reviews. As far as multimonitor support, that's actually where the 290 beats the Ti. Not so much because of the extra memory, but the extra ROPS. I'd wait for a 290 with aftermarket cooler if you plan on going triple monitor.

Black Flag's issues seem to mainly be the god rays, which on high seem to cause some framerate issues (bumping them down one notch solves that), and its horrible v-sync solution. Simply disabling these on my R9 290 took it from 40fps to pretty much consistently 63fps (it has a weird hard framerate cap), apart from some microstutters. It's definitely not a great port, but there's stuff you can do to help.

cat doter

Baron Bifford posted:

OP should update the OP. I need to know if the Geforce 770 I'm about to buy is a solid choice.

Try and get a 7970 GHz edition for cheap, it's far better value.

cat doter

Baron Bifford posted:

So I cancelled my order for a Geforce 770 and instead put in an order for a Radeon 7970. Did I do a good thing?

Probably! I was going to do the same thing but opted to pay more for the r9 290. The AMD cards seem like better deals at the moment; I'd get nvidia if you're married to their proprietary stuff, some of which is admittedly pretty cool.

Is the driver stuff for the 290 still being worked out? I've been getting strange behaviour in some games where it freezes for sometimes several seconds with audio stuttering very loudly. It looks a lot like a full on crash but it continues on normally afterwards. I'm worried it's a hardware fault rather than driver infancy.

Also, kind of a dumb question, but the DVI-VGA converter I was using with my 560ti doesn't fit in the new 290, I can't just break off the excess pins in the converter and continue using it can I?

Speaking of nvidia proprietary poo poo, is there a hack-y way of getting physx stuff working well on AMD GPUs? I know it's mostly pretty dumb but I kinda like seeing shards flying everywhere and poo poo in borderlands 2 and such.

cat doter

HalloKitty posted:

No, no, the extra pins are for the analogue connection.

The connectors without the extra pins around the flat ground pin are called DVI-D, and will only output digital signals.

By the look of it, neither of the DVI connectors on the 290 are DVI-I, so there's no way you can hook up a VGA monitor without an active adapter.

And I gather the active adapters are expensive rather than $2 little doohickeys? This is what happens when you buy cheap TVs with overscan on every resolution folks.

cat doter

Sidesaddle Cavalry posted:

Your freezes are definitely crashes, but it's more likely your hardware isn't keeping up with something. I don't mean to assume anything about what kind of parts your system is made out of, but what's your power supply rated for in watts? It's still a big jump to the 290 architecture's monster consumption from the 560 Ti, so I'm wondering if your card isn't getting enough juice. Sometimes I get driver hiccups on this overclocked 780 because I'm fairly sure I need a line conditioner :(

It's either an 850 or 900w supply, I forget which. Not generic either. Other than the video card I have an i5 3570k and a single hard drive and a bunch of USB poo poo, so I highly doubt the PSU is being choked.
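For a rough sanity check on that, here's a back-of-envelope sketch. The draw figures are ballpark assumptions, not measurements: roughly 290 W for a reference R9 290 under load, the i5-3570K's 77 W TDP, and a deliberately padded 75 W for drives, fans, and USB gear.

```python
def psu_headroom(psu_watts, component_draws):
    """Watts left over after summing rough component draws."""
    return psu_watts - sum(component_draws)

# Assumed ballpark draws: R9 290 ~290 W, i5-3570K ~77 W,
# drives/fans/USB ~75 W (padded to be generous).
headroom = psu_headroom(850, [290, 77, 75])  # ~400 W to spare
```

Even at the lower 850 W rating, the build would need to draw twice these estimates before capacity became the problem, so a transient spike or a sagging 12 V rail would be a likelier power story than raw wattage.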

cat doter

Fuzz1111 posted:

Are you sure it's every resolution? If it's a 720p TV with 1366*768 panel then you might be able to output that res (or close eg 1360*768) and you may find that overscan is absent (because that's the res those TVs typically run VGA at, hell my TV calls that resolution "VGA" regardless of which input I am using).

Another option is to adjust your vidcard's overscan settings - basically the card will output a black border around the image, and you adjust it so that your TV crops only the border (returning your image back to normal).

This takes a little explaining, but basically since it's such a cheap poo poo TV, at least on my nvidia card there was overscan on every resolution, and the native resolution was reported as 720p, even though it's a 1080p screen. Even on its supposed native resolution there was a hideous amount of overscan and using the nvidia options to reduce the overscan was literally just reducing the resolution to fit the screen, so you'd have like 1136x650 or something weird if you set it to 720p. Forcing it to 1080p would have similar amounts of overscan and there's no manual 1:1 pixel mapping option on the TV at all, so it's up to the video card/EDID to figure out what the gently caress is going on. The only way to circumvent this is to connect the display via VGA and create a 1920x1080 resolution manually which while not perfect, fixed the problem well enough for basically anyone but the most fastidious viewer.

Now since the VGA converters I have are useless, I have to use HDMI again, but strangely enough this AMD card seems to handle display resolutions on TVs with overscan/underscan a little differently and it looks WAY better than the same TV did on an Nvidia card. But, there's some residual blurriness that I can't seem to fix, especially with white text on black backgrounds. It seems the picture is slightly off by a few pixels horizontally but there doesn't seem to be an option in the AMD control panel to move the picture around and try to clear it up.

Basically my dream solution to this problem is to somehow edit the EDID manually and magically have just a 1080p resolution with absolutely no reported overscan or underscan and I could finally have a clear loving picture on this piece of poo poo TV. I wish I could afford something better.
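On the hand-editing idea, one gotcha worth sketching: the 128-byte EDID base block ends in a checksum byte, and the spec requires all 128 bytes to sum to 0 mod 256, so any manual edit has to patch that last byte or the driver may reject the block. A minimal sketch (the function name is mine, not from any EDID tool):

```python
def fix_edid_checksum(block: bytes) -> bytes:
    """Recompute the final checksum byte of a 128-byte EDID base block.

    The EDID spec requires all 128 bytes to sum to 0 modulo 256;
    the last byte exists solely to make that true.
    """
    if len(block) != 128:
        raise ValueError("EDID base block must be exactly 128 bytes")
    body = block[:127]
    checksum = (-sum(body)) % 256
    return body + bytes([checksum])

# A trivially valid all-zero block already checksums to zero.
patched = fix_edid_checksum(bytes(128))
```

Tools like Phoenix EDID Designer handle this for you, but if you ever hex-edit a dumped EDID directly, this is the byte that silently breaks.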

cat doter
Is there a similar program to nvidia inspector for AMD cards? I quite liked having better control over my drivers.

Also, how does this tessellation option work? Is it just for games that already have it or can you force it on older games that can support it kinda like nvidia's ambient occlusion options?

I actually kinda miss the ambient occlusion options.

cat doter
I think I'm being CPU limited, since I'm running my 3570k at stock (3.4/3.8 turbo) and apparently I should be getting a 64fps average at 1080p very high in Crysis 3, but I'm getting closer to 50 average, with some dips under that when it gets nuts. It will go above 50, it's just not really maintaining it. I don't have a chipset that allows overclocking either.

Total bummer.

cat doter

SourKraut posted:

Hmm, didn't think a 3570K would ever be limiting at just 1080p...

You're probably right; I downloaded 3dmark just to be sure and it seems to be performing within expectations (seems I only score 1000 less in firestrike than a 4770k and titan system? noice), so perhaps I was just expecting more than I should have.

Although I would like to overclock sometime, I should probably get a motherboard that can before they disappear.

cat doter
How would I go about trying to figure out if I can flash my r9 290 to a 290x? There's $200 difference between the 2 cards here and I'd love to get that extra performance for free.

cat doter
I actually really like the new Splinter Cell, even speaking as a total Chaos Theory fanboy. There's a really interesting flow to it and it's super loving fun when you pull off crazy poo poo.

I played it on PS3 though so I dunno how it actually takes advantage of PCs other than high res textures. Are there any crazy shaders or effects or whatever?

cat doter
Is litecoin an offshoot of bitcoin because the market on them crashed or litecoin improved on it significantly or something? Or is it just the same thing but rebranded?

If you can't even break even in America based on power usage then you'd never profit in Australia, our electricity prices are insane. The whole idea continues to sound insane.
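To put a rough number on the electricity point - every figure here is an assumption, not a quote: say a card pulls ~300 W while mining, against era-typical retail rates of roughly AU$0.30/kWh versus US$0.12/kWh.

```python
def daily_power_cost(card_watts, price_per_kwh):
    """Cost of running one card 24 hours a day at a flat tariff."""
    return card_watts / 1000 * 24 * price_per_kwh

# Assumed: ~300 W mining draw, AU ~$0.30/kWh vs US ~$0.12/kWh.
au_cost = daily_power_cost(300, 0.30)  # ~2.16/day
us_cost = daily_power_cost(300, 0.12)  # ~0.86/day
```

Same card, two and a half times the daily feed cost in Australia - whatever thin margin a US miner has is gone well before an Australian one breaks even.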

cat doter

Killer robot posted:

Also it sounds like even when it's a huge risk to buy GPUs for the sake of mining long-term, using one you already have and seeing how a boom rides out is a lot less to lose.

This is honestly the only reason I'm curious. I already have this 290 sitting around, why not see if it can earn me money?

Agreed posted:

Litecoin was made to be more difficult to mine via GPU, which is good for some loving reason. Turns out it's still way more efficient, just maybe an order of magnitude less so, to mine GPGPU than CPU. Oops. Now there's also Primecoin, which REALLY, HONESTLY, PROBABLY, can't be optimized to perform better for wattage on a GPU than a CPU.

These things get made to fix "bugs" in the old system, and to create new ... how to describe it, new ways of approaching the imaginary problem of "how do we create an electronic currency that starts from nothing and builds to something?" The emergent problem that Litecoin and Primecoin address is that Bitcoins, even as difficulty raised up, became too easy to mine, leading to things like the 51%-ownership vulnerability, or the ability to throw a shitload of integer-specialized ASICs with a microcontroller all working in parallel to work off of 500mA at 5v via USB instead of hundreds of watts via PCI-e making what was supposed to be a "currency for everyone" into a currency for the few people who can afford to heavily buy into the requisite hardware to make them.

The other reason that new *coins get made is because the blockchain for old *coins is so damned complex that transactions can take a really long time to validate since it has to wait until a successful hash collision occurs and propagates through the network for verification.

But all these coins exist on the same network and there's nothing to stop people from continuing to use bitcoin and mine with a shitload of ASICs right?

cat doter
Oh great another awesome problem. It seems my audio interface has died (Uuuuuuuuugh $300 down the drain) so I decided to switch to AMD audio through my video card, and it turns out it doesn't like spitting out audio over the DVI-HDMI cable I use. Going HDMI-HDMI works, obviously, but looks like poo poo, which I may have mentioned.

On my 560ti I could get audio out of the DVI port using the DVI-HDMI cable, so it's not the display. Is it just a limitation on AMD cards? There's no audio output on the DVI ports or something?

cat doter

Factory Factory posted:

Audio over DVI is 1000% non-standard and depends a lot on who has bothered to do what custom and proprietary bullshit.

Why is HDMI-HDMI shitlooking when DVI-HDMI is not? It's the same video signal.

I have absolutely no idea, it looks completely different when using the DVI-HDMI cable. It looks closer to what the image is SUPPOSED to look like (ie a 1080p image with 1:1 pixel mapping), but it's still off by a bit, causing some image issues, especially with white text on a black background.

Going HDMI to HDMI has a very blurry image and the pixel mapping is nowhere near 1:1 and I have to use the underscan settings to fix the image. Fuckin cheap chinese TVs man, I'd save up for a better one but I've got other priorities at the moment.

My fix for this has always been using a DVI to VGA converter, using the TV's VGA port, editing the EDID using phoenix to make it support 1080p (and think it's the native resolution) which produced a near perfect image. I can't do that now though because the R9 290 doesn't support it.

This is super edge casey though.

Shadowhand00 posted:

Can't you warranty it because the audio interface died? If you made the original purchase on an Amex Card, you also get an extended (doubled) warranty period.

The audio interface is like 3-4 years old now and has had issues for a while. It would freeze, then the audio would cut out completely. My computer would crash if I unplugged it or plugged it back in. This time it caused a crash when switching songs, and my computer wouldn't even boot with it plugged in.

I'll try fiddling around with drivers but my suspicion is that it's dead.

cat doter

This Jacket Is Me posted:

Update on this.

I clicked around, and found this and this which suggests that HDMI on certain monitors defaults to "HDMI (TV)" instead of "HDMI (PC)". I changed mine to the latter, and all issues are fixed. No idea why that is, and the setting was deeply buried in the settings under something like "Rename the input types", but if anyone else runs across HDMI-to-HDMI image problems, be sure that it's set to the correct HDMI input type. Why one HDMI is different than another HDMI doesn't make any sense to me, HDMI being a standard and all.

Are you talking about settings within the TV itself? Because one of the reasons I lament this lovely generic chinese TV so much is there's almost no image settings on it at all. It has an aspect ratio option, a zoom option (in case you wanted even more overscan!), a noise filter, and your standard colour/brightness/contrast settings. That's it. Connecting via VGA has a bunch of options to fix the image, but there's loving nothing on the HDMI inputs.

cat doter
How would I go about diagnosing whether or not I have a hardware fault with my video card? I've posted about this before, but essentially the issue is that my computer will outright freeze for anywhere from a fraction of a second to several seconds (no response, audio hitches very loudly) but then continue on normally without a complete crash. Anyone who's had a weird hard lock with that looping/hitching sound will know what I'm describing; it just doesn't reboot or require a shutdown, because weirdly enough there don't seem to be any other issues with my computer at the moment.

This issue appeared after the video card was installed and it's the only part that's changed, so I'm pretty sure it's the root of the issue, I just don't know if it's a hardware fault or a software one, although I am leaning pretty hard towards hardware fault. I just need concrete proof of what's wrong so I can either get it replaced without any hassle or actually fix what the problem is. So if anyone knows of video card specific diagnosing techniques that'd be awesome.

cat doter

Factory Factory posted:

You could run OCCT's stability tests and see if the card fails any. If it fails those tests at stock clocks, that's a pretty definitive indicator of hardware fault.

I've never used this tool before, but I just did this test and is that my video card reaching 94C? That can't be right, can it?

cat doter
Doesn't seem to be, I ran OCCT again with GPU-Z and under 51% load for 3 minutes it reached 91C.

I don't think the temperature is the cause, since it happens at idle or while watching videos. It might be memory related, since I've seen it happen more often under load in games, but using this program hasn't triggered the problem.

There's a memory option in OCCT, can I just make it use more memory or something? Would that help diagnose if it's the card's memory?

cat doter

Ghostpilot posted:

First go here: http://www.overclock.net/t/1445030/is-your-r9-290-unlockable-find-out-here

And if your card's unlockable, head here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

Just finished unlocking mine. I actually had more trouble making my USB drive DOS bootable than anything else (kept hitting snags with 3 different methods before finally just snagging this, making an MS-DOS boot and dragging the atiflash.exe and the asus.rom onto the USB).

Went off without a hitch once I got past that bit.

Here are my results with ShaderToyMark at standard settings.

Pre-Flash
290 @ 1 ghz
644 points
107 FPS

Post-Flash
290x @ 1 ghz
689 points
114 FPS

This is the post I got when I asked the same question which worked out for me. Turns out mine isn't flashable, but you know, I needed to know.
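For anyone else checking flashability, the DOS-side steps boil down to roughly this. The adapter index, filenames, and the -i/-s/-p flags are as I recall ATIFlash's usage - confirm against the unlock thread's instructions before flashing anything, and always keep the backup:

```bat
REM From the bootable MS-DOS USB stick, with atiflash.exe on it:
atiflash -i                 REM list adapters; note your card's index
atiflash -s 0 backup.rom    REM save the current BIOS from adapter 0 FIRST
atiflash -p 0 asus.rom      REM program the 290X BIOS onto adapter 0
REM Reboot, then verify the shader count with GPU-Z.
```

A bad flash can brick the card, which is why saving the original BIOS before programming anything is the one non-negotiable step.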

cat doter

Phuzun posted:

The only memory options I see in OCCT are for the CPU.

For GPU, there are memtestCL and memtestG80. But I don't know how well this works, since I can put a several hundred mhz increase on my memory and not see an error, yet games will absolutely poo poo themselves the second they try to render at the same memory speed.
https://simtk.org/home/memtest/

As far as the issue you are having, that sounds exactly like the video driver stopping. Try different drivers with clean installs. Other things, re-seat the card in the motherboard and try another power supply (preferably larger).

I doubt it's the power supply, it's 850w non generic so that can't be it. I think there's a new beta driver out, I'll try that, then onto the GPU memory tests if that doesn't work.


cat doter
I've noticed that there are DP to VGA converters, would this be perfect for my needs since I can't use a DVI-VGA converter on my new video card? I'm not entirely sure how DP works, I've never used it.

By the way, is it still early days as far as driver support goes with the R9 290? Is it possible we could be seeing further performance gains in most games? I was playing Assassin's Creed 4 and either the driver performance is bad or the game is just awfully optimised. I can't seem to get more than 50fps stably in cities with drops to 40fps quite common. Kind of a bummer.

I checked GPU-Z while it was running and it was using 97% of my GPU but only roughly 60% of my CPU. I'm not sure that matters much but it didn't seem processor limited.

Oh also, since 13.12 is out I installed it and was able to turn the fan up so it's not hitting 94C all the time. I mean it's rated for 95 but I worry it's causing increases in ambient temps.
