SwissArmyDruid
Feb 14, 2014

by sebmojo
dGPU shat itself at almost, but not quite, December 31st 2016, 11:59 GMT.

If I didn't already have ENOUGH reasons to hate 2016, it's a mobile workstation GPU, too.

Thank god my warranty still runs until May 2017.

I need a loving proper desktop.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
AMD's dropping marketing buzzwords again.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Finally, someone paying attention to bandwidth per pin. I assume that helps Ashes?

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

AMD's dropping marketing buzzwords again.



....is that a transparent PNG? What the everloving gently caress, AMD, fire your marketing team.

I wouldn't have even known to look for "bandwidth per pin" if you hadn't said anything, and I'm using my laptop's built-in IPS panel.

repiv
Aug 13, 2009

did they think we wouldn't notice them listing fast cache and fp16 twice each to fill space?

Shrimp or Shrimps
Feb 14, 2012


I've honestly never heard of bandwidth per pin before. Is that a vram thing?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Shrimp or Shrimps posted:

I've honestly never heard of bandwidth per pin before. Is that a vram thing?

Betting HBM2.
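For anyone wondering why "bandwidth per pin" matters: total bandwidth is just pin count times per-pin data rate, which is why HBM's enormous 1024-bit-per-stack bus delivers so much even at modest pin speeds. A back-of-envelope sketch (the per-pin rates below are commonly quoted figures, used for illustration, not official spec-sheet numbers):

```python
# Total memory bandwidth = bus width (pins) * per-pin data rate.
# Rates are commonly quoted illustrative figures, not spec-sheet values.
def bandwidth_gbs(pins, gbps_per_pin):
    return pins * gbps_per_pin / 8  # Gbit/s -> GB/s

gddr5 = bandwidth_gbs(256, 8)    # 256-bit GDDR5 bus at 8 Gbps per pin
hbm2  = bandwidth_gbs(1024, 2)   # one HBM2 stack: 1024 pins at 2 Gbps per pin

print(gddr5)  # 256.0 GB/s
print(hbm2)   # 256.0 GB/s from a single stack at a quarter of the pin speed
```

Same headline bandwidth at a quarter of the per-pin rate, and stacking two or four HBM2 stacks multiplies it, which is the whole pitch.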

wolrah
May 8, 2006
what?
I'm not entirely getting my hopes up with nVidia. Their CES stuff always seems to be focused on, unsurprisingly, consumer electronics: mobile devices, Shield, autonomous cars, and other "learning" technology. Have they ever had a gaming GPU announcement at CES? The last two major families got their own special events, and the 980 Ti was at Computex.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

....is that a transparent PNG? What the everloving gently caress, AMD, fire your marketing team.

They did manage this though.



There's some speculation that the really buzzwordy "Draw Stream Binning Rasterizer" is just a tiled rasterizer, i.e. they're aping Maxwell. Which is good, but it's funny that they're burying it under word salad.

penus penus penus
Nov 9, 2014

by piss__donald
Never forget the unironic fps per inch

Enos Cabell
Nov 3, 2004


I'm the Rapid Packed Math.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Look, I'm just glad that AMD are deciding that what they have is capable of attacking the high end again.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SwissArmyDruid posted:

Look, I'm just glad that AMD are deciding that what they have is capable of attacking the high end again.

Whether or not what they have is actually capable of attacking the high end again is the real question, though.

penus penus penus
Nov 9, 2014

by piss__donald
Four years in the making!!!!

Anime Schoolgirl
Nov 28, 2002

AMD marketing :allears:

repiv
Aug 13, 2009


Speak of the devil: http://videocardz.com/65301/amd-announces-freesync-2

The tech sounds fine, but it has absolutely nothing to do with adaptive sync. Why is it being folded under the Freesync brand? :psyduck:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

repiv posted:

Speak of the devil: http://videocardz.com/65301/amd-announces-freesync-2

The tech sounds fine, but it has absolutely nothing to do with adaptive sync. Why is it being folded under the Freesync brand? :psyduck:

They explain it in the comments...

"It appears that Freesync 2 shunts all of the tone mapping to the GPU side of things to keep any heavy processing off the display."

"In case you didn't pay attention to the slides it is actually related to Display sync. It cuts out the Display Tone Mapping Lag."

repiv
Aug 13, 2009

Yeah, I get what they're doing. There's just no connection to adaptive sync there - the same frontloaded tone-mapping could be applied to a fixed-refresh display like an HDR VR headset.

Jumbling together two unrelated display technologies under one name is needlessly confusing.

repiv fucked around with this message at 01:16 on Jan 3, 2017

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

SwissArmyDruid posted:

https://twitter.com/Radeon/status/813561408612945920

For the love of all that is unholy, please don't bring back The Fixer again.
They're bringing back the box art cg samurai DIGITAL SUPERSTAR "Ruby"

pyf favorite terrible cg box art waifu

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Cardboard Box A posted:

They're bringing back the box art cg samurai DIGITAL SUPERSTAR "Ruby"

pyf favorite terrible cg box art waifu

My favorite is the nVidia fairy with accurately modeled nipples :colbert:

Shrimp or Shrimps
Feb 14, 2012


The weird chrome dragon-gargoyle thing on the 9800pro was always my favourite.

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT
loving hell. So if I have a G-Sync monitor and a 1080, do I enable or disable V-Sync in game? Or in the GeForce Control Panel? And then I enable G-Sync in the Control Panel regardless, correct? I feel like a moron.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Vsync is only if you want frame limiting to 144/165/whatever so you don't get tearing. Turn on gsync in the control panel, and set vsync whichever way you like better

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT
So turn them both on?

repiv
Aug 13, 2009

beergod posted:

So turn them both on?

Yeah, enabling both GSync and VSync is the safest bet unless you're extremely particular about input lag. In that case you might want no-sync or fast-sync.
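To sketch the usual mental model of the combo (my understanding of how it's commonly described, not anything out of an official NVIDIA doc): inside the variable refresh range G-Sync drives the panel, and the V-Sync setting only matters once the GPU outruns the refresh rate.

```python
# Rough model of G-Sync + V-Sync per frame (illustrative, not NVIDIA's spec):
# below the panel's max refresh, G-Sync times the refresh to the frame;
# at or above it, the V-Sync toggle decides between capping and tearing.
def effective_sync(fps, refresh_hz, vsync_on):
    if fps < refresh_hz:
        return "g-sync"       # panel waits for the frame, no tearing, low lag
    if vsync_on:
        return "v-sync cap"   # capped at refresh, some added input lag
    return "uncapped"         # tearing possible above the refresh rate

print(effective_sync(90, 144, True))    # g-sync
print(effective_sync(200, 144, True))   # v-sync cap
print(effective_sync(200, 144, False))  # uncapped
```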

Shrimp or Shrimps
Feb 14, 2012


vsync adds input lag though, right? Why not just frame limit to your monitor's refresh rate so you're always "in" adaptive sync?

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT
I'm playing SF V, so I suppose that could be an issue. I'm having weird stuttering and slowdown and I'm trying to fix it.

repiv
Aug 13, 2009

beergod posted:

I'm playing SF V, so I suppose that could be an issue. I'm having weird stuttering and slowdown and I'm trying to fix it.

Isn't SFV a 60fps locked game? If so you'll always be in G-Sync mode and the V-Sync setting will make absolutely no difference.

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT

repiv posted:

Isn't SFV a 60fps locked game? If so you'll always be in G-Sync mode and the V-Sync setting will make absolutely no difference.

It is but it is running supremely hosed up right now for some reason. It's either slow motion or way too fast.

Col.Kiwi
Dec 28, 2004
And the grave digger puts on the forceps...

repiv posted:

Isn't SFV a 60fps locked game? If so you'll always be in G-Sync mode and the V-Sync setting will make absolutely no difference.
Sounds like it. I googled it out of curiosity, and it seems like one of those games that you really want to run at exactly 60, no more, no less. Maybe this will be helpful to the other guy https://www.reddit.com/r/StreetFighter/comments/4ad17i/pc_framerate_fix/

penus penus penus
Nov 9, 2014

by piss__donald

Shrimp or Shrimps posted:

vsync adds input lag though, right? Why not just frame limit to your monitor's refresh rate so you're always "in" adaptive sync?

This is always what I'd assumed was best. There's no reason to put up with vsync if you have a gsync monitor, and even if you need a specific fps you should just frame limit, which adds no overhead at all (even saves some), and gsync will prevent tearing. I guess in this scenario it wouldn't work out if you actually went below 60 fps due to graphical demand (edit: not that it would work with vsync either), but having to mash vsync by necessity and gsync together sounds like a headache to me

penus penus penus fucked around with this message at 23:46 on Jan 3, 2017
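The frame-limit idea is basically just burning the slack in each frame so you never exceed the cap; a minimal generic sketch (not how RTSS or any driver-level limiter actually does it), with the cap set a few fps under an assumed 144 Hz panel so adaptive sync stays engaged:

```python
import time

# Minimal frame limiter sketch (generic, not any specific tool's method):
# cap a few fps below the panel's refresh so adaptive sync always has
# headroom to stay engaged instead of handing off to v-sync.
def run_capped(render_frame, frames, cap_fps=141):  # ~141 for a 144 Hz panel
    target = 1.0 / cap_fps
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # burn leftover time in the frame
        frame_times.append(time.perf_counter() - start)
    return frame_times
```

With a cheap `render_frame`, every frame time comes out at roughly the 1/141 s target, i.e. the limiter, not the GPU, paces the loop.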

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Lenovo hosed up again and let slip that AMD is doing the RX500 series (like they did with the RX400 series)



Likely Polaris 11; my guess, based on past behavior, is that the Polaris 10 XT2 will be the RX 560/570 for desktop, Small Vega the 580/590, and Big Vega the Fury/FuryXXX. I'm hesitant to say anything gets moved much further down the stack, because that would be an extreme drop in value.

This also kind of reinforces the idea that the 400 series was forced out just to have something for sale, if it's being replaced in about 6-8 months.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Annoyingly, that's actually at a pretty good price point, too, or at least it would be, if it weren't Lenovo.

SwissArmyDruid
Feb 14, 2014

by sebmojo
1050 and 1050ti revealed for mobile.

http://techreport.com/review/31180/nvidia-unveils-its-gtx-1050-and-gtx-1050-ti-for-laptops

Of note: the mobile 1050 has only 16 ROPs compared to the desktop 1050's 32. Clock speed bumps across the board on the mobile variants. Nvidia is taking a page from AMD's book and putting in frame pacing, re-using the old Battery Boost name.

I don't think I'd mind grabbing a laptop with a 1050 Ti in it, if the price is right. It looks very good for 1080p60 gaming.

SwissArmyDruid fucked around with this message at 04:03 on Jan 4, 2017
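Halving the ROPs halves peak pixel fillrate at the same clock, which the mobile clock bump only partly claws back. Rough arithmetic (the clocks here are placeholder assumptions just to show the effect, not the actual boost clocks):

```python
# Peak pixel fillrate = ROPs * core clock (Gpix/s). Clocks are placeholder
# assumptions to illustrate the ROP halving; check real boost clocks.
def fillrate_gpix(rops, clock_ghz):
    return rops * clock_ghz

desktop_1050 = fillrate_gpix(32, 1.4)  # ~44.8 Gpix/s at an assumed 1.4 GHz
mobile_1050  = fillrate_gpix(16, 1.5)  # ~24.0 Gpix/s even with a clock bump
```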

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
The 1050 looks like it won't be that great but the 1050ti should make for a nice light, decent gaming machine.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

B-Mac posted:

The 1050 looks like it won't be that great but the 1050ti should make for a nice light, decent gaming machine.

Unfortunately, it does seem like there's going to be a 4GB 1050 and a 2GB 1050Ti, so welcome to another few years of convincing family and friends to 'get the *good* good one.'

EdEddnEddy
Apr 5, 2012



BIG HEADLINE posted:

Unfortunately, it does seem like there's going to be a 4GB 1050 and a 2GB 1050Ti, so welcome to another few years of convincing family and friends to 'get the *good* good one.'

Oh great..

I have been tossing around the idea that a 1060 in a laptop would be enough for me for another 6+ years (can't believe my G73JH has made it since 2010). But a 1070 or 1080 would make it last a hell of a lot longer, even if portability is going to have to suffer, unless I get a Razer Blade Pro or something...

Seamonster
Apr 30, 2007

IMMER SIEGREICH

B-Mac posted:

The 1050 looks like it won't be that great but the 1050ti should make for a nice light, decent gaming machine.

1050 will do fine in laptops that aren't chintzed out spaceships like Thinkpads (eventually, hopefully) and the XPS line. I'm planning on getting the next XPS 15 and ~25% over a 960m is a big boost coming from a 750m. But yeah for real 13 inch gaming laptops the 1050ti should be quite nice.

SwissArmyDruid
Feb 14, 2014

by sebmojo

repiv posted:

Speak of the devil: http://videocardz.com/65301/amd-announces-freesync-2

The tech sounds fine, but it has absolutely nothing to do with adaptive sync. Why is it being folded under the Freesync brand? :psyduck:

I somehow didn't hit submit on the :effortpost:, and I'm too lazy to retype it, so I'll just say that I hope it doesn't require a Freesync 2 monitor, because I really want one of those loving Samsung curved dealios that was teased late last year and is due to be unveiled for realsies at CES in a day or two, and I don't want to have to wait. Again.


Truga
May 4, 2014
Lipstick Apathy
So Kaby Lake is out and none of the desktop chips have eDRAM on them. lol

Wasn't the eDRAM in the 5775C a pretty hefty FPS boost in basically any game that's even slightly CPU-bound, to the point of making Skylake look extremely anemic as an upgrade? Do they just not want nerd money?
