  • Locked thread
anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Farmer Crack-rear end posted:

...isn't that what the 1070/1080 are for? :confused:
no there are fe versions of the 1070 and 1080 cards too


Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
remember when cards launched there were reference designs? first you'd get ones made by nvidia themselves then the aib vendors would put out rebadged versions. then after a month or something they'd put out their own updated versions with different coolers / vrms and maybe factory overclocks?

that's all the fe is. the reference copy. they just decided to make it more expensive now because

a) they can
b) early adopter tax
c) to 'not compete' with the varied designs from the aibs

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
nvidia won't be producing a reference card for the 1060

MrMoo
Sep 14, 2000

Is Ars wrong again?

quote:

Like the GTX 1080 and GTX 1070, the GTX 1060 will be available from manufacturers like Asus, Zotac, and Gigabyte, as well as directly from Nvidia in Founders Edition form at a higher $299 (~£260) price. The extra $50 buys a dual-FET power supply, as well as a similar blower-style cooler to the more expensive Pascal cards, albeit one made out of plastic rather than metal and measuring a shorter 240mm.
http://arstechnica.com/gadgets/2016/07/nvidia-gtx-1060-specs-price-release-date/

Endless Mike
Aug 13, 2003



the gigabyte g1 1070 i got is overclocked and cost less than an fe 1070 lmao

hobbesmaster
Jan 28, 2008

don't the founders edition cards ship before the others? so its like preordering a card to get it asap instead of waiting for the trickle of cards to hit stores during a soft launch

i mean, i am assuming there is a technical reason. it might also be a way to keep the other OEMs happy so nvidia can say "we'll price the reference card really high so we won't compete with you as directly"

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy

hobbesmaster posted:

i mean, i am assuming there is a technical reason. it might also be a way to keep the other OEMs happy so nvidia can say "we'll price the reference card really high so we won't compete with you as directly"

they definitely are competing with partners when nvidia.com has 1080's in stock and newegg/amazon etc don't

hobbesmaster
Jan 28, 2008

thats why i suspected its a "soft launch" - low yield, but they can push out a trickle at a high premium

Endless Mike
Aug 13, 2003



I had to drive to the Baltimore Microcenter to get my 1070.

SO DEMANDING
Dec 27, 2003

why are nerds so bad at waiting a couple months for card prices/stock to level out

oh whoops forgot

Farmer Crack-rear end posted:

loving consumerism

triple sulk
Sep 17, 2014



Endless Mike posted:

I had to drive to the Baltimore Microcenter to get my 1070.

microcenter owns except they work on commission so they bother the poo poo out of you every two seconds

Endless Mike
Aug 13, 2003



triple sulk posted:

microcenter owns except they work on commission so they bother the poo poo out of you every two seconds

they never bug me at all, but that's because i usually order online and just pick up when i get there

triple sulk
Sep 17, 2014



Endless Mike posted:

they never bug me at all, but that's because i usually order online and just pick up when i get there

yeah it's the best option

Endless Mike
Aug 13, 2003



tbf the nearest one is 40 minutes away, so it's not like i'm popping in for an hdmi cable or w/e

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

never buy reference design cards unless you love the whine of a lovely fan that you will have to void the warranty to replace

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

if a 250w server gpu can run off a loving passive heatsink then amd/nvidia can do better with their lovely reference designs

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

BangersInMyKnickers posted:

if a 250w server gpu can run off a loving passive heatsink then amd/nvidia can do better with their lovely reference designs

a passive heatsink

with a dozen howling 20,000 rpm Delta fans blasting through it 24/7

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

atomicthumbs posted:

a passive heatsink

with a dozen howling 20,000 rpm Delta fans blasting through it 24/7

being cooled with the waste heat of the hard drives, memory, and processors

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

BangersInMyKnickers posted:

being cooled with the waste heat of the hard drives, memory, and processors
lol

spankmeister
Jun 15, 2008






Vintersorg posted:

im all for competition buy why does AMD have to suck so loving much

They're playing their cards right with the 480 because they can be the price/performance leader in the $200 segment which is a huge slice of the market.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

BangersInMyKnickers posted:

being cooled with the waste heat of the hard drives, memory, and processors

:cripes:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
mark my words, we have not seen the last of 480s burning out motherboards.

PleasureKevin
Jan 2, 2011

Farmer Crack-rear end posted:

also all the comments are saying the cards are all going to actually sell for ~$50 over MSRP? i got my 760 for like $260, have video cards actually got more expensive? wtf happened here


Farmer Crack-rear end posted:

what the gently caress is a "founders edition"

PleasureKevin
Jan 2, 2011

Alereon posted:

mark my words, we have not seen the last of 480s burning out motherboards.

from what i've read, the average is ~5W over the limit, with very infrequent spikes higher than that. that is when playing maxed games at 4k that aren't even playable like that.

compare that to other cards, especially overclocked, and they often spike to 100W+ from the mobo.

it seems pretty overblown to me, but apparently a bitcoiner did fry some cards or a board, not sure which.

Agile Vector
May 21, 2007

scrum bored



Endless Mike posted:

they never bug me at all, but that's because i usually order online and just pick up when i get there

they don't bother me much but it's the corporate test store and in an area with little else computer related so theres lots of other more confused looking customers

maniacdevnull
Apr 18, 2007

FOUR CUBIC FRAMES
DISPROVES SOFT G GOD
YOU ARE EDUCATED STUPID

Endless Mike posted:

I had to drive to the Baltimore Microcenter to get my 1070.

the one off perring pkwy? it's the best and the fact that there's a harbor freight across the street is wonderful. money is dumb i want cool things instead apparently.

Phoenixan
Jan 16, 2010

Just Keep Cool-idge

PleasureKevin posted:

from what i've read, the average is ~5W over the limit, with very infrequent spikes higher than that. that is when playing maxed games at 4k that aren't even playable like that.

compare that to other cards, especially overclocked, and they often spike to 100W+ from the mobo.

it seems pretty overblown to me, but apparently a bitcoiner did fry some cards or a board, not sure which.
lol @ bitminers

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PleasureKevin posted:

from what i've read, the average is ~5W over the limit, with very infrequent spikes higher than that. that is when playing maxed games at 4k that aren't even playable like that.

compare that to other cards, especially overclocked, and they often spike to 100W+ from the mobo.

it seems pretty overblown to me, but apparently a bitcoiner did fry some cards or a board, not sure which.
no other cards had similar sustained power draw though. motherboards arent really meant for that and plenty of them have trouble with single cards under spec. good motherboards wouldnt but plenty of people bought gigabyte or the cheapest asrock

Lutha Mahtin
Oct 10, 2010

Your brokebrain sin is absolved...go and shitpost no more!

apparently razer already makes a usb-c thunderbolt enclosure that fits a desktop graphics card! downside is that it costs $500 and that doesn't include any graphics card :v:

http://www.razerzone.com/gaming-systems/razer-blade-stealth

Endless Mike
Aug 13, 2003



Lutha Mahtin posted:

apparently razer already makes a usb-c thunderbolt enclosure that fits a desktop graphics card! downside is that it costs $500 and that doesn't include any graphics card :v:

http://www.razerzone.com/gaming-systems/razer-blade-stealth

the other downside is that razer makes it

skimothy milkerson
Nov 19, 2006

Endless Mike posted:

the other downside is that razer makes it

KOTEX GOD OF BLOOD
Jul 7, 2012

BangersInMyKnickers posted:

never buy reference design cards unless you love the whine of a lovely fan that you will have to void the warranty to replace
don;t sign ur poasts

PleasureKevin
Jan 2, 2011



in Doom 2016 with Vulkan, an RX480 matches a GTX 980 (also using Vulkan)

Endless Mike
Aug 13, 2003



nice. amd's top of the line card is almost as good as nvidia's last-generation best card. all while destroying motherboards!

FormatAmerica
Jun 3, 2005
Grimey Drawer
Also there's gonna be a $150 1060 so :byewhore: to AMD's low-end market share.

ate shit on live tv
Feb 15, 2004

by Azathoth

Cybernetic Vermin posted:

they exist only because intel knew being the only cpu maker in a bunch of markets would cause more trouble than secretly propping up a terrible competitor would be :ssh:

I'm pretty sure it's because of the x64 architecture patent that AMD owns and Intel pays royalties on. It's more complicated than that, but basically neither company owns the full specs for 64-bit computing.

Deep Dish Fuckfest
Sep 6, 2006

Advanced
Computer Touching


Toilet Rascal
how is it that amd ended up being the ones that introduced the x64 extensions to x86 anyways? it was initially called amd64 and all, so i assume they designed it. was intel still going "no guys, itanium is totally going to take off any fiscal quarter now!!!" and not keen on what they saw as competing with themselves at the time?

Maximum Leader
Dec 5, 2014
that was the "one core should be enough for anyone" and "p4 can reach 10ghz" era intel

Maximum Leader
Dec 5, 2014
and also itanium i guess


BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

YeOldeButchere posted:

how is it that amd ended up being the ones that introduced the x64 extensions to x86 anyways? it was initially called amd64 and all, so i assume they designed it. was intel still going "no guys, itanium is totally going to take off any fiscal quarter now!!!" and not keen on what they saw as competing with themselves at the time?

intel thought itanium was going to be the future and pae would extend the life of x86-32 for another 5 to 10 years. Microsoft disagreed and AMD offered them a better compatibility path with x86-64 so obviously everything went that direction
