|
Farmer Crack-rear end posted:...isn't that what the 1070/1080 are for?
|
# ? Jul 7, 2016 18:48 |
|
|
remember when cards launched there were reference designs? first you'd get ones made by nvidia themselves then the aib vendors would put out rebadged versions. then after a month or something they'd put out their own updated versions with different coolers / vrms and maybe factory overclocks? that's all the fe is. the reference copy. they just decided to make it more expensive now because a) they can b) early adopter tax c) to 'not compete' with the varied designs from the aibs
|
# ? Jul 7, 2016 19:02 |
|
nvidia won't be producing a reference card for the 1060
|
# ? Jul 7, 2016 19:05 |
|
Is Ars wrong again? quote:Like the GTX 1080 and GTX 1070, the GTX 1060 will be available from manufacturers like Asus, Zotac, and Gigabyte, as well as directly from Nvidia in Founders Edition form at a higher $299 (~£260) price. The extra $50 buys a dual-FET power supply, as well as a similar blower-style cooler to the more expensive Pascal cards, albeit one made out of plastic rather than metal and measuring a shorter 240mm.
|
# ? Jul 7, 2016 19:17 |
|
the gigabyte g1 1070 i got is overclocked and cost less than an fe 1070 lmao
|
# ? Jul 7, 2016 19:20 |
|
don't the founders edition cards ship before the others? so its like preordering a card to get it asap instead of waiting for the trickle of cards to hit stores during a soft launch. i mean, i am assuming there is a technical reason. it might also be a way to keep the other OEMs happy so nvidia can say "we'll price the reference card really high so we won't compete with you as directly"
|
# ? Jul 7, 2016 19:24 |
|
hobbesmaster posted:i mean, i am assuming there is a technical reason. it might also be a way to keep the other OEMs happy so nvidia can say "we'll price the reference card really high so we won't compete with you as directly" they definitely are competing with partners when nvidia.com has 1080's in stock and newegg/amazon etc don't
|
# ? Jul 7, 2016 19:30 |
|
thats why i suspected its a "soft launch" - low yield, but they can push out a trickle at a high premium
|
# ? Jul 7, 2016 19:38 |
|
I had to drive to the Baltimore Microcenter to get my 1070.
|
# ? Jul 7, 2016 19:53 |
|
why are nerds so bad at waiting a couple months for card prices/stock to level out oh whoops forgot Farmer Crack-rear end posted:loving consumerism
|
# ? Jul 7, 2016 19:56 |
|
Endless Mike posted:I had to drive to the Baltimore Microcenter to get my 1070. microcenter owns except they work on commission so they bother the poo poo out of you every two seconds
|
# ? Jul 7, 2016 19:59 |
|
triple sulk posted:microcenter owns except they work on commission so they bother the poo poo out of you every two seconds they never bug me at all, but that's because i usually order online and just pick up when i get there
|
# ? Jul 7, 2016 20:16 |
|
Endless Mike posted:they never bug me at all, but that's because i usually order online and just pick up when i get there yeah it's the best option
|
# ? Jul 7, 2016 20:27 |
|
tbf the nearest one is 40 minutes away, so it's not like i'm popping in for an hdmi cable or w/e
|
# ? Jul 7, 2016 20:31 |
|
never buy reference design cards unless you love the whine of a lovely fan that you will have to void the warranty to replace
|
# ? Jul 7, 2016 20:50 |
|
if a 250w server gpu can run off a loving passive heatsink then amd/nvidia can do better with their lovely reference designs
|
# ? Jul 7, 2016 20:51 |
|
BangersInMyKnickers posted:if a 250w server gpu can run off a loving passive heatsink then amd/nvidia can do better with their lovely reference designs a passive heatsink† †with a dozen howling 20,000 rpm Delta fans blasting through it 24/7
|
# ? Jul 7, 2016 21:51 |
|
atomicthumbs posted:a passive heatsink† being cooled with the waste heat of the hard drives, memory, and processors
|
# ? Jul 7, 2016 22:34 |
|
BangersInMyKnickers posted:being cooled with the waste heat of the hard drives, memory, and processors
|
# ? Jul 7, 2016 22:43 |
|
Vintersorg posted:im all for competition but why does AMD have to suck so loving much They're playing their cards right with the 480 because they can be the price/performance leader in the $200 segment which is a huge slice of the market.
|
# ? Jul 7, 2016 23:36 |
|
|
mark my words, we have not seen the last of 480s burning out motherboards.
|
# ? Jul 8, 2016 02:22 |
|
Farmer Crack-rear end posted:also all the comments are saying the cards are all going to actually sell for ~$50 over MSRP? i got my 760 for like $260, have video cards actually got more expensive? wtf happened here Farmer Crack-rear end posted:what the gently caress is a "founders edition"
|
# ? Jul 8, 2016 02:39 |
|
Alereon posted:mark my words, we have not seen the last of 480s burning out motherboards. from what i've read, the average is ~5W over the limit, with very infrequent spikes higher than that. that is when playing maxed games at 4k that aren't even playable like that. compare that to other cards, especially overclocked, and they often spike to 100W+ from the mobo. it seems pretty overblown to me, but apparently a bitcoiner did fry some cards or a board, not sure which.
|
# ? Jul 8, 2016 02:45 |
|
Endless Mike posted:they never bug me at all, but that's because i usually order online and just pick up when i get there they don't bother me much but it's the corporate test store and in an area with little else computer related so theres lots of other more confused looking customers
|
# ? Jul 8, 2016 02:56 |
|
Endless Mike posted:I had to drive to the Baltimore Microcenter to get my 1070. the one off perring pkwy? it's the best and the fact that there's a harbor freight across the street is wonderful. money is dumb i want cool things instead apparently.
|
# ? Jul 8, 2016 03:02 |
|
PleasureKevin posted:from what i've read, the average is ~5W over the limit, with very infrequent spikes higher than that. that is when playing maxed games at 4k that aren't even playable like that.
|
# ? Jul 8, 2016 03:06 |
|
|
apparently razer already makes a usb-c thunderbolt enclosure that fits a desktop graphics card! downside is that it costs $500 and that doesn't include any graphics card http://www.razerzone.com/gaming-systems/razer-blade-stealth
|
# ? Jul 10, 2016 21:05 |
|
Lutha Mahtin posted:apparently razer already makes a usb-c thunderbolt enclosure that fits a desktop graphics card! downside is that it costs $500 and that doesn't include any graphics card the other downside is that razer makes it
|
# ? Jul 11, 2016 01:55 |
Endless Mike posted:the other downside is that razer makes it
|
|
# ? Jul 11, 2016 02:21 |
|
BangersInMyKnickers posted:never buy reference design cards unless you love the whine of a lovely fan that you will have to void the warranty to replace
|
# ? Jul 11, 2016 05:43 |
|
in Doom 2016 with Vulkan, an RX480 matches a GTX 980 (also using Vulkan)
|
# ? Jul 13, 2016 19:07 |
|
nice. amd's top of the line card is almost as good as nvidia's last-generation best card. all while destroying motherboards!
|
# ? Jul 13, 2016 19:21 |
|
Also there's gonna be a $150 1060, so long to AMD's low-end market share.
|
# ? Jul 13, 2016 20:03 |
|
Cybernetic Vermin posted:they exist only because intel knew being the only cpu maker in a bunch of markets would cause more trouble than secretly propping up a terrible competitor would be I'm pretty sure it's because of the x64 architecture patent that AMD owns, and Intel pays royalties on. It's more complicated than that, but basically neither company owns the full specs for 64-bit computing.
|
# ? Jul 13, 2016 20:18 |
|
how is it that amd ended up being the ones that introduced the x64 extensions to x86 anyways? it was initially called amd64 and all, so i assume they designed it. was intel still going "no guys, itanium is totally going to take off any fiscal quarter now!!!" and not keen on what they saw as competing with themselves at the time?
|
# ? Jul 13, 2016 20:33 |
|
that was the "one core should be enough for anyone" and "p4 can reach 10ghz" era intel
|
# ? Jul 13, 2016 20:36 |
|
and also itanium i guess
|
# ? Jul 13, 2016 20:37 |
|
|
YeOldeButchere posted:how is it that amd ended up being the ones that introduced the x64 extensions to x86 anyways? it was initially called amd64 and all, so i assume they designed it. was intel still going "no guys, itanium is totally going to take off any fiscal quarter now!!!" and not keen on what they saw as competing with themselves at the time? intel thought itanium was going to be the future and pae would extend the life of x86-32 for another 5 to 10 years. Microsoft disagreed and AMD offered them a better compatibility path with x86-64 so obviously everything went that direction
|
# ? Jul 13, 2016 20:47 |