|
top is theory, bottom is practice I prefer this one because puppies
|
# ? Apr 14, 2015 17:16 |
|
Panty Saluter posted:top is theory, bottom is practice That's terrific -- especially puppies
|
# ? Apr 18, 2015 06:00 |
|
http://www.siliconbeat.com/2015/04/15/intel-said-to-offer-500-million-for-san-diego-based-via-telecom/ gently caress Altera, Intel can buy VIA. Wonder what Cyrix would cost
|
# ? Apr 19, 2015 16:28 |
|
WhyteRyce posted:http://www.siliconbeat.com/2015/04/15/intel-said-to-offer-500-million-for-san-diego-based-via-telecom/ Probably not much, since they've been defunct for almost 20 years. The last Cyrix processor I owned was a soldered-on BGA 486, I believe. I imagine you could get Cyrix for under a million bucks unless they do something else now. I don't know why Intel or AMD would want to own Cyrix. VIA wouldn't want to own them, and neither would Qualcomm; they don't even possess an x86 license to my knowledge.
|
# ? Apr 20, 2015 22:55 |
|
Amazingly, the Cyrix MediaGX microarchitecture lives on in the AMD Geode GX/LX ultra-low-power processors, not due to be EOLed until Q4 2017. Cyrix was bought by Nat Semi, then sold to VIA, and VIA sold Geode to AMD. I've seen some Geode chips in system-on-module form factors; I'm guessing they go into ATMs, digital signs, that kind of thing. http://www.amd.com/en-us/products/embedded/processors/lx
|
# ? Apr 21, 2015 08:56 |
|
MediaGX is just branding now. They replaced the original Cyrix design with what amounts to an old K7 Athlon core soon after they acquired it. When it was first released, though, MediaGX was kinda interesting as the first chipset-integrated IGP + south bridge that had almost-OK performance for pretty cheap. You could get a whole system for around $400-500 less than an Intel-based system. Too bad the CPU wasn't good even for its time, so they had to use a customized version of Windows optimized for that chip to get good performance. I don't know if you could even boot a regular version of Windows on one.
|
# ? Apr 22, 2015 00:05 |
|
Panty Saluter posted:http://www.userbenchmark.com/UserRun/201044 So I ran this benchmark because in my DBC Task Manager (essentially the Win8 task manager for Win7) the speed of my CPU says 0.80 GHz. Thankfully, though, I was not having problems like the previous user with his 0.80 GHz CPU. However, my benchmark showed a little something else funny. My SSD had a read speed of 505 MB/s. My Western Digital Green 3TB drive? 4,692 MB/s read. It says it has RAM caching enabled on it. I don't have any kind of RAM caching thing installed on my PC. I have a Mushkin Chronos, so I definitely don't have Samsung Magician. http://www.userbenchmark.com/UserRun/216509
|
# ? Apr 22, 2015 01:32 |
|
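That inflated 4,692 MB/s figure is what a RAM caching layer does to a disk benchmark: repeated reads get served from memory, so the tool ends up measuring RAM speed instead of the drive. A toy sketch of the mechanism (the `SlowDisk` and `RamCache` classes are purely illustrative, not any real caching software):

```python
# Toy model of a read-through RAM cache in front of a disk. Benchmarks
# that re-read the same blocks end up measuring the cache, not the drive.

class SlowDisk:
    """Stand-in for a hard drive; counts how often it is actually hit."""
    def __init__(self):
        self.reads = 0

    def read_block(self, n):
        self.reads += 1
        return b"x" * 4096  # pretend 4 KiB block of data


class RamCache:
    """Read-through cache: first access goes to disk, repeats come from RAM."""
    def __init__(self, disk):
        self.disk = disk
        self.store = {}
        self.hits = 0

    def read_block(self, n):
        if n in self.store:
            self.hits += 1  # served at RAM speed
        else:
            self.store[n] = self.disk.read_block(n)  # served at disk speed
        return self.store[n]


disk = SlowDisk()
cache = RamCache(disk)

# A benchmark pass that reads 100 blocks twice:
for _ in range(2):
    for block in range(100):
        cache.read_block(block)

print(disk.reads)  # 100 -- the drive was only touched once per block
print(cache.hits)  # 100 -- the whole second pass never left RAM
```

If the benchmark's timing window mostly covers that second pass, the reported "disk" throughput is really memory throughput, which is how a 3TB Green drive can appear faster than an SSD.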
PC LOAD LETTER posted:MediaGX is just branding now. They replaced the original Cyrix design with what amounts to an old K7 Athlon core soon after they acquired it. I don't know anything about them needing a special version of Windows, nor can I find any info about that. We sold some systems with MediaGX chips at a computer shop I worked at back in the day, and we just used regular Windows 95 (and maybe 98, but I don't remember for sure) on them.
|
# ? Apr 22, 2015 13:41 |
|
That was what I remembered about them. Apparently it's not really correct. There was some bug with installing Win98 on them, and a default Win98 install disc won't work without some shenanigans.
|
# ? Apr 22, 2015 14:56 |
|
PC LOAD LETTER posted:That was what I remembered about them. Now that you mention that, I kind of do remember some issue like that, but I don't remember the details. Something like you had to have a driver on the install disk that didn't come on it by default.
|
# ? Apr 22, 2015 16:10 |
|
Sorry for the dumb question, but realistically, if I am just using my computer for gaming, how much life does this i7 2600K have in it? I really don't want to upgrade my motherboard + CPU to game, so when it comes to that I will probably just switch to consoles again.
|
# ? Apr 22, 2015 23:13 |
|
loudog999 posted:Sorry for the dumb question, but realistically, if I am just using my computer for gaming, how much life does this i7 2600k have in it? I really dont want to upgrade my motherboard + cpu to game so when it comes to that I will probably just switch to consoles again. More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out.
|
# ? Apr 22, 2015 23:20 |
|
evilweasel posted:More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out. Great, thanks. This stuff is still all Greek to me and you guys are always a bunch of help. Cheers
|
# ? Apr 22, 2015 23:31 |
|
If it's still stock you can get a ton of mileage out of it via overclocking. I still have mine at 4.6GHz from day 1 and it runs quiet and cool under a massive gently caress-off HR-02.
|
# ? Apr 23, 2015 00:07 |
|
Just going to put this in here. What kind of processor do you guys recommend for Autocad? Intel Core i7-4790K?
|
# ? Apr 23, 2015 00:40 |
|
evilweasel posted:More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out. I wonder how soon we're going to start seeing Moore's Law degrading rapidly as the number of transistors per square inch we can increase successive die generations by approaches "zero, unless you like massive electron migration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at that point.
|
# ? Apr 23, 2015 00:42 |
|
the1onewolf posted:Just going to put this in here. A fast CPU doesn't mean much if you have a lovely GPU. While a fast processor is important, make sure you have an appropriate GPU (Quadro or FireGL) and a lot of fast RAM if you plan on doing professional workflows.
|
# ? Apr 23, 2015 00:57 |
|
Kazinsal posted:I wonder how soon we're going to start seeing Moore's Law degrading rapidly as the number of transistors per square inch we can increase successive die generations by approaches "zero, unless you like massive electron migration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at that point. Moore's original paper had a forecast horizon of 10 years. For the last 50 years, the industry leaders (including Intel) have said that they had line of sight to continue Moore's Law for the next 10 years, but no idea beyond that. All throughout, analysts have been predicting the death of Moore's Law. I haven't heard Intel announce or say anything that indicates they don't have line of sight to Moore's Law over the next 10 years. I mean, before immersion lithography was developed, some people thought that sub-65nm was never going to work. And there are papers published 10 years ago saying that EUV probably wouldn't be able to work in a high-volume setting.
|
# ? Apr 23, 2015 01:06 |
|
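The 10-year forecast horizon above is easy to put numbers on. A back-of-envelope sketch, assuming the classic doubling of transistor count every two years (the real cadence has varied, and the starting count below is an order-of-magnitude guess, not a datasheet figure):

```python
# Back-of-envelope Moore's Law projection: transistor count doubling
# every 2 years. Starting count and cadence are assumptions, not data.

def project(start_transistors, years, doubling_period=2.0):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start_transistors * 2 ** (years / doubling_period)

start = 1.4e9  # roughly a 2015-era quad-core die, order of magnitude only
for years in (2, 6, 10):
    print(f"+{years:2d} yrs: {project(start, years):.2e} transistors")

# Ten years at that pace is a 32x increase, which is why every
# "line of sight for the next decade" claim is such a strong statement.
print(project(start, 10) / start)  # 32.0
```

Framed that way, each rolling 10-year claim commits the industry to another ~32x, so it is not surprising that analysts keep predicting the law's death and fabs keep needing tricks like immersion lithography and EUV to deliver it.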
canyoneer posted:Moore's original paper had a forecast horizon of 10 years. For the last 50 years, the industry leaders (including Intel) have said that they had line of sight to continue Moore's Law for the next 10 years, but no idea beyond that. All throughout, analysts have been predicting the death of Moore's Law. The difference is that the problems in the future are becoming basic physics barriers, rather than it merely being inconceivable that you could build accurately at that scale.
|
# ? Apr 23, 2015 02:26 |
|
For anyone who cares about upcoming tech and some science/engineering behind it, I highly recommend this: http://www.realworldtech.com/intel-10nm-qwfet/
|
# ? Apr 23, 2015 03:48 |
|
mayodreams posted:A fast CPU doesn't mean much if you have a lovely GPU. While a fast processor is important, make sure you have an appropriate GPU (Quadro or FireGL) and a lot of fast RAM if you plan on doing professional workflows. Is having one of the "designer" cards like the Quadro or FireGL important, or will something like the GTX 970 be okay?
|
# ? Apr 23, 2015 04:56 |
|
the1onewolf posted:Is having one of the "designer" cards like the Quadro or FireGL important, or will something like the GTX 970 be okay? Quadro and FireGL cards have drivers specially designed for those programs. This is what gives them the advantage over a normal desktop GPU. A K2200 Quadro is $430. http://www.newegg.com/Product/Product.aspx?Item=N82E16814133559
|
# ? Apr 23, 2015 13:52 |
|
You don't have to go crazy on a Quadro/FireGL card, but like SlayVus said, the drivers are validated for professional applications and will work better than consumer cards. Professional cards focus on accuracy and stability rather than just speed.
|
# ? Apr 23, 2015 17:05 |
|
mayodreams posted:You don't have to go crazy on a Quadro/FireGL card, but like SlayVus said, the drivers are validated for professional applications and will work better than consumer cards. Professional cards focus on accuracy and stability rather than just speed. Indeed, if you're doing nothing but 2D raster and no compute, the only advantage enterprise cards have is support. AutoCAD's recommended specs are a bit of a joke; sure guys, you totally need that slow 6-core Xeon for a single-threaded application that only benefits from pure CPU/memory clock speed (a baseline amount of CPU cache is necessary of course, anything above an i3 is fine). A consumer card with generous amounts of VRAM and a boatload of system RAM will usually suffice (24-32 GB appears to be the point of diminishing returns for my office's workload; we work entirely in Civil 3D). I built my personal PC as a proof of concept for the office with a 4690K/GTX 970/DDR3-2400 setup; naturally, when overclocked it dominates our HP Z420s for nearly 1/4 of the cost. Long-term reliability under our workloads is an unknown, but our HPs have quite frankly been huge overpriced pieces of poo poo that were clearly rushed. Mr SoupTeeth fucked around with this message at 17:36 on Apr 23, 2015 |
# ? Apr 23, 2015 17:22 |
|
AutoCAD barely supports multiple cores anyway
|
# ? Apr 23, 2015 17:55 |
|
go3 posted:AutoCAD barely supports multiple cores anyway The Xeon recommendation made more sense when consumer chipsets couldn't utilize as much RAM, but that's no longer an issue unless you need a comical amount. I've wondered how well a clocked-to-hell G3258 would chew on AutoCAD since more threads and cores don't do poo poo; the low cache is probably a deal breaker, but it'd be interesting nonetheless. Mr SoupTeeth fucked around with this message at 18:12 on Apr 23, 2015 |
# ? Apr 23, 2015 18:00 |
|
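The "more cores don't do poo poo" point above is just Amdahl's law: if almost all of the workload is single-threaded, extra cores buy nearly nothing, while a clock-speed bump scales everything. A quick sketch; the 5% parallel fraction below is an assumed, illustrative figure for a mostly single-threaded app like AutoCAD, not a measured one:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
# parallelizable fraction of the workload and n is the core count.

def amdahl_speedup(parallel_fraction, cores):
    """Overall speedup from running the parallel fraction on `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.05  # assume only ~5% of the work parallelizes (illustrative)
for cores in (2, 4, 6, 12):
    print(f"{cores:2d} cores -> {amdahl_speedup(p, cores):.3f}x")

# Even 12 cores yields under a 1.05x speedup at p = 0.05, while a 20%
# clock bump would give roughly 1.2x across the entire workload.
```

That is why a high-clocked unlocked quad (or even a dual-core, cache permitting) can beat a slow 6-core Xeon on this kind of software.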
Mr SoupTeeth posted:I built my personal PC as a proof of concept for the office with a 4690K/GTX 970/DDR3-2400 setup; naturally, when overclocked it dominates our HP Z420s for nearly 1/4 of the cost. Long-term reliability under our workloads is an unknown, but our HPs have quite frankly been huge overpriced pieces of poo poo that were clearly rushed. I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium periods, but running a workstation into the ground by overclocking it for real work is a very bad idea. What you are paying for in an OEM workstation-class machine is reliability and fast turnaround on hardware support. Woe is you the first time one of your overclocked gamin' rigs breaks during a production issue and the lovely motherboard manufacturer won't do an advance RMA.
|
# ? Apr 23, 2015 18:14 |
|
mayodreams posted:I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium periods, but running a workstation into the ground by overclocking it for real work is a very bad idea. To be fair, it doesn't really require obscene voltage to smash the Xeons in IPC, and just having that unlocked multiplier and support for memory frequencies above DDR3-1866 gives it a pretty big advantage. A properly overbuilt setup like my personal machine would have to try real hard to fail as often as the HP cut-rate garbage we're using, and we don't exactly push our machines hard. The support is definitely the biggest reason to go with a workstation, as long as it isn't HP (this goddamn company sent out the Z420s with recovery software that doesn't work, because they never tested it and learned of the AHCI legacy/UEFI bullshit). Their hardware turnaround time has been relatively decent when they get around to shipping us the right goddamn part.
|
# ? Apr 23, 2015 18:28 |
|
mayodreams posted:I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium periods, but running a workstation into the ground by overclocking it for real work is a very bad idea. At that point it's still probably cheaper to have various spares on hand if you're competent at replacing hardware in a small-business scenario. Big corporation? gently caress it, use what they give you and make the IT dept deal with broken machines.
|
# ? Apr 23, 2015 18:30 |
|
Prescription Combs posted:At that point it's still probably cheaper to have various spares on hand if you're competent at replacing hardware in a small-business scenario. This. I am just trying to throw some caution from experience out there. I only build my own machines, both desktops and servers, but you should really never roll your own in a business environment unless everyone understands the risks involved. That 25% performance increase sounds great until something bad happens at a deadline and poo poo gets real. If you buy good hardware, you can have a good experience with white-box parts, but the stakeholders need to know what it will take to resolve issues. I built out some workstation-class machines using the EVGA dual-socket 1366 boards and Xeon processors for my client that were rock solid and ran forever. They had way fewer issues than the boutique "Graphics" OEM that did stupid poo poo like liquid cooling and overclocking. Not to mention that my builds were about 2/3 the cost of the boutique's, but still were easily $3k each. Sure it can be done, but I was charging $50/hr to build, test, and deploy the machines, so that adds up too. I don't want to turn this into an IT thread, but there is a huge difference in usage cases and approaches between building out a gamin' rig at home and depending on workstations to make money, particularly when deadlines and idle engineers/artists are involved.
|
# ? Apr 23, 2015 18:58 |
|
mayodreams posted:I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium periods, but running a workstation into the ground by overclocking it for real work is a very bad idea. To reinforce this: I work at a game dev studio. We used to use custom-built gaming PCs for dev work, but the reliability problem became a major liability. I got us switched over to actual OEM workstations, and while we lost a bit of flexibility in terms of system configs and cost, we gained so much more in terms of reliability and tech support. The productivity gain from not having custom PCs break all the time by far outweighs the extra system cost. It's not even all that close, actually. For a Serious Work Computer it's almost always worth it to buy a real workstation-class computer.
|
# ? Apr 23, 2015 21:35 |
|
Number19 posted:To reinforce this: I work at a game dev studio. We used to use custom-built gaming PCs for dev work, but the reliability problem became a major liability. I got us switched over to actual OEM workstations, and while we lost a bit of flexibility in terms of system configs and cost, we gained so much more in terms of reliability and tech support. The productivity gain from not having custom PCs break all the time by far outweighs the extra system cost. It's not even all that close, actually. I don't disagree, but the massive decline in certain OEMs' quality and support recently makes me hesitant to recommend OEM workstations over rolling your own in a small-business environment (consumer or enterprise hardware) if the expertise is available to build and maintain them. The difference in component and configuration quality between our HPs from 4 years ago and our current ones is apparent, and the perpetual failures and baffling configurations are a testament to this. When spending thousands on a machine, it's nice to know you're getting the best components for the money, rather than the knowledge that if they can save 1/4 of a cent cutting corners, they will. Our biggest liabilities have been our newest workstations. Inversely, when we had some local boutique make custom workstations years ago, they were even bigger liabilities.
|
# ? Apr 23, 2015 23:00 |
|
All my recent Z-Series have been bulletproof. Even the older ones I kept around have been going 24/7 for a long time and I've only had one keel over. Sure I've had a hard drive or two fail but that's well within expectations. We're a pretty small shop and it was still worth it.
|
# ? Apr 24, 2015 02:30 |
|
Mr SoupTeeth posted:I don't disagree, but the massive decline in certain OEMs' quality and support recently makes me hesitant to recommend OEM workstations over rolling your own in a small-business environment (consumer or enterprise hardware) if the expertise is available to build and maintain them. The difference in component and configuration quality between our HPs from 4 years ago and our current ones is apparent, and the perpetual failures and baffling configurations are a testament to this. When spending thousands on a machine, it's nice to know you're getting the best components for the money, rather than the knowledge that if they can save 1/4 of a cent cutting corners, they will. Our biggest liabilities have been our newest workstations. Inversely, when we had some local boutique make custom workstations years ago, they were even bigger liabilities. HP is garbage hth
|
# ? Apr 24, 2015 03:56 |
|
We have HP Z8XX towers here and the driver support is an absolute joke. The users have no end of random intermittent problems with them. Not sure how to apportion blame between the SOE and HP, but it's probably at least a little on each side.
|
# ? Apr 24, 2015 04:28 |
|
Kazinsal posted:I wonder how soon we're going to start seeing Moore's Law degrading rapidly as the number of transistors per square inch we can increase successive die generations by approaches "zero, unless you like massive electron migration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at that point. The "Xnm" term you see isn't actually accurate; it's a marketing term.
|
# ? Apr 24, 2015 04:32 |
|
evilweasel posted:More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out. Except Dwarf Fortress. I bought a 4790K and now I get 150 FPS!
|
# ? Apr 24, 2015 17:46 |
|
Some leaked Skylake benches
|
# ? Apr 25, 2015 17:09 |
|
I thought that there wasn't going to be a Skylake K chip. I thought it was going to be the Skylake S? edit: I found an article that answers my question and lists the specs of the different chips that are expected at launch. http://wccftech.com/intels-6th-generation-skylake-s-processor-lineup-leaked-core-i7-6700k-leads-pack-10-skus-detailed-samples-spotted/ Lowen SoDium fucked around with this message at 19:34 on Apr 25, 2015 |
# ? Apr 25, 2015 19:28 |
|
Wccftech is the Daily Mail of tech news.
|
# ? Apr 25, 2015 21:53 |