DEUCE SLUICE
Feb 6, 2004

I dreamt I was an old dog, stuck in a honeypot. It was horrifying.

kill your idols posted:

Finally got everything installed and wrapped up. Another 16GB of ram came yesterday, so this setup is done.

Focus: :eng101: VMware.

Setup: Beefy AIO host, shared storage served out over NFS/iSCSI back to the host, and three vESXi hosts for vGoodies.

What are you going to use for the VM to present the storage? FreeNAS?

edit:

So you have the machine booting into ESXi (from a USB stick?) with the three vESXi vmdk's and the NAS vmdk located on the SSD, and any VMs running off of the vESXi hosts have their vmdk's located on the NAS datastore, right?

DEUCE SLUICE fucked around with this message at 00:04 on Aug 30, 2013


kill your idols
Sep 11, 2003

by T. Finninho

DEUCE SLUICE posted:

What are you going to use for the VM to present the storage? FreeNAS?

edit:

So you have the machine booting into ESXi (from a USB stick?) with the three vESXi vmdk's and the NAS vmdk located on the SSD, and any VMs running off of the vESXi hosts have their vmdk's located on the NAS datastore, right?

ESXi is installed to a USB drive on the motherboard's internal USB slot. (Supermicro builds these into most of their boards.)

Datastore01 is the SSD, which holds a FreeNAS VM with the HBA passed through; the resulting RAID10 storage is served out over iSCSI/NFS back to the host as datastore03.

Datastore02 is the 5th 2TB drive, for backups and ISOs.
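
For anyone wanting to clone this loopback setup: once the FreeNAS VM is exporting a share, pointing the host back at it is one line from the ESXi shell. A minimal sketch, with made-up IP and share names (and remember the vESXi guests themselves need nested HV turned on, e.g. vhv.enable = "TRUE" in each guest's .vmx on 5.1+):

# address and export path are made up -- substitute your FreeNAS VM's
$ esxcli storage nfs add -H 10.0.0.50 -s /mnt/tank/vmstore -v datastore03
$ esxcli storage nfs list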

NYFreddy
Feb 16, 2009
Was pumped about my new C1100 w/ 72GB of RAM for $400, then I read this thread :(.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

NYFreddy posted:

Was pumped about my new C1100 w/ 72GB of RAM for $400, then I read this thread :(.

If the lab fits your needs, go for it. Chances are, though, that you could get more for your money.

Stealthgerbil
Dec 16, 2004


i like my c1100. it's not bad, it's just super loud, and they suck down power because of the dual processors. at like $300 vs $800 for a new e3 system, the c1100 is fine for the money if you're just messing around. they're still fast enough for most home lab uses.

Moey
Oct 22, 2010

I LIKE TO MOVE IT
Thanks to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750 gigs of non-redundant fast storage to spin up labs with.

SEKCobra
Feb 28, 2011

Hi
:saddowns: Don't look at my site :saddowns:

Moey posted:

Thanks to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750 gigs of non-redundant fast storage to spin up labs with.

The Amazon UK one? I cashed in on that as well.

kill your idols
Sep 11, 2003

by T. Finninho

Moey posted:

Thanks to what was most likely a price mistake yesterday, I ordered two more SSDs for my home ESXi box. I'll now have 750 gigs of non-redundant fast storage to spin up labs with.

Do tell. The 840 Pro from my MacBook is sitting unused now; I think I'm gonna throw it in my ESXi box instead of selling it. Fast storage all day. :jiggled:

The only reason I went with a bigger SSD for my main computer is that I need to Boot Camp into Windows. Trying to run the vSphere client in a VM inside Fusion is driving me nuts. A VM inside a VM, to manage a host, to manage eight other VMs, is just silly at this resolution.

kill your idols fucked around with this message at 19:10 on Sep 1, 2013

Moey
Oct 22, 2010

I LIKE TO MOVE IT

SEKCobra posted:

The Amazon UK one? I cashed in on that as well.

Oh yea. 250gb for 76ish is awesome.

Shipping time is like 5 weeks, so I will forget and it will be like Christmas when they arrive.

Edit:

Both orders canceled due to the price mistake. Rat farts.

Moey fucked around with this message at 16:16 on Sep 2, 2013

SEKCobra
Feb 28, 2011

Hi
:saddowns: Don't look at my site :saddowns:

Moey posted:

Oh yea. 250gb for 76ish is awesome.

Shipping time is like 5 weeks, so I will forget and it will be like Christmas when they arrive.

Edit:

Both orders canceled due to the price mistake. Rat farts.

Same here; this one seemed like it might go through. Too bad :(

DEUCE SLUICE
Feb 6, 2004

I dreamt I was an old dog, stuck in a honeypot. It was horrifying.

kill your idols posted:

Do tell. The 840 Pro from my MacBook is sitting unused now; I think I'm gonna throw it in my ESXi box instead of selling it. Fast storage all day. :jiggled:

The only reason I went with a bigger SSD for my main computer is that I need to Boot Camp into Windows. Trying to run the vSphere client in a VM inside Fusion is driving me nuts. A VM inside a VM, to manage a host, to manage eight other VMs, is just silly at this resolution.

I love how the vCenter web client integration plugin doesn't work on anything that the vsphere client doesn't already run on. :rolleyes:

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

DEUCE SLUICE posted:

I love how the vCenter web client integration plugin doesn't work on anything that the vsphere client doesn't already run on. :rolleyes:

You should have support for the plugin on Safari and other OSes in ~30 days, when 5.5 goes GA.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
I need some volunteers for some ICM stuff; please PM me or hit me up on Facebook.

Basically I need some people to PoC my environment for my VCAP:DCA/DCD and VCP:ICM design. It will probably run 60 days max. Basically I want people who will stress-test it.

I should have a ping back on logins by Wednesday.

alo
May 1, 2005


So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but...

At work I need to set up a lab. Normally the answer to this is "just get some older decommissioned machines," but in this case, I really don't have any older machines*. So let's set a budget of $1,500. I'm looking for some storage (probably going to be running some Solaris derivative for easy iSCSI/NFS), plus two ESXi hosts. This is a bit different from the home setup, since I don't pay the power bill or care about the noise; it's going in a cute little half rack in the corner of our DC.

So let's start out with two C1100s at $430 apiece (72GB of RAM, with rails, no drives). There are 32GB models for a few bucks less, but meh. Is there anything that's going to beat that, in a rack form factor?

If I go with those two, I'll have $640 left over for some storage. I'd like to throw at least one SSD in there as well as a few 3.5-inch drives. What's a good choice here? As noted below, I do have some older machines, if there's a particularly sweet DAS or SAS expander setup.

Oh, and I have a pile of terrible Dell 5224 switches sitting around; is there anything better in the ~$200 range?

Tell me about your dream setups that your wife/mom won't let you have.

* I have a pile of old 2950s (original, not III) sitting around with 4GB of RAM in them... I'm not really looking to buy loose RAM for some even crappier Xeons.

Count Thrashula
Jun 1, 2003

Peak Performance.

Buglord
I tried to snipe an auction for a Juniper J2320 but lost it because I wasn't logged in and that messed up my sniping by like 3 seconds.

It ended at $28.

gently caress.

I'm gonna go take a walk.

(edit -- VBoxing a Juniper machine is semi-feasible, but it doesn't give you full functionality and has no switching capability at all. So a physical box would be great to have.)

Tekhne
Sep 11, 2001

alo posted:

So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but...

At work I need to set up a lab. Normally the answer to this is "just get some older decommissioned machines," but in this case, I really don't have any older machines*. So let's set a budget of $1,500. I'm looking for some storage (probably going to be running some Solaris derivative for easy iSCSI/NFS), plus two ESXi hosts. This is a bit different from the home setup, since I don't pay the power bill or care about the noise; it's going in a cute little half rack in the corner of our DC.

So let's start out with two C1100s at $430 apiece (72GB of RAM, with rails, no drives). There are 32GB models for a few bucks less, but meh. Is there anything that's going to beat that, in a rack form factor?

If I go with those two, I'll have $640 left over for some storage. I'd like to throw at least one SSD in there as well as a few 3.5-inch drives. What's a good choice here? As noted below, I do have some older machines, if there's a particularly sweet DAS or SAS expander setup.

Oh, and I have a pile of terrible Dell 5224 switches sitting around; is there anything better in the ~$200 range?

Tell me about your dream setups that your wife/mom won't let you have.

* I have a pile of old 2950s (original, not III) sitting around with 4GB of RAM in them... I'm not really looking to buy loose RAM for some even crappier Xeons.


I purchased the C6100 for my home lab recently and am very happy with it. Mine has four nodes, each with 24GB of RAM and dual Xeon L5520s. This particular model has 12 HDD slots - normally three go to each node, but with a little modding you can have all 12 go to one node. In my case I have all 12 going to a node running FreeNAS for now. There are some trays you can buy for a few bucks to fit an SSD into it. This particular seller accepted my offer of $769.99 (with some haggling, so start low). Additional info on this model can be found here. As for power usage, all four nodes powered on and idle draw 121.1 watts.

evol262
Nov 30, 2010
#!/usr/bin/perl

alo posted:

So this isn't homelab material (I already have a Norco with an E3 Xeon loaded with M1015s and drives -- it's terrific and quiet)... but...

At work I need to set up a lab. Normally the answer to this is "just get some older decommissioned machines," but in this case, I really don't have any older machines*. So let's set a budget of $1,500. I'm looking for some storage (probably going to be running some Solaris derivative for easy iSCSI/NFS), plus two ESXi hosts. This is a bit different from the home setup, since I don't pay the power bill or care about the noise; it's going in a cute little half rack in the corner of our DC.

So let's start out with two C1100s at $430 apiece (72GB of RAM, with rails, no drives). There are 32GB models for a few bucks less, but meh. Is there anything that's going to beat that, in a rack form factor?

If I go with those two, I'll have $640 left over for some storage. I'd like to throw at least one SSD in there as well as a few 3.5-inch drives. What's a good choice here? As noted below, I do have some older machines, if there's a particularly sweet DAS or SAS expander setup.

Oh, and I have a pile of terrible Dell 5224 switches sitting around; is there anything better in the ~$200 range?

Tell me about your dream setups that your wife/mom won't let you have.

* I have a pile of old 2950s (original, not III) sitting around with 4GB of RAM in them... I'm not really looking to buy loose RAM for some even crappier Xeons.

The question at this point is really "what are you going to do with your lab?"

Tekhne posted:

I purchased the C6100 for my home lab recently and am very happy with it. Mine has four nodes, each with 24GB of RAM and dual Xeon L5520s. This particular model has 12 HDD slots - normally three go to each node, but with a little modding you can have all 12 go to one node. In my case I have all 12 going to a node running FreeNAS for now. There are some trays you can buy for a few bucks to fit an SSD into it. This particular seller accepted my offer of $769.99 (with some haggling, so start low). Additional info on this model can be found here. As for power usage, all four nodes powered on and idle draw 121.1 watts.

Noise at load: 77dba. That's a car driving 65mph passing you at 25 feet. Or a vacuum cleaner.
Noise at idle: 66dba. Standing next to a running dishwasher. Cash registers working.

Your offer of $769 would have purchased two current generation 8 core systems with 24GB of memory each. In some respects, it's "half" of the C6100. Except for the noise. And the IPC. And the bus speed. And the memory speed. And...

I'm glad you're happy. I just wish people would stop recommending recycled 5-year-old server kit for home labs.

Tekhne
Sep 11, 2001

evol262 posted:

Noise at load: 77dba. That's a car driving 65mph passing you at 25 feet. Or a vacuum cleaner.
Noise at idle: 66dba. Standing next to a running dishwasher. Cash registers working.

Your offer of $769 would have purchased two current generation 8 core systems with 24GB of memory each. In some respects, it's "half" of the C6100. Except for the noise. And the IPC. And the bus speed. And the memory speed. And...

I'm glad you're happy. I just wish people would stop recommending recycled 5-year-old server kit for home labs.


Not accurate in the least. I'm not sure why you feel the need to criticize my purchase every chance you get. Every post you make on this subject is so full of assumptions and inaccuracies it's ridiculous. If your criticisms were actually based on fact, then they might be valid. For starters, he was asking for recommendations on a work lab, not a home lab. He specifically stated he didn't care about power or noise. He also stated he was looking at the C1100s and wanted opinions on whether there are any better solutions out there for the price that fit into a rack. Additionally, he mentions that he'll need to create a storage array. Did I not address all of those with my post? Sure, there are plenty of other options, but you have yet to actually recommend one that fits his requirements.

Secondly, the noise is very minimal, certainly nothing like standing next to a running dishwasher. In fact, I just measured it with Noise Meter on my Android phone. Not the most accurate reading, I'm sure, but from exactly two feet away from the back of the chassis it measures 32.5dB. Five feet away, at my desk, it's 28.7dB. My ultimate plan is to put it in my rack in the basement, in which case I wouldn't hear it at all.

Additionally, this is not five-year-old server kit. In fact, this is a Gen 11 server that first came out in 2010; my particular unit has a build date in 2011. Most enterprises don't replace their servers but every 4-5 years. Considering this is a 2-3 year old server, I think it will manage. Yes, the Xeon L5520 was launched in 2009, but it is still supported by Intel and does the job just fine.

Just for kicks, why don't you make a build list of the components you would purchase for your 8-core system so we can see how it compares dollar for dollar. Be sure to include cases, power supplies, cables, etc., as not all of us have spare parts lying around. Once you make those two hosts, also add another for storage; I have mentioned twice now that I use FreeNAS (and note it is not a VM), so your recommendation also needs to account for storage. Since I make no mention of the drives I use, you don't need to spec those out. Maybe your next post can contribute something useful.

evol262
Nov 30, 2010
#!/usr/bin/perl

Tekhne posted:

Not accurate in the least. I'm not sure why you feel the need to criticize my purchase every chance you get. Every post you make on this subject is so full of assumptions and inaccuracies it's ridiculous. If your criticisms were actually based on fact, then they might be valid. For starters, he was asking for recommendations on a work lab, not a home lab. He specifically stated he didn't care about power or noise. He also stated he was looking at the C1100s and wanted opinions on whether there are any better solutions out there for the price that fit into a rack. Additionally, he mentions that he'll need to create a storage array. Did I not address all of those with my post? Sure, there are plenty of other options, but you have yet to actually recommend one that fits his requirements.
The comments about noise weren't aimed at alo at all, which is why he wasn't quoted when I commented on the noise, and why I asked "what are you going to do with your lab?", because recommendations vary after that. One SSD and a few 3.5" drives paired with two C1100s is going to leave you I/O starved. A full MD3000i with one PE2900 is going to leave you way overcommitted on CPU. It's a balancing act.

Tekhne posted:

Secondly, the noise is very minimal, certainly nothing like standing next to a running dishwasher. In fact, I just measured it with Noise Meter on my Android phone. Not the most accurate reading, I'm sure, but from exactly two feet away from the back of the chassis it measures 32.5dB. Five feet away, at my desk, it's 28.7dB. My ultimate plan is to put it in my rack in the basement, in which case I wouldn't hear it at all.
The noise levels came from the link you gave about 'additional info on this model'. Dell's spec sheet agrees. 30 decibels is literally whisper-quiet.

I'm not invested in getting people to buy/not buy C1100s, C6100s, or whatever except that I've run 1U and 2U equipment at home, and it's not a pleasant experience. It looks really good on paper, because you can get 8 cores and 72GB of memory in 1U, but that's far more capacity than the vast majority of home users need (labs included), and all the warts of server kit are hard to get around unless you have a rack in the basement or the garage. 40mm fans are often audible through floors even if it's in the garage.

Tekhne posted:

Additionally, this is not five-year-old server kit. In fact, this is a Gen 11 server that first came out in 2010; my particular unit has a build date in 2011. Most enterprises don't replace their servers but every 4-5 years. Considering this is a 2-3 year old server, I think it will manage. Yes, the Xeon L5520 was launched in 2009, but it is still supported by Intel and does the job just fine.
The L5520 was released in Q1 '09. Your server may have been built in 2011. Or 2013. The release date was around the end of March 2009. In six months, it'll be 5-year-old server kit. I'm rounding up very marginally.

Tekhne posted:

Just for kicks, why don't you make a build list of the components you would purchase for your 8-core system so we can see how it compares dollar for dollar. Be sure to include cases, power supplies, cables, etc., as not all of us have spare parts lying around. Once you make those two hosts, also add another for storage; I have mentioned twice now that I use FreeNAS (and note it is not a VM), so your recommendation also needs to account for storage. Since I make no mention of the drives I use, you don't need to spec those out. Maybe your next post can contribute something useful.
I'm really not interested in dollar-for-dollar comparisons. Especially against someone who's going to defend his purchase to the death with such vigor that he wants me to spec out another system because he has a FreeNAS node (not virtualized, eating 8 cores and 24GB of memory on your chassis, which just makes your "5 year old kit" vs. equivalent modern gear comparison look worse, since all cores and all memory are not equal). I also wouldn't bother speccing drives because your purchase didn't come with any. Gluster is fine. vSAN is fine. Local storage is also fine. It's a tradeoff between 60+ dBA servers with components you can't replace without scouring eBay, potentially limited HD compatibility, and unknown usage patterns on used equipment vs. flat consumer gear.

Generic case+PSU - $45
AM3+ motherboard with integrated graphics and 4 DIMM slots - $45
8 Core Zambezi - $150
16GB DIMM (2x8GB) - $108

Two of those is $696, assuming you buy right now and don't wait for any deals on hardware. Plus two 8GB (2x4GB) kits for $50 each puts it at $796 (which is only marginally more expensive than your purchase) for two evenly-specced systems. If you were willing to suffer with 4 cores per node (which is still plenty, honestly), you could bump it from 24GB/node to 32GB/node.

You don't get RAID controllers, hot-swappable drives (or any hot-swappable equipment), DRAC/iLO, and whatever else you want to use to justify your purchase. You do get consumer equipment which you can get replacements for at any Fry's or Microcenter. You only get half the memory (albeit with better/newer memory controllers than Nehalem CPUs) and half the CPUs (albeit with much newer architectures, better virtualization instructions, and more IPC). You also don't have a 1400W PSU (maybe two!). You don't have a server that's minimum (per your link and Dell's datasheet) 65dba.

Again, I'm glad you're happy. It's just not a good purchase for most people. It's a fine purchase if you have a half-rack in the corner of a datacenter that you want to set up a lab to play with in. My house doesn't have a datacenter.

E:

Just to be clear, I'm not trying to rag on your purchase of a C6100 in particular. I didn't remember it was you who purchased one previously. I'm just reiterating that "buying used Dell kit" isn't always the best or most practical solution.

evol262 fucked around with this message at 18:20 on Sep 13, 2013

alo
May 1, 2005


evol262 posted:

The question at this point is really "what are you going to do with your lab?"

A whole bunch of things. We don't currently have any extra hardware to test large changes in our environment, so it would be nice. I've been using my home setup to make sure things work before I deploy them, but there are limits to what I can do at home. I have to maintain my impeccable "never fucks poo poo up" record.

I'm in a very mixed environment where I'm technically a Linux sysadmin, but I end up touching storage, VMware, Windows and Windows clients (thankfully only on the deployment side) -- so it's really valuable to be able to play with stuff before making changes that would keep me at work past 5pm.

As for storage...

I actually have an MD3000i sitting around, but I wouldn't use it... it's a terrible device. I see people recommending the newer versions of it and I hope they've improved ( http://rtumaykin-it.blogspot.com/2012/04/fixing-unresponsive-management-ports-on.html as an example ).

I'm probably going to go the route of SSD + a few 3.5" drives and buy better stuff later if I need it (I have a pile of 10k SAS drives sitting around too). The question is really about enclosures, since I want to be flexible in that regard.

Thanks for the suggestion, Tekhne. Can you detail what "a little modding" actually is? I'm still leaning toward the C1100's with their 72gb of ram and a separate box for storage.

Oh and please be friends.

evol262
Nov 30, 2010
#!/usr/bin/perl

alo posted:

A whole bunch of things. We don't currently have any extra hardware to test large changes in our environment, so it would be nice. I've been using my home setup to make sure things work before I deploy them, but there are limits to what I can do at home. I have to maintain my impeccable "never fucks poo poo up" record.

I'm in a very mixed environment where I'm technically a Linux sysadmin, but I end up touching storage, VMware, Windows and Windows clients (thankfully only on the deployment side) -- so it's really valuable to be able to play with stuff before making changes that would keep me at work past 5pm.

As for storage...

I actually have an MD3000i sitting around, but I wouldn't use it... it's a terrible device. I see people recommending the newer versions of it and I hope they've improved ( http://rtumaykin-it.blogspot.com/2012/04/fixing-unresponsive-management-ports-on.html as an example ).

I'm probably going to go the route of SSD + a few 3.5" drives and buy better stuff later if I need it (I have a pile of 10k SAS drives sitting around too). The question is really about enclosures, since I want to be flexible in that regard.

Thanks for the suggestion, Tekhne. Can you detail what "a little modding" actually is? I'm still leaning toward the C1100's with their 72gb of ram and a separate box for storage.

Oh and please be friends.
The MD3000i actually has reasonably good iSCSI performance. If you provision a LUN and present it to a few ESXi hosts, you'll be hard-pressed to beat it for performance, even if you route all 12 bays on a C6100 to one node and present it via OpenFiler, Nexenta, or whatever. Obviously you could dump a 10GE NIC into one of those nodes, but I'm guessing you don't have a 10GE switch in your corner, either.

You'll have a hard time beating refurb C6100s or C1100s for a lab in a datacenter. Just make sure you get L5639s instead of L5520s. Four C1100s (dump your 10k drives into the chassis) with one datastore on the MD3000i and one on vSAN spread across the drives is very likely the best you'll do for $1,500.
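
If anyone needs the host side of that, wiring ESXi up to an iSCSI target like the MD3000i is a few esxcli lines: enable the software initiator, add a send-targets discovery address, rescan. A sketch only -- the array portal IP and vmhba number here are made up (check "esxcli iscsi adapter list" for yours):

$ esxcli iscsi software set --enabled=true
$ esxcli iscsi adapter discovery sendtarget add -A vmhba33 -a 10.0.0.60:3260
$ esxcli storage core adapter rescan -A vmhba33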

three
Aug 9, 2007

i fantasize about ndamukong suh licking my doodoo hole
Buy the loud Dell servers, then build a shed with an AC unit, raised floors, etc. in it and have your own datacenter. :eng101:

Stealthgerbil
Dec 16, 2004


I would love to build a home datacenter and get the fastest FiOS and Xfinity business plans. Add solar panels and a battery system and I could have a totally solar-powered micro datacenter.

evol262
Nov 30, 2010
#!/usr/bin/perl

Stealthgerbil posted:

I would love to build a home datacenter and get the fastest FiOS and Xfinity business plans. Add solar panels and a battery system and I could have a totally solar-powered micro datacenter.

I'm pretty sure that even in Phoenix, I couldn't power one rack with solar panels covering my entire property. Not to mention there's no availability of fiber when I could throw a rock and hit CenturyLink's regional HQ, but...

three
Aug 9, 2007

i fantasize about ndamukong suh licking my doodoo hole

Dilbert As gently caress posted:

Here is what I would look into

Intel build:
mobo/case/psu http://www.newegg.com/Product/Product.aspx?Item=N82E16856101117
CPU http://www.newegg.com/Product/Product.aspx?Item=N82E16819116782
Memory(2) http://www.newegg.com/Product/Product.aspx?Item=N82E16820104361
Cost: $645

Pros
Intel-VT/D
Low power
Quiet
Small
Easy to set up and roll

Cons
Low internal disks
Budget doesn't include drives

I really like the shuttle enclosures. That CPU doesn't support the onboard graphics of the shuttle, though.

CrazyLittle
Sep 11, 2001

Clapping Larry
"Low internal disks" is kind of an understatement, don't you think?

Comradephate
Feb 28, 2009

College Slice

three posted:

I really like the shuttle enclosures. That CPU doesn't support the onboard graphics of the shuttle, though.

Also the motherboards are insanely cheap. I had a couple, and both died within 18 months.

three
Aug 9, 2007

i fantasize about ndamukong suh licking my doodoo hole
This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own. :ohdear:

Comradephate
Feb 28, 2009

College Slice

three posted:

This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own. :ohdear:

What are you trying to achieve?

Moey
Oct 22, 2010

I LIKE TO MOVE IT

Comradephate posted:

What are you trying to achieve?

I think that was a jab saying most people in here can spec a whitebox ESXi setup.

Comradephate
Feb 28, 2009

College Slice

Moey posted:

I think that was a jab saying most people in here can spec a whitebox ESXi setup.

I'm completely incapable of detecting insincerity in any form on the internet. :v:

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

three posted:

This thread needs more cool parts lists like Corvettefisher posted. I don't want to spec out parts on my own. :ohdear:

I was responding to Indecision1991's post on some whitebox considerations. If you're tired of seeing my posts, you can just add me to your ignore list.

Count Thrashula
Jun 1, 2003

Peak Performance.

Buglord
Here's my current baby, my CCNP ROUTE lab in GNS3:

[GNS3 topology screenshot]
Since taking that screenshot, I've added an SNMP server using Paessler's PRTG.

If anyone has any questions about it, I'm happy to write a short tutorial on how to get something up and running. The ASA and the IP phones were both kind of a pain to get working, but it was exciting when they finally did!
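
For anyone following along, the SNMP side is the easy bit: a read-only community on each IOS router in the topology is all PRTG needs to start polling. Roughly this, with placeholder community/contact values:

snmp-server community lab-ro RO
snmp-server location GNS3-LAB
snmp-server contact admin@example.com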

three
Aug 9, 2007

i fantasize about ndamukong suh licking my doodoo hole

Moey posted:

I think that was a jab saying most people in here can spec a whitebox ESXi setup.

Dilbert As gently caress posted:

I was responding to Indecision1991's post on some whitebox considerations. If you're tired of seeing my posts, you can just add me to your ignore list.

You guys are so negative. :psyduck:

I was really looking for more builds. I think they're interesting.

There should be a battle for cheapest whitebox with 32GB of RAM.

Edit: I <3 you, Corvettefisher. You're not crazy like you used to be.

three fucked around with this message at 18:30 on Sep 18, 2013

evol262
Nov 30, 2010
#!/usr/bin/perl

three posted:

You guys are so negative. :psyduck:

I was really looking for more builds. I think they're interesting.

There should be a battle for cheapest whitebox with 32GB of RAM.

Edit: I <3 you, Corvettefisher. You're not crazy like you used to be.

Take the build from half a page up:

Generic case+PSU - $45
AM3+ motherboard with integrated graphics and 4 DIMM slots - $45
8 Core Zambezi - $150
16GB DIMM (2x8GB) - $108

Cut down the CPU to a quad if you want to save $70. I don't personally think it's worth it. Double the memory. Boot from SAN. Or add a very cheap drive. That motherboard (which has gone up $20 in the last week, :psyduck:) is whitebox compatible.

MF_James
May 8, 2008
I CANNOT HANDLE BEING CALLED OUT ON MY DUMBASS OPINIONS ABOUT ANTI-VIRUS AND SECURITY. I REALLY LIKE TO THINK THAT I KNOW THINGS HERE

INSTEAD I AM GOING TO WHINE ABOUT IT IN OTHER THREADS SO MY OPINION CAN FEEL VALIDATED IN AN ECHO CHAMBER I LIKE

three posted:

Edit: I <3 you, Corvettefisher. You're not AS crazy like you used to be.

fixed for hilarity

(I'm just kidding around CV)

sudo rm -rf
Aug 2, 2011


$ mv fullcommunism.sh
/america
$ cd /america
$ ./fullcommunism.sh


QPZIL posted:

Here's my current baby, my CCNP ROUTE lab in GNS3:

[GNS3 topology screenshot]
Since taking that screenshot, I've added an SNMP server using Paessler's PRTG.

If anyone has any questions about it, I'm happy to write a short tutorial on how to get something up and running. The ASA and the IP phones were both kind of a pain to get working, but it was exciting when they finally did!

Both the ASA and the IP phones are what I'm interested in getting set up - how'd you do it?

Moey
Oct 22, 2010

I LIKE TO MOVE IT

three posted:

There should be a battle for cheapest whitebox with 32GB of RAM.

Screw cheapest. I am halfway done with mine. It looks nice sitting alone in the corner and fills my needs (minus my Cyber Monday storage expansion).

I am currently running:
CPU: i7-3770
Memory: 4x8GB DDR3
Case: Fractal Design Define Mini
PSU: Can't remember off the top of my head; some decent modular 4xxW
Boot: ESXi from a thumb drive
Local datastore (primary): 250GB Samsung 840
Local datastore (leftover disks): Random drives ranging from 2TB down to laptop drives
Add-in card: 4x1Gb Intel NIC

Future expansions:
Local datastore: more SSDs
Add-in card: IBM M1015
Disks attached to M1015: 4x3TB

M1015 passed through to FreeNAS/NAS4Free/something for ZFS
Non-lab VMs backed up to the ZFS array
ZFS used for media storage

I run a handful of lab and non-lab machines on here.
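
Once that M1015 is passed through, the ZFS side is short. FreeNAS would normally build the pool from its GUI (using gptid labels), but the shell equivalent looks roughly like this, assuming the 4x3TB disks show up as da0-da3 (hypothetical device names):

$ zpool create tank raidz da0 da1 da2 da3
$ zfs create tank/media
$ zfs create tank/vm-backups
$ zpool status tank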

The Third Man
Nov 5, 2005

I know how much you like ponies so I got you a ponies avatar bro
Is it possible to lab Nagios? I'd like to be able to claim some sort of monitoring experience when I try to get a new job.
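
Very labbable -- a stock Nagios install in a single VM plus a couple of object definitions gets you something you can honestly call monitoring experience. A minimal sketch (the host name and address are invented) of what would go in a .cfg file under the stock objects/ directory:

define host {
    use        linux-server          ; stock template shipped with Nagios
    host_name  lab-esxi01
    alias      Home lab ESXi box
    address    192.168.1.10
}

define service {
    use                  generic-service
    host_name            lab-esxi01
    service_description  PING
    check_command        check_ping!100.0,20%!500.0,60%
}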


Agrikk
Oct 17, 2003

Take care with that! We have not fully ascertained its function, and the ticking is accelerating.
Oh god what am I about to do?

I am building a new storage server to replace my current iSCSI target that is buried under the SQL server / Hyper-V / ESXi requests I throw at it. I'm putting together this box based on my familiarity with each of the hardware components and availability on eBay:


Supermicro H8SGL-F motherboard - $180
Opteron 6128 (8-core @ 2GHz) - $45
HP SmartArray P410 array controller with 512MB battery-backed cache - $150
2x Mini SFF-SATA fan cables - $15
16GB DDR3-1333 RAM - $80
500w gold power supply - $90
4x Samsung 840 Pro 512GB SSD - $1800
4x 1TB SATA HDs (already on hand) - $0
Mellanox ConnectX-2 HBA - $190

Total: $2550

I'll be using Server 2012 R2 for my iSCSI target so I can play with its tiered storage capability: 180,000 IOPS and over a gigabyte per second of read/write speed available from the SSD array, with a 2TB storage tier. And the Mellanox card will give me a theoretical limit of 20Gbit of throughput via RDMA (SMB Direct), making that storage throughput available to the network.
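
All of that is driveable from PowerShell on 2012 R2 -- storage pool, the two tiers, a tiered virtual disk, then the iSCSI target. A rough sketch, not a tested recipe; the pool/tier/target names, sizes, and initiator IQN are all made up:

# pool from every poolable disk, then one SSD tier and one HDD tier
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName LabPool -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks
$ssd = New-StorageTier -StoragePoolFriendlyName LabPool -FriendlyName SSDTier -MediaType SSD
$hdd = New-StorageTier -StoragePoolFriendlyName LabPool -FriendlyName HDDTier -MediaType HDD
New-VirtualDisk -StoragePoolFriendlyName LabPool -FriendlyName TieredVD -StorageTiers $ssd,$hdd -StorageTierSizes 1.5TB,2TB -ResiliencySettingName Simple
# format/mount the volume (say it lands on T:), then carve an iSCSI LUN out of it
New-IscsiVirtualDisk -Path T:\LUNs\lab01.vhdx -SizeBytes 500GB
New-IscsiServerTarget -TargetName LabTarget -InitiatorIds "IQN:iqn.1998-01.com.vmware:esxi01"
Add-IscsiVirtualDiskTargetMapping -TargetName LabTarget -Path T:\LUNs\lab01.vhdx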
