
How We Build Test Nodes

Node Engineer Keith Millette dives into the details of node configuration in the lab.

Posted by: Keith Millette
Posted on: April 1, 2020

In my last post, I gave everyone an overview of the lab. Today, I want to show you how I’ve configured each node and give you some general information about the testing I’ll be doing. Much of the earliest testing relates to installation, system configuration, and so on. Nothing very exciting, but necessary.

For now, though, I want everyone to understand how our testing will help node runners. If you don’t already have hardware to build a node, we see no reason you should overspend on a BetaNet node when you can build one for less. If you’ve got everything except a GPU or SSD, great. We want to be able to recommend the most cost-effective hardware when the time comes to build or complete your node.

Before I get into too much detail, it is really important that everyone understand the hardware I’m using is not a final recommendation. Much testing needs to be done and some components may not meet the final requirements. So with that said, NONE OF THE FOLLOWING IS RECOMMENDED HARDWARE.


We do expect much of the hardware to pass our testing, and we expect some to fail, but it’s important to understand why we chose the hardware we did.

SSD


You’ll notice all nodes are using 500GB Samsung 860 EVO SSDs. That’s because endurance, longevity and capacity are not required for the in-house testing. During in-house testing the nodes will run short sprints to test specific functions. At this point, I can comfortably say this is definitely not the recommended SSD. An enterprise grade SSD with a higher capacity will be recommended.

PSU

I chose these for their reliability. I wanted to ensure all nodes have enough power and they wouldn’t need to be replaced during the in-house testing. I will be testing actual power consumption. If it turns out a 500W PSU can handle the load, hey it’s less expensive. But we’ll have to see.
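For a rough sense of why a 500W PSU might be enough, you can sum the TDPs of the major components. This is only a back-of-the-envelope sketch, not measured consumption (measuring actual draw is exactly what the lab testing is for); the 60W estimate for the rest of the system and the 30% headroom factor are my assumptions, not figures from the team.

```python
# Rough PSU sizing sketch for one of these nodes.
# The CPU/GPU TDPs are published vendor numbers (Ryzen 7 3700X: 65 W,
# RTX 2070 reference: 175 W). ASSUMPTIONS: the 60 W "everything else"
# estimate and the 30% headroom factor are illustrative only.
components_w = {
    "Ryzen 7 3700X (CPU TDP)": 65,
    "RTX 2070 (GPU TDP)": 175,
    "motherboard, RAM, SSD, fans (estimate)": 60,
}

total_w = sum(components_w.values())   # 300 W estimated peak load
suggested_psu_w = total_w * 1.3        # ~390 W with 30% headroom
print(f"Estimated load: {total_w} W; suggested PSU: >= {suggested_psu_w:.0f} W")
```

By this arithmetic a 500W unit would have comfortable margin, which is consistent with the hunch above, but actual measured consumption under the real workload is what will decide it.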

RAM

Expensive RAM, or vast quantities of it, is not expected to be required. I have no intention of doing any overclocking, so RAM with heatsinks isn’t required either. I’ve configured the nodes with 8–32GB of the least expensive and most readily available modules. I’ll be testing for minimum capacity requirements and the performance of single- vs dual-channel configurations.
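As background on the single- vs dual-channel question: theoretical peak memory bandwidth scales with channel count. Here is a quick sketch of the arithmetic, assuming DDR4-3200 modules (the post doesn’t actually specify module speed, so that figure is an assumption):

```python
# Theoretical peak DDR4 bandwidth = transfers/sec * 8 bytes per transfer * channels.
# ASSUMPTION: DDR4-3200 (3200 MT/s); the post does not state the module speed.
def peak_bandwidth_gb_s(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 1_000_000 * 8 * channels / 1e9  # GB/s (decimal)

single = peak_bandwidth_gb_s(3200, 1)  # 25.6 GB/s
dual = peak_bandwidth_gb_s(3200, 2)    # 51.2 GB/s
print(f"single channel: {single:.1f} GB/s, dual channel: {dual:.1f} GB/s")
```

Whether the node software actually benefits from the doubled theoretical bandwidth depends on its memory access patterns, which is why empirical testing matters here.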

CPU


I’m using the AMD Ryzen 7 3700X because that’s what was readily available on the Korean market; when the original specs were published, a 2700X cost about the same as what I paid for a 3700X. I’m also using an Intel processor because of Intel’s market penetration. Intel isn’t our first choice, since their processors are simply more expensive, but many people already have Intel processors on hand, so I chose the i7-9700K because it’s on par with the 3700X in computing power. It should be noted that the GPU will be doing the brunt of the work, so a 64-core AMD EPYC or a Xeon Phi is absolute overkill. We’re talking 8 cores.

That covers the bulk of the table above. When it comes to motherboards and graphics cards, things get a little more complicated since there’s a much wider range of options. There are quite a few factors to consider such as form factor, chipset, manufacturer, availability, price, and the ability to do empirical testing.

MOTHERBOARD (actually chipset)

I think there are seven Socket AM4 chipsets. Many of the features of the X-series chipsets (X370/X470/X570) aren’t required, and motherboards with those chipsets are more expensive compared to A320- and B450-based boards. Features like overclocking, SLI, and USB 3.2 Gen 2x2 aren’t necessary. To give you an idea of the things I’m testing: nodes 2 and 3 use the B450 chipset, so I’ll use them to test different GPUs and memory configurations. Node 1 uses a B450M board, which is technically the same chipset as the B450 but in a smaller form factor with fewer PCIe slots, which reduces the price considerably. Another example is nodes 4 and 5: by using the same motherboard and memory configuration, I can test the performance of the RTX 2070 vs the RTX 2060. Those are just a couple of examples.

GRAPHICS CARD

This is the one piece of hardware that will most affect both the cost of building a node and its ability to keep up. Graphics cards with Turing GPUs run between $300 and $5,000. Considering there are about 20 Turing GPUs, plus dozens of manufacturers each with their own individual models, that is a lot to choose from! We selected the RTX 2070 because, on paper, it can handle the workload. I got a couple of others just to check, but mathematics is a beautiful thing, and very smart people were able to narrow the field before we even began. So rather than testing the performance of the RTX 2070 vs the RTX 2070 Super, or the RTX 2070 vs the RTX 2080, we chose to try to determine whether all GPUs are created equal. You may have a preferred manufacturer, but they don’t necessarily make cards that will fit your budget. LEDs and a cool decal aren’t going to improve your node’s performance, so if manufacturers like Palit and EmTek can build an RTX 2070 that performs as well and doesn’t melt after six months, I see no reason anyone should pay an extra $200–$300 for a more expensive card. In the end, the graphics cards we recommend will be based on the number of calculations they can do reliably, and the price.

When all is done, our goal is to be able to offer a cost effective baseline configuration to build a BetaNet node. From there if you choose to use a terabyte of RAM or a $3000 graphics card, that’s on you.

That’s it for today. If you’ve got questions or suggestions please join the conversation! I’ll do my best to answer all your questions. Keep an eye out for more posts in the near future.



Steve June 12, 2020

Very useful ‘partpicker’ link! Just put all of the hardware that I have already ordered into it and found that it’s all okay!
Thanks for that. I may have to bookmark that for the future :+1:

lonewolf May 30, 2020

@Theo I don’t think the team will/should provide a precise build list. They have published the necessary minimum specifications. Check the revised ones at Revised BetaNet Node Hardware Requirements

If you are not familiar with building (assembling at home) your own PC, then you should probably bring these minimum specs to one or several PC builders in your area (online or physical) and ask for a quotation. Make sure to highlight the minimum specs to them and confirm they actually meet them. Many PC building services have online custom PC configuration tools: you choose your parts (MOBO, GPU, CPU, RAM, SSD, PSU, case, fans…) and they check compatibility among the parts and build it for you.

On the other hand, sites such as https://pcpartpicker.com allow you to check compatibility and prices if you are interested in ordering separate parts and building on your own (but this requires some expertise).

The PC that you suggest seems to be under spec. Check the minimum specs carefully at Revised BetaNet Node Hardware Requirements. For instance, the GPU should be an RTX 2070 or higher. The whole build should approach or exceed €2,000 if you use new parts.

The gateway should ideally run on a separate device (cloud or physical) with a different IP address and network. But during the BetaNet it’s possible to run the gateway on the node if you have at least 150 Mbit up / 200 Mbit down bandwidth. Read carefully: Revised BetaNet Node Hardware Requirements and So, what is a Gateway?

Good luck.

Theo May 29, 2020

Since I am quite new to the topic and without hardware, I have a few questions. Is there a list of the hardware components for simple users? Where can I find it? I’m not talking about the components I find here, “https://xx.network/blog/hardware-requirements”, because without a motherboard I can’t do anything with a CPU.
If I made a node and gateway, would I have to set up two devices, or two SSD disks, or what is the procedure? I’ve read a lot here, but I don’t find all the connections to make a to-do list for myself.
If I join the violett team, I cannot maintain any hardware without this information. Of course I have no idea about the software, but that comes with the support here in the forum. (I hope)
@Keith

would that work?
Dell G5 Desktop

Eugene May 25, 2020

Hello!
I would like to clarify something for myself and for other participants who want to launch a node.
The specification requires HT support, but you built test benches with the 9700K, which has no Hyper-Threading.
Does this processor perform normally without HT?
Just wanted to clarify before buying, thanks.

Keith May 9, 2020

@bradford A couple things to consider about overclocking/undervolting. In the NodeLab I maintain a consistent environment for new code. Changes to default settings introduce a variable to the testing environment. Since there will be regular changes to code throughout BetaNet, any gains or losses as a result of hardware tweaking may only be temporary.

Also, the kind of changes to hardware settings you’re proposing will likely have the same effect whether one is running Elixxir software or benchmark software. Anyone is free to experiment with their hardware settings during BetaNet, but the result may be operating below the performance target, or downtime. That’s a decision each operator will need to make for themselves.

bradford May 9, 2020

@Keith Can you also test how a GPU performs when it’s undervolted? If we can undervolt them without degrading performance then it will be much cheaper to colo them.

Sergei Kolyvanov May 8, 2020

Thanks for info. :muscle:

Keith May 8, 2020

@Elvis I don’t have performance data yet. I’ve had no problems with either of the two MSI cards I’m using here. One thing I will note though is the MSI GeForce RTX 2070 Gaming is a very big card (295 mm long) so just make sure your case is big enough if that’s the one you’re planning on using.

Sergei Kolyvanov May 8, 2020

Hi Keith,
If possible, a little more information about the station with the MSI RTX 2070: how well does that GPU perform compared to the others?
My system will be built on it.

Keith May 5, 2020

I don’t have the kind of data I think you’re hoping for yet. The GPU code is changing daily, and the nodes in the lab are being used to test code stability and squash bugs. What I can say at this point, and it may be helpful in your decision making, is that none of the different brands of cards we are using have proven to be unreliable. None have burst into flames or stopped working. That’s really all I can say at this point. It’s a process and unfortunately we haven’t gotten to the fun stuff yet.

bradford May 5, 2020

@Keith, any update as to how the different cards performed? I’ll be buying a card in the next month and I hope to have some more info before I do.

Andrey April 2, 2020

If the 1660 were enough, it would be great, since there are many cheap models around $150–200.

Keith April 1, 2020

Turing is the first requirement because of the compute capability of the shader processor. @benger, the VP of Architecture, calculated the RTX 2070 with 2304 shader processors would meet the requirements handsomely so there’s a little room for lower processor count but the GTX 1650 with only 896 processors would likely not be able to keep up.

Also, we have not determined whether the 1536 cores on the GTX 1660 Ti are enough.
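The shader-count reasoning above can be sketched as a quick comparison against the RTX 2070 baseline. The core counts are the ones given in this thread; treating the RTX 2070’s 2304 cores as “known sufficient” is the only datum stated, so the ratios below are context, not a pass/fail cutoff:

```python
# CUDA core (shader processor) counts for the Turing cards discussed above.
# The RTX 2070 (2304 cores) is stated to meet requirements "handsomely";
# whether any lower count suffices is still undetermined.
BASELINE = 2304  # RTX 2070

cards = {
    "RTX 2070": 2304,
    "GTX 1660 Ti": 1536,
    "GTX 1650": 896,
}

for name, cores in cards.items():
    print(f"{name}: {cores} cores ({cores / BASELINE:.0%} of the RTX 2070)")
```

On these numbers the 1660 Ti sits at roughly two-thirds of the baseline and the 1650 at under 40%, which lines up with the judgment that the 1650 “would likely not be able to keep up” while the 1660 Ti remains an open question.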

bradford April 1, 2020

I see you’re testing with a 1660 Ti; did you eliminate the cheaper 1650 because it didn’t meet some other requirement besides being Turing? I think a benchmark of the lowest capable card would be interesting, at least.

Keith April 1, 2020

Thanks very much. Your comments, as a system builder, are exactly why we’re making this process public. Everyone is encouraged to comment on the hardware we are using, the hardware they have, or the hardware they are considering.

If someone has experience with XYZ graphics card or XYZ brand, we welcome that input!

Bryan Riester April 1, 2020

Hey Keith,

Long time Elixxir fan and advocate here, hoping you’re staying safe during the sniffles-apocalypse :smiley:

Just wanted to write in regarding the RAM in your configurations. As a professional system-builder for small business, I found that the hardware component most likely to fail at any point during a build or the first 3 months of use was the RAM. I spent a lot of time and energy on the problem, testing various configurations of heat sinks, timings, etc.

What I found was that, for whatever reason, Corsair memory was different from the others in one key aspect: regardless of the price tier, whenever a build turned on for the first time, it would stay on for several years. Yes, I still get about 1 in 20 modules dead on arrival, but I never have a working stick fail.

Given that you’re spending an extra satoshi here and there on power components and a monster CPU for the sake of eliminating failure-causing variables unrelated to the test, I thought you might like a bit of wisdom from someone whose dinner has relied on cheap-ass systems staying on while illiterate shopkeepers abuse them with browser toolbars.

Cheers!
–Bryan
