Supermicro X11DPi-N E-ATX Motherboard Review
Posted by Thang Le Toan on 23 November 2017 01:26 AM
Supermicro X11DPi N Feature

A mainstay of Supermicro’s dual-socket LGA3647 motherboards will no doubt be the X11DPi-N. We reviewed the Supermicro X11DPi-NT not long ago, which offers 10GbE networking for those who need faster network speeds, while the X11DPi-N provides dual 1GbE ports for a reduced entry cost to this platform while retaining the same feature set.

Supermicro X11DPi-N Specifications

Here is the spec table for the Supermicro X11DPi-N:

Supermicro X11DPi N Specifications

Some readers requested that we include the motherboard block diagram, so we have added it to our review.

Supermicro X11DPi N Block Diagram

In the case of the X11DPi-N, we find 1GbE network ports; the X11DPi-NT includes 10GbE ports.

Supermicro X11DPi-N Overview

The X11DPi-N fits into the sizeable E-ATX motherboard class at 12” x 13”. Filling general-purpose or storage server roles, this motherboard supports a large variety of Supermicro’s 2U and 4U platforms.

Supermicro X11DPi N Top

Paired with each socket, we find six blue memory slots, giving us six DIMMs per CPU at one DIMM per channel. A memory speed of 2666MHz is fully supported, an increase from the 2400MHz we saw in the last generation. The black DIMM slots add an extra pair of DIMMs per CPU to achieve capacity parity with the previous-generation Xeon E5 series products.
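As a rough sanity check on those memory specs, the theoretical peak bandwidth follows directly from the channel count and transfer rate. This is only a back-of-the-envelope sketch; measured AIDA64 numbers always land below this ceiling:

```python
# Back-of-the-envelope DDR4 bandwidth estimate for a dual-socket board.
# Skylake-SP has 6 memory channels per CPU; DDR4-2666 runs at 2666 MT/s
# with 8 bytes (64 bits) moved per transfer.

def peak_bandwidth_gbs(channels: int, mt_per_s: int, bytes_per_transfer: int = 8) -> float:
    """Theoretical peak memory bandwidth in GB/s (decimal, 10^9)."""
    return channels * mt_per_s * 1e6 * bytes_per_transfer / 1e9

per_socket = peak_bandwidth_gbs(channels=6, mt_per_s=2666)  # one CPU
dual_socket = 2 * per_socket                                # whole board

print(f"Per socket:  {per_socket:.1f} GB/s")   # ~128 GB/s
print(f"Dual socket: {dual_socket:.1f} GB/s")  # ~256 GB/s
```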

Supermicro X11DPi N PCIe Slots

PCIe slots available on the X11DPi-N are the same as we find on the X11DPi-NT: four x16 slots and two x8 slots. The x16 slots are perfect for GPUs, networking cards, or other expansion devices; the two x8 slots suit higher-end networking and storage expanders. Whether you are looking at GPU-based machines or powerful storage servers, there are plenty of PCIe slots to accommodate a wide array of design choices.

At the edge of the motherboard, we also see a USB 3.0 Type A port which works well for a boot drive.

Supermicro X11DPi N Storage Ports

For storage servers, three SFF-8087 connectors can accommodate up to 12 SATA III 6.0Gbps drives. Two additional 7-pin SATA ports from the C622 PCH bring the total to 14 SATA III 6.0Gbps drives. There are also two OCuLink ports for NVMe drives. This combination of storage ports lets a storage server free up expansion slots for additional PCIe cards.
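To make the port math explicit, here is a trivial tally of the onboard SATA connectivity described above:

```python
# Tally onboard SATA III ports: each SFF-8087 connector breaks out to
# four SATA lanes, plus the two discrete 7-pin ports from the C622 PCH.
SFF_8087_CONNECTORS = 3
LANES_PER_SFF_8087 = 4
DISCRETE_7PIN_PORTS = 2

total_sata = SFF_8087_CONNECTORS * LANES_PER_SFF_8087 + DISCRETE_7PIN_PORTS
print(f"Total SATA III drives supported: {total_sata}")  # 14
```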

Supermicro X11DPi N Back IO

Network ports are dual 1GbE LAN, with IPMI via the ASPEED AST2500 sitting above two USB 3.0 ports. Two USB 2.0 ports, VGA, and COM ports round out the rear I/O.

Supermicro X11DPi-N Management

Out-of-band management on Supermicro’s new X11 platforms is an updated version of the company’s industry-standard management interface, including a WebGUI.

Supermicro X11DPi N IPMI

Supermicro’s latest BIOS for the X11 platform features support for an HTML5 iKVM, which we have now seen on several motherboards and systems.

Supermicro X11DPi N BIOS

Among the enhancements in the latest HTML5 iKVM is the ability to enlarge the screen, making for easier reading on high-resolution displays such as 4K monitors.

Supermicro X11DPi N IPMI 2

Ease of use with enlarged working screens carries right over to the desktop.

Test Configuration

Our primary test configuration for this motherboard is as follows:

  • Motherboard: Supermicro X11DPi-N
  • CPU: 2x Intel Xeon Gold 6134, 8 Core processors
  • RAM: 12x 16GB DDR4-2400 RDIMMs low profile (Micron)
  • SSD: OCZ RD400
Supermicro X11DPi N Gold 6134

We have moved our benchmark processors to the Intel Xeon Gold 6134, an 8-core CPU. With a TDP of 130 watts each, we had no thermal throttling issues using two Supermicro 2U active coolers.

AIDA64 Memory Test

AIDA64 memory bandwidth benchmarks (Memory Read, Memory Write, and Memory Copy) measure the maximum achievable memory data transfer bandwidth.

Supermicro X11DPi N AIDA64 Memory

AIDA64 memory benchmarks show numbers comparable to the past dual-processor motherboards we have reviewed: reads are slightly higher, while copy and write are somewhat lower.

Cinebench R15

Supermicro X11DPi N Cinebench R15

Cinebench R15 benchmark numbers fall right where we would expect, just below those from previous reviews.

Geekbench 4

Supermicro X11DPi N Geekbench 4

The Gold 6134 processors have a Turbo speed of 3.7GHz that bumps up Single-Core results, and we do see a modest improvement in Multi-Core results.

Supermicro X11DPi-N Power Consumption

For our power testing, we use a Yokogawa WT310 power meter, which feeds its data over USB to another machine where we capture the results. We then use the AIDA64 stress test to load the system and measure maximum power draw.

Power consumption can vary depending on processors used and the number of HDDs/SSDs/Expansion cards used. Here we test just a primary system.

Supermicro X11DPi N Power Test

We find the idle power draw of 132 watts and the peak of 394 watts to be quite typical of the test platforms we run.

  • OS Idle: 132W
  • AIDA64 Stress Test: 394W
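Notably, the idle-to-load delta lines up almost exactly with the combined TDP of the two CPUs, which makes sense given that the AIDA64 stress test is largely CPU-bound. A quick cross-check:

```python
# Cross-check: the measured idle-to-load power delta should roughly match
# the combined CPU TDP on a largely CPU-bound stress test.
IDLE_W = 132       # measured OS idle
STRESS_W = 394     # measured AIDA64 stress test peak
CPU_TDP_W = 130    # Xeon Gold 6134 TDP
NUM_CPUS = 2

delta = STRESS_W - IDLE_W            # 262 W added under load
combined_tdp = CPU_TDP_W * NUM_CPUS  # 260 W of CPU TDP

print(f"Load delta: {delta} W vs combined CPU TDP: {combined_tdp} W")
```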

Conclusion

The Supermicro X11DPi-N exceeds our expectations in flexibility and features. Aided by the new Intel Xeon Scalable platform, it offers more PCIe lanes, better memory bandwidth, and more SATA III 6.0Gbps ports without the need for extra PCIe storage expansion cards. The dual 1GbE network ports are a welcome cost-saving measure for those looking to add their own high-speed networking. Overall, this was an extremely easy motherboard to work with and one we have no hesitation recommending.

Supermicro X11DPi N

The Supermicro X11DPi-N is an incredible motherboard with great storage options coupled with enough PCIe slots to take advantage of a broad mix of expansion cards.

REVIEW OVERVIEW

Design & Aesthetics
9.3
Performance
9.4
Feature Set
9.1
Value
9.3

SUMMARY

We review the Supermicro X11DPi-N motherboard, which packs a ton of functionality into a compact E-ATX form factor for easy customization.

 





NVIDIA GPU Cloud is one important step in democratizing deep learning
Posted by Thang Le Toan on 23 November 2017 12:40 AM
NVIDIA GPU Cloud

At the 2017 Supercomputing conference (SC17), we were able to attend the NVIDIA press and analyst event. We had a pre-briefing, so we knew what to expect, but there were still a number of announcements. By far the most impactful was the expansion of the NVIDIA GPU Cloud to HPC applications. Here is a quick overview of what NVIDIA announced.

NVIDIA Tesla V100 Volta Everywhere

The first announcement should be of little surprise to anyone following STH coverage of NVIDIA. The next generation NVIDIA Tesla V100 is now available just about everywhere.

NVIDIA Volta Taking Off At SC17

On the SC17 show floor, just about every vendor (Dell EMC, Hewlett Packard Enterprise, Huawei, Cisco, IBM, Lenovo, and Supermicro) had solutions with the V100 GPU. Some had four to a box, some eight, and some were submerged in liquid for cooling, but they were everywhere.

Supermicro At SC 2017 NVIDIA Tesla V100 Volta

Beyond just the hardware vendors, Alibaba Cloud, Amazon Web Services (in AWS P3 instances), Baidu Cloud, Microsoft Azure, Oracle Cloud and Tencent Cloud have also announced Volta-based cloud services.

HPC and AI Applications in the NVIDIA GPU Cloud

The NVIDIA GPU Cloud (NGC) is NVIDIA’s offering where frameworks and applications are packaged and distributed by NVIDIA in containers. We have been working with nvidia-docker, the precursor to this service, for quite some time, and it is awesome. We have recently been working to package our applications using these NVIDIA-optimized containers. Essentially, NVIDIA is taking away much of the “getting the stack running” work for deep learning/AI and HPC applications.

NVIDIA GPU Cloud

At SC17, NVIDIA added HPC applications and visualization to the platform. This lets users get up and running quickly with GPU-accelerated applications without having to manage CUDA versions and their dependencies on application versions. The real impact is that this can shave days or weeks off the time it takes to deploy systems.
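To give a feel for the workflow, an NGC-style container is typically pulled from NVIDIA's registry and run through Docker with the NVIDIA runtime. The sketch below only assembles the command line; the image path and tag are illustrative placeholders, not confirmed NGC image names:

```python
# Sketch of launching an NGC-style container via Docker with GPU support.
# The image path/tag below is a hypothetical placeholder; a real NGC image
# would come from NVIDIA's registry after authenticating with an API key.

def build_run_command(image: str, workdir: str = "/workspace") -> list:
    """Assemble a 'docker run' invocation using the NVIDIA runtime."""
    return [
        "docker", "run",
        "--runtime=nvidia",        # expose GPUs via nvidia-docker 2.x
        "--rm", "-it",             # interactive, clean up on exit
        "-v", f"{workdir}:{workdir}",  # mount a working directory
        image,
    ]

cmd = build_run_command("nvcr.io/nvidia/example-hpc-app:17.11")
print(" ".join(cmd))
```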

NVIDIA HPC Applications Coming To NGC

We are already working with NGC at STH and will have more on this offering soon. It is interesting on a few points. First, it makes things so easy that we wish Intel would offer something similar: distilling the vast expanse of GitHub and Docker Hub down to a few select best-known configurations. Second, it is fascinating in that we can see this being offered for VDI in the future, making NVIDIA the hub for on-demand GPU compute offerings.

Final Words

We have been talking about the NVIDIA Tesla V100 “Volta” for some time and have numbers in the editing queue for 8x Tesla V100 systems. At the same time, from what we have been hearing, these only started shipping in volume in the last two months, so they are still a somewhat unknown quantity. The NVIDIA GPU Cloud is the big story here. NVIDIA has the opportunity to build the “app store” model for on-demand GPU compute, both on-prem (DGX-1 and DGX Station, along with partner systems) and in the cloud.





Microsoft adds NVIDIA Tesla V100 Volta Support in Azure
Posted by Thang Le Toan on 23 November 2017 12:38 AM

Microsoft adds NVIDIA Tesla V100 Volta-generation support in Azure using InfiniBand-connected NCv3 instances. The company is also making the NCv2 Tesla P100 instances generally available.

 

NVIDIA Tesla Module

The cloud wars are heating up for high-dollar GPU compute instances. When the Amazon AWS EC2 P3 instances came out a few weeks ago, we knew we were in for a slew of new cloud instance introductions. Microsoft did not disappoint, as we now have the NVIDIA Tesla V100 “Volta” generation of GPUs in the Microsoft Azure cloud. Microsoft made the announcement to coincide with Supercomputing 2017, a move trumpeted by NVIDIA at the show.

Microsoft Azure NCv3 Instances with NVIDIA Tesla V100 Volta GPUs

Microsoft has a leading cloud GPU compute portfolio. The company offers virtual machines with NVIDIA Tesla K80, M60, P40, and P100 GPUs. Adding to that stable is the Microsoft Azure NCv3 instance size with NVIDIA Tesla V100 GPUs.

Realizing that deep learning and AI workloads, along with clustered HPC workloads, need a high-speed interconnect, Microsoft also offers a high-speed, low-latency Mellanox InfiniBand-based interconnect for its GPU compute machines. This bucks the trend of the 19 clusters on the November 2017 Top500 supercomputer list that use 25GbE. Our take is that customers paying top dollar for Microsoft Azure NCv3 and NCv2 / ND series instance types want the high-performance interconnect that Microsoft Azure offers.

One of the key benefits of using the Microsoft Azure cloud NCv2 and NCv3 instances for deep learning and AI is that one can train on data already found in Azure. One can also quickly burst to spin up and then collapse capacity as required. That is especially important if a data scientist needs a fast result.
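The economics of that burst model are easy to sketch: capacity is paid for only during the hours a training run actually consumes. The hourly rate below is a made-up placeholder for illustration, not actual Azure NCv3 pricing:

```python
# Illustrative burst-economics sketch: spin up GPU instances only for the
# duration of a training run, then collapse the capacity. The hourly price
# here is a hypothetical placeholder, NOT actual Azure pricing.

def burst_cost(hourly_rate_usd: float, num_instances: int, hours: float) -> float:
    """Total cost of an on-demand GPU training burst."""
    return hourly_rate_usd * num_instances * hours

# e.g. four instances for a 12-hour run at a placeholder $10/hour rate
cost = burst_cost(hourly_rate_usd=10.0, num_instances=4, hours=12.0)
print(f"Burst cost: ${cost:,.2f}")
```

The appeal for a data scientist is that the same spend buys a faster wall-clock result: many instances for a few hours instead of one machine for days.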

We have heard from several companies that the pools of NVIDIA Tesla P100-based instances have been relatively limited. We hope that availability improves as the company adds new instance types.

You can read more on the announcement at the official Microsoft page.





