IBM's cloud vision prioritizes platform, but future remains fuzzy
Posted by Thang Le Toan on 09 August 2017 12:32 AM

Some doubt IBM will ever catch up to AWS' dominance in the IaaS market. But, given Big Blue's growing emphasis on Bluemix platform services, maybe it doesn't need to.

Despite the inroads IBM has made in the cloud infrastructure market, it still struggles to close the gap with industry giant Amazon Web Services. That's because the crux of IBM's cloud strategy isn't -- and shouldn't be -- infrastructure, IT pros say. Instead, it's all about the data.

 

Many industry experts rank IBM among the top infrastructure as a service (IaaS) providers. IBM was the fourth biggest cloud provider for public IaaS in terms of revenue market share -- until recently, when Alibaba overtook it, said John Dinsdale, chief analyst at Synergy Research, a market research firm in Reno, Nev.

In the first quarter of 2017, Synergy's rankings showed Amazon Web Services (AWS) comfortably in the lead with 44% market share, followed by Microsoft Azure (11%), Google Cloud Platform (6%), Alibaba (5%) and then IBM (4%).

Still, IBM has made progress in the public IaaS market compared to some of its traditional competitors. Hewlett Packard Enterprise, for example, shuttered its Helion public cloud in early 2016.

IBM's 2013 acquisition of SoftLayer was the jumping-off point for its IaaS gains. The company now has more than 50 data centers worldwide, 13 of which came from SoftLayer, Dinsdale said.

 

"IBM has clearly continued to grow both the number and capacity of its data centers," he said.

A broad global footprint and scale are table stakes in the highly competitive public IaaS market, and IBM's acquisition of SoftLayer's cloud data centers provided a crucial foundation on which to grow, said Cassandra Mooshian, senior analyst at Technology Business Research, an IT research organization in Hampton, N.H.

"Without [SoftLayer], I don't know what they would have done in this market," she said. "That gave IBM access globally, very quickly. I think that was a huge advantage, and they continue to grow on that."

IBM cloud infrastructure revamp in the works

To level the playing field with AWS and Azure, IBM is working on a new cloud architecture -- what it calls "next-generation infrastructure" (NGI) -- that will incorporate new hardware within its cloud data centers to further modernize its SoftLayer infrastructure.

For example, IBM IaaS lacks the virtual networking constructs that are present in AWS and Azure, and NGI is meant to address these differences, said Lydia Leong, an analyst with Gartner.

But when this new cloud architecture will formally be available -- and whether it will win customers from AWS -- remains to be seen.

"The core infrastructure war has been mostly won," Leong said.

In addition to scale, the SoftLayer buy also opened the door for IBM's bare-metal cloud strategy, which continues to be a differentiator for the company.

Bitly, Inc., the company behind the popular link management platform, began to migrate to IBM cloud infrastructure in 2015. The company ran its platform on bare-metal servers in a colocation facility, but as the business expanded, its capacity and scale became difficult to manage, said Rob Platzer, CTO at Bitly, based in New York City.

Rather than move to pure IaaS, which would have required some rearchitecting, Bitly wanted a more cloud-like environment that would still include bare metal, as some of the Bitly platform is optimized for that particular technology, Platzer said.

IBM SoftLayer allowed Bitly to maintain its bare-metal footprint, yet also tap into key benefits of cloud, such as more flexible pricing models and the ability to scale capacity as needed. But it also put Bitly in a position to pursue a hybrid strategy, Platzer said. Eventually, the company plans to use a mix of IBM's bare-metal and virtualized IaaS offerings and is already working toward that goal.

"What we wanted was a low-risk lift-and-shift from bare metal to bare metal, but we wanted the opportunity to have a hybrid environment where we could begin to take advantage of transforming some of our platform to virtualized or pure cloud-based," he said.

But a growing data center footprint and its bare-metal option might not be enough for IBM to dethrone public IaaS leader AWS, Dinsdale suggested. AWS doesn't directly offer bare metal, but it has somewhat comparable services, such as Elastic Compute Cloud Dedicated Hosts, which commit resources on a physical server to a single tenant.
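AWS documents Dedicated Hosts as single-tenant physical servers managed through the standard EC2 APIs. As a rough illustration of that model only, the sketch below uses the AWS SDK for Python (boto3) to allocate a host and launch an instance onto it with host tenancy; the region, Availability Zone, instance type and AMI ID are placeholder assumptions, not values taken from this article.

```python
# Illustrative only: allocate an EC2 Dedicated Host (a single-tenant physical
# server) and launch an instance onto it. Region, AZ, instance type and AMI
# are placeholder values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Reserve one physical host for m4-family instances in a specific AZ.
response = ec2.allocate_hosts(
    AvailabilityZone="us-east-1a",
    InstanceType="m4.large",
    Quantity=1,
    AutoPlacement="on",
)
host_id = response["HostIds"][0]

# Launch an instance with host tenancy so it lands on the reserved host.
ec2.run_instances(
    ImageId="ami-12345678",  # placeholder AMI ID
    InstanceType="m4.large",
    MinCount=1,
    MaxCount=1,
    Placement={"Tenancy": "host", "HostId": host_id},
)
```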

"[IBM bare metal] might offer some level of differentiation, but I doubt that AWS or Microsoft are losing too much sleep over it," Dinsdale said.

Moreover, while IBM has made efforts to evolve SoftLayer infrastructure, its pace of change hasn't kept up with AWS and Azure, Leong said.

"They introduced some storage options in early 2015, and that has been the only major new functionality," she said. "There were other functions offered to some users as one-offs … but it was not at the pace [of] Microsoft or AWS, [which introduce] hundreds of new features a year."

IBM's cloud strategy emphasizes Bluemix, platform services

In the long term -- even after the rollout of its next-generation cloud architecture -- it's not in IBM's best interest to compete in cloud on a pure infrastructure basis, Leong said.


"It would be better for IBM to focus on where it can differentiate with things like Watson rather than trying to compete head-to-head with Amazon and Microsoft in the infrastructure space," she said.

IBM has already started to move in this direction. The company has placed a greater emphasis on its Bluemix portfolio of cloud app development services, as well as its offerings for emerging workloads related to artificial intelligence, big data and the internet of things (IoT).

The SoftLayer acquisition was primarily just a means to an end for IBM -- a set of infrastructure resources on which to host these data services, said Lief Morin, CEO of Key Information Systems, an IT services provider and system integrator based in Agoura Hills, Calif.

"They didn't buy SoftLayer so they could build the world's largest bare-metal provisioning company," said Morin, whose company partners with both IBM and AWS. "But they knew they had to have that platform in order to purvey their [services]."

IBM's growing emphasis on platform as a service, as well as data analytics and cognitive computing, is evidenced by its move in October 2016 to roll the SoftLayer cloud portfolio into the Bluemix brand. IBM also integrated the SoftLayer and Bluemix management portals to provide a single dashboard for users of both IBM cloud infrastructure and platform services.

That model provides a more unified experience for IBM cloud customers and is similar to that of Microsoft Azure, Mooshian said.

"The SoftLayer brand was strong with its own customers; however, I think IBM has done a better job promoting the Bluemix brand," she said. "Bringing SoftLayer underneath Bluemix and kind of unifying the buyer experience will help them going forward."

Another key area of focus for IBM moving forward will be its hybrid story with the Bluemix platform, Mooshian added. Right now, IBM offers Bluemix Local, a private version of the IBM Bluemix public cloud platform that users deploy on their own hardware and within their own data centers. The aim -- much like Microsoft's upcoming Azure Stack -- is to simplify hybrid cloud deployments.

For example, Bluemix Local and Bluemix public cloud share the same underlying infrastructure. This lets developers build apps in-house that can more easily migrate to the public cloud. It also enables IBM cloud admins to manage both their private and public deployments from a common console.

Oracle has a similar offering with its Oracle Cloud Machine. But, otherwise, the market for these types of on-premises hybrid cloud systems is still fairly young -- especially given the delayed release of Microsoft's Azure Stack.

"We have seen a lot of demand on the customer side for a truly hybrid model," Mooshian said. "And right now, only Oracle and IBM have that -- with options to have a public cloud platform, hosted private cloud platform and then an on-premises option."

Moving forward, IBM's cloud success hinges largely on its platform services and hybrid strategy. But how successfully and quickly it builds out those offerings, while maintaining the more traditional parts of its business, such as mainframes, will be the true test.

"They are trying to balance the old and the new," Mooshian said. "They have a different portfolio than any other vendor out there right now because they have so much of both."

 

Magic Quadrant for Data Center Backup and Recovery Software
Posted by Thang Le Toan on 21 July 2017 04:58 AM
Published: 08 June 2016 ID: G00280391

Analyst(s):

Summary

Enterprise backup is among the oldest and most frequently performed tasks for infrastructure and operations professionals. Gartner provides analysis and evaluation of the leading data center backup software vendors that offer a range of traditional to innovative recovery capabilities.

Strategic Planning Assumptions

By 2020, 30% of organizations will leverage backup for more than just operational recovery (e.g., disaster recovery, test/development, DevOps, etc.), up from less than 10% at the beginning of 2016.

By 2020, over 40% of organizations will supplant long-term backup with archiving systems — up from 20% in 2015.

By 2020, 10% of storage systems will be self-protecting, obviating the need for backup applications, up from less than 2% today.

By 2019, 30% of midsize organizations will leverage public cloud IaaS for backup, up from 5% today.

By 2018, 70% of business and application owners will have more self-service control over their data protection services, up from 30% today.

By 2018, 50% of organizations will augment their current backup application with additional products or replace it with another solution, compared with what they deployed at the beginning of 2015.

By 2018, more than 50% of enterprise storage customers will consider bids from storage vendors that have been in business for less than five years, up from less than 30% today.

By 2018, the number of enterprises using the cloud as a backup destination will double, up from 11% at the beginning of 2016.

Market Definition/Description

Gartner defines the data center backup and recovery software market as being focused on providing backup capabilities for the upper-end midmarket and large-enterprise environments. Gartner defines the upper-end midmarket as being 500 to 999 employees, and the large enterprise as being 1,000 employees or greater. Protected data comprises data center workloads, such as file share, file system, operating system, database, email, content management, CRM, ERP and collaboration application (such as content management solutions) data. Today, these workloads are largely on-premises; however, protecting SaaS applications (such as Salesforce and Microsoft Office 365) and infrastructure as a service (IaaS) is becoming increasingly important, as are other, newer "born in the public, private or hybrid cloud" applications.

Provider solutions that primarily address backup and recovery of remote office, small enterprise, individual system and/or endpoint device data are outside the scope of this data-center-oriented focus. Some providers may also address these workloads, as well as the larger data center workloads described above; however, those are not the primary use cases for deploying these data center solutions.

These backup and recovery software products provide features such as traditional backup to tape, backup to conventional random-access media (such as a hard disk or solid-state drives) or devices that emulate the previous backup targets (such as virtual tape library [VTL]), data reduction (such as compression, deduplication or single instancing), array and/or server-based snapshot, heterogeneous replication, and continuous data protection (CDP). Additionally, integration and exploitation of the cloud, particularly the public cloud, as a backup target or to a colocation facility are becoming more important for backup workloads.
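To make the deduplication idea above concrete, here is a minimal, illustrative Python sketch of fixed-size-chunk deduplication with SHA-256 fingerprints; the function and variable names are invented for this example, and shipping products use far more sophisticated, often variable-length, chunking and indexing. Each chunk is written to the store only once, so data repeated across backups consumes space a single time.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks (illustrative value)

def backup_file(path, chunk_store, catalog):
    """Split a file into chunks and store only chunks not seen before."""
    recipe = []  # ordered fingerprints needed to rebuild this file
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:    # deduplication: skip known chunks
                chunk_store[digest] = chunk  # only unique data is written
            recipe.append(digest)
    catalog[path] = recipe  # the catalog maps each file to its chunk recipe

def restore_file(path, dest, chunk_store, catalog):
    """Rebuild a backed-up file by concatenating its chunks from the store."""
    with open(dest, "wb") as out:
        for digest in catalog[path]:
            out.write(chunk_store[digest])
```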

These solutions may be provided as software-only, or as an integrated appliance that contains all of — or substantial components of — the backup application, such as the master server or a media server (something beyond a backup agent, or preparsing code that is used for dedicated backup target devices).

As the backup and recovery software market has dozens, if not hundreds, of vendors, this report narrows the focus down to those that have a very strong presence worldwide in the upper-end midmarket and large-enterprise environments.

Solutions that are predominantly sold as a service (backup as a service [BaaS]) do not meet the market definition for the data center backup and recovery software market. Backup software for a homogeneous environment, such as native tools from Microsoft or VMware for their own specific platforms, is also excluded, as many midsize and large customers prefer a single, scalable backup product for their entire environment.

This 2016 "Magic Quadrant for Data Center Backup and Recovery Software" is a refocus and update to the "Magic Quadrant for Enterprise Backup Software and Integrated Appliances" that was last published in June 2015. The renamed Magic Quadrant and updated market criteria are in response to Gartner client requests to focus more on backup and recovery software for the upper-midsize to large enterprises that protect data center workloads managed by data center personnel.

Magic Quadrant

 

Figure 1. Magic Quadrant for Data Center Backup and Recovery Software (research image courtesy of Gartner, Inc.)

Source: Gartner (June 2016)

Vendor Strengths and Cautions

Actifio

Actifio is an emerging data protection vendor whose innovative solution offers a fundamentally different way to do backup. The solution uses application-aware plug-ins to capture backup data with a block-level incremental-forever method. Backup data is stored in the native format in a snapshot pool for very fast recovery, and separately preserved in a deduplicated pool for local retention and remote replication. While all its customers use Actifio's solution to replace traditional backup, the vast majority also use it for disaster recovery and business continuity within a hybrid cloud. Increasingly, nearly half of its customers are also exploring the virtual copy functionality for test/development. Actifio reported over 1,200 enterprise customers by the end of its latest fiscal year (end of January 2016), with the majority using Actifio's subscription license option. Actifio is facing competition from other emerging vendors with a similar architecture and from traditional vendors that continue to improve performance.
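The block-level, incremental-forever approach described above can be pictured with a short sketch. This is an illustration of the general technique only, not Actifio's actual implementation, and the names are invented: each run fingerprints the blocks of a volume, copies only blocks that changed since the previous run into a shared pool, and records a complete block map, so any recovery point can be reconstructed without ever taking another full backup.

```python
import hashlib

BLOCK_SIZE = 1024 * 1024  # 1 MiB blocks (illustrative value)

def incremental_forever_backup(volume_path, previous_map, block_pool):
    """Copy only blocks that changed since the previous run.

    previous_map: {block_index: fingerprint} from the prior backup ({} on first run)
    block_pool:   {fingerprint: block_bytes} shared, deduplicated storage pool
    Returns a complete block map describing this recovery point.
    """
    new_map = {}
    with open(volume_path, "rb") as vol:
        index = 0
        while True:
            block = vol.read(BLOCK_SIZE)
            if not block:
                break
            fingerprint = hashlib.sha256(block).hexdigest()
            if previous_map.get(index) != fingerprint and fingerprint not in block_pool:
                block_pool[fingerprint] = block  # only changed, unseen data is copied
            new_map[index] = fingerprint
            index += 1
    return new_map  # every run yields a full recovery point; no periodic fulls needed
```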

Strengths
  • Actifio's solution offers much faster backup and recovery than traditional backup methods, augmented by SLA-based management policies to simplify initial configurations.

  • Many enterprises have used the solution for backup and disaster recovery, as well as to enable DevOps agility and structured data archiving.

  • Among emerging vendors offering a similar architecture, Actifio has the broadest host support matrix and data management functions.

Cautions
  • Some customers cited potential high cost of storage, as the snapshot pool in Actifio's physical appliances doesn't offer compression.

  • Actifio backup users may require collaboration and buy-in from other constituents, such as database/application administrators and the disaster recovery team, to fully realize "copy data management" benefits.

  • Actifio doesn't natively offer bare-metal restore capabilities.

Arcserve

Arcserve has two backup products in its portfolio: Arcserve Backup r17 (legacy offering) and Arcserve Unified Data Protection (UDP). The company generates its growth from the newer UDP solution, which is the strategic platform and is offered as software or as an appliance. Arcserve has been on a journey to further unify backup capabilities and offer a single point of management for UDP. In the past year, unified tape management and a streamlined product installation feature have been added to UDP. Arcserve Backup has added SAP HANA support, and the ability to write compressed and deduplicated data to tape has been delivered. The company is especially focused on addressing the backup needs of the upper-midsize and decentralized large-enterprise markets. Organizationally, Arcserve has been looking to strengthen its channels and alliances, and has brought in additional industry veterans to lead its development efforts. Now approaching the two-year mark after the split from CA Technologies, sales are improving, and the vendor is seeing greater traction from the UDP solution, with almost half of its active customers having migrated to the new platform. Arcserve has a particularly strong presence in EMEA and Japan.

Strengths
  • Arcserve offers protection capabilities that range from traditional backup to continuous data protection (CDP), and has begun delivering snapshot support, initially for NetApp and Nimble Storage arrays.

  • Arcserve offers an Assured Recovery capability via its high availability option that can leverage virtual standby and instant-access virtual machine (VM) options, or via its Arcserve Cloud Disaster Recovery as a Service (DRaaS) option, to provide broader data access.

  • Flexible packaging and pricing options make it easy to only purchase required features, with the ability to pay by the processor socket, terabyte or operating system instance, and to deploy as software or as a branded, integrated appliance.

Cautions
  • Arcserve may require further validation for very large enterprise deployments, as the vendor is most typically deployed in the midsize to upper-midsize enterprises.

  • Full integration of features, such as physical tape support, into the UDP product and administration console remains a work in progress.

  • No automatic load balancing of backups or restores across available resources means that larger operations could require more tuning.

Commvault

Commvault's investment in R&D is evident as it has added support for new hypervisors, public cloud IaaS, SaaS and storage arrays in rapid succession. Commvault now supports a growing list of cloud platforms and hypervisors, and facilitates agnostic conversion from physical, to virtual, to cloud and back for complete interoperability among them. IntelliSnap offers the same portability across the industry's broadest list of supported storage arrays. This provides a flexibility that is more than the sum of the parts, given the expansive support matrix. The result is that procurement decisions made today regarding storage arrays, hypervisors or even the cloud are not lock-ins for tomorrow. Commvault has introduced a scale-out software-defined storage (SDS) back end with built-in indexing and data reduction/optimization, which is available as a storage target for other applications. After achieving success with its bundles and responding to market pressures, Commvault has completed a second stage of pricing revision to shed its reputation as an expensive vendor and convert to a customer-friendly, net-price maintenance model. Customer support and satisfaction feedback remain very favorable, and over the past year, Commvault has again made product enhancements that further its ability to scale into very large environments.

Strengths
  • Commvault leverages a single code base to scale from solution-specific bundles (such as VM backup) to the full large-enterprise offering through a variety of new flexible pricing and packaging options that are more cost-competitive than past bundles.

  • Commvault offers the industry's broadest support for integrating with and exploiting storage hardware platform snapshots, directly supporting over two dozen of the top-selling storage arrays.

  • Commvault has comprehensive public cloud support with disaster recovery orchestration, as well as SaaS application protection.

Cautions
  • Getting started with Commvault may not be an out-of-the-box experience, and can require professional services.

  • Administrators report an initial steep learning curve, which makes training requisite.

  • Customers in most regions in EMEA and Asia/Pacific (APAC) may face substandard presales experiences, because Commvault has yet to partner with Tier 1 distributors/partners in these regions.

EMC

EMC's backup portfolio is a potpourri of separate products acquired over the past decade, including a solution to back up SaaS applications. EMC also offers the organically developed ProtectPoint to protect its select primary arrays. The core of the portfolio is the Data Protection Suite (DPS), with Avamar and NetWorker being the two key components, augmented by Data Domain Boost for Enterprise Apps, which sends backup from enterprise applications directly to Data Domain. Gartner observes that a high percentage of EMC backup customers use Data Domain in their environments. DPS is available in several editions for different environments and personas (such as the application administrator), and may also include analytics, archiving, hybrid cloud connection and cross-platform search tools. In the past year, EMC delivered a major revamp of NetWorker, whose client can leverage either Avamar deduplication for remote offices or Data Domain Boost deduplication for enterprise applications. However, customers implementing multiple products in a DPS edition need to use multiple user interfaces to perform different operations. Dell's pending acquisition of EMC may have an impact on product development and product focus.

Strengths
  • The increasing number of products and editions within the DPS portfolio offers customers more targeted options for different environments.

  • NetWorker 9.0 has numerous enhancements to improve scale, speed up backup and simplify management.

  • Avamar offers fast backup techniques for centralized large network-attached storage (NAS) file systems and network-efficient backup for distributed file shares in remote offices, while Data Domain Boost has strong enterprise application integration with self-service capabilities.

Cautions
  • Management and support complexity continues to be an issue, with overlapping products and multiple management consoles within EMC's backup portfolio.

  • Prospects not planning to adopt Data Domain appliances should be aware that the value propositions of NetWorker and Avamar weaken without Data Domain.

  • The pending acquisition by Dell should be monitored closely for any potential warning signs, such as more frequent turnover of sales and support personnel.

Hewlett Packard Enterprise

In November 2015, Hewlett Packard Enterprise (HPE) completed its split from its consumer products counterpart (HP Inc.) to better focus on the enterprise business. HPE's backup portfolio resides in the Information Management and Governance business unit, with the flagship product being Data Protector. The vendor continues to try to better leverage its direct storage and server sales teams, as well as its worldwide partner channels. While 2015 saw progress, there still appears to be room for improvement, as customers and prospects report field activities with backup competitors. For data center workloads, a new bundle called the Data Protection Suite includes Data Protector, Backup Navigator for operational analytics and Storage Optimizer for unstructured data management. Data Protector offers snapshot integration with HPE, EMC and NetApp storage arrays. References comment very favorably on the Data Protection Suite's administrative experience and cite recent improvements in customer support. In February 2016, HPE acquired Trilead and its VM Explorer product for VM protection and a midmarket backup offering.

Strengths
  • HPE Data Protector focuses on usability and value.

  • Data Protector is a single, scalable solution with support for a wide range of host environments, and offers native integration with many enterprise core applications, especially SAP.

  • Backup Navigator and Storage Optimizer offer Data Protector administrators robust analytics and insight into the backup infrastructure.

Cautions
  • Data Protector is not predominantly installed in large environments.

  • Data Protector does not have strategic cross-divisional focus within HPE, which impacts how well the product is taken to market in various geographies.

  • Although Data Protector can manage array-based snapshots from many vendors, user adoption of this function remains low.

IBM

In February 2015, IBM announced a major rebranding of all of its storage solutions. The Tivoli Storage Manager (TSM) family of backup products and Tivoli Storage FlashCopy Manager were renamed IBM Spectrum Protect and IBM Spectrum Protect Snapshot, respectively. As a result of the rebranding, IBM has sought to better leverage the sales of Spectrum Protect through greater alignment with its IBM Storage Systems team. Enhancements to product scale and faster deduplication further distinguish Spectrum Protect's existing capabilities, which can scale from the midmarket to the largest data centers with a single code base. Unlike most other backup products — which have added synthetic processing and often still require some sort of periodic full backup — IBM has had a true incremental-forever methodology since the product's inception. IBM appears to have recommitted to product development, with a strong future roadmap for cloud exploitation and integration. IBM will have delivered four releases of Spectrum Protect in just over one calendar year. New Spectrum Storage Suite pricing allows customers to freely mix and match deployments of any of its storage software products at any time.

Strengths
  • Spectrum Protect supports a broad range of platforms, integrates with IBM and non-IBM storage array snapshots, and can write data to disk, tape and cloud target devices.

  • Improved in-line deduplication, a two-tier architecture and a true incremental-forever backup methodology combine to minimize the Spectrum Protect infrastructure requirements and expense required for large data center deployments.

  • A single instance of Spectrum Protect can protect a very scalable 4 to 5PB of deduplicated data.

Cautions
  • End users, including IBM's own references, continue to report that the Spectrum Protect administrative interface is challenging to navigate, with full integration of all management activities into the Operations Center console taking years to complete.

  • Low overall awareness of recent features and improvements is compounded by the fact that many customers are down-level in product releases.

  • While IBM can offer many capabilities for new platforms and features, some of these may require third-party offerings, such as support for newer databases that must be separately purchased and supported.

Unitrends

Unitrends offers a complete package of hybrid backup to the cloud and DRaaS via integrated appliances or software-only offerings. The standard offering is an on-premises solution, with an additional hybrid option for uploading to the Unitrends Cloud or a broad choice of cloud providers. Unitrends Boomerang remains a free-standing product that can move workloads to Amazon Web Services (AWS) for app migration and DR, automatically handling the networking conversion in the process. Unitrends was early to the market with cloud capabilities, and has weathered the limited cloud backup adoption rates better than some other competitors. Unitrends completed a product consolidation, and has released a major interface revision for the Unitrends Enterprise Backup (UEB) product focused on usability and click-count optimization, as well as preparation for further unification. Unitrends is strategically expanding its sales presence outside of North America, and throughout 2015 expanded cloud geographies in North America, Europe and Australia to bolster its Unitrends Cloud coverage.

Strengths
  • Unitrends has a mature portfolio of cloud backup and disaster recovery (DR) capabilities and supports a broad range of cloud providers, such as AWS S3 (and Glacier), Microsoft Azure, Google (and Google Nearline) and any OpenStack-Swift-compatible provider.

  • Unitrends has updated its deduplication algorithm to offer content-aware data reduction and improved performance.

  • To ensure recoverability, ReliableDR provides fully automated, application-level-consistency sandbox testing of physical servers, Microsoft Hyper-V or VMware vSphere VMs. This capability is available on-site, in the Unitrends Cloud and through service providers.

Cautions
  • Unitrends' most frequently shipped appliance protects less than 50TB of data.

  • NAS support is new and limited to EMC VNX and NetApp; however, SnapDiff is not supported.

  • Unitrends Forever Cloud is more financially suitable for a long-term archive process than for the shorter backup retention periods Gartner recommends.

Veeam

Veeam has increased its enterprise VM backup deployments and penetrated 70% of global enterprises. Veeam's success is largely based on its focus on solving VM-specific issues during the data protection process, and its effort to make its architecture easier to scale. The company's marketing has been very effective, and its sales momentum has been stronger than most of the competition. In the most recent release (v.9), Veeam added scale-out heterogeneous storage for a virtual backup pool, as well as additional backup options that have no impact on production VMs, such as Direct NFS (Network File System) backup. Veeam has increased its integration with both front-end storage arrays and back-end disk target appliances. Veeam's "instant recovery" function and socket-based pricing model have forced some major competitors to follow suit and develop similar capabilities and pricing models. The new Direct Restore to Microsoft Azure function allows customers to perform disaster recovery to Azure. However, Veeam is facing new competition from emerging vendors that also aim to simplify backup infrastructure management for the virtual server environment.

Strengths
  • Veeam Backup and Replication is a reliable and function-rich data protection solution for VMware and Hyper-V environments.

  • Integration with leading deduplication appliances offers customers the option of enhanced performance and storage efficiency.

  • Customers comment favorably on general code reliability, agentless granular restores and ease of use.

Cautions
  • Veeam's support for physical servers and applications on-premises and in the cloud remains very limited and unproven.

  • Veeam's many backup options require backup administrators to know various technical nuances to choose the optimized method for certain workloads and to properly size a solution.

  • Veeam's Cloud Connect is designed with managed service providers in mind and lacks API integration with public cloud object storage providers, such as Amazon and Microsoft.

Veritas Technologies

In January 2016, the new Veritas Technologies completed its separation from Symantec, bringing in new executives from outside of the company to serve as CEO, chief information officer (CIO) and chief marketing officer (CMO). A comprehensive global marketing campaign was used to launch the new company. With the new organizational structure, Veritas has now increased its dedicated sales resources tenfold. Continued improvements in support and positive reference feedback on the usability of the administrative console are positive customer experiences that are tempered for some by continued compliance-auditing practices. According to Gartner reference surveys and conference polling, Veritas' NetBackup is the most evaluated enterprise backup solution, the most deployed for organizations with 5,000 or more employees, and the second-highest response to the question of which provider would be respondents' primary backup vendor in the future. NetBackup offers a very good mix of protected platforms, robust features and enterprise-class scale delivered as software only or via its branded integrated appliances. Self-service for application administrators is provided through native integration with Oracle, SQL Server, Hyper-V and VMware, as well as for end users through a cloud portal.

Strengths
  • NetBackup is scalable to very large enterprises, with protection across a wide range of OSs and applications, and a broad range of data protection capabilities.

  • NetBackup features, such as Accelerator for fast backups and the OpenStorage Technology (OST) interface to back up disk devices, enable better management of backup storage devices and continue to be major product differentiators.

  • A consolidated administrative console with multitenant end-user restore capabilities allows low-touch management in a broad variety of deployments.

Cautions
  • Cloud integration and exploitation remain a work in progress.

  • While Replication Director has added EMC VNX and Celerra support, overall array-based hardware snapshot and replication support is narrower than some other large-enterprise providers.

  • Compliance audits initiated by Symantec in 2015 remain an ongoing practice by the new Veritas in the first half of 2016, causing some customers to seek replacement solutions.

Vendors Added and Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor's appearance in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.

Added

No vendors were added to this Magic Quadrant.

Note that the previous vendor names of HP and Symantec have changed. HP split into two entities, with Hewlett Packard Enterprise (HPE) retaining the backup software portfolio. Symantec sold its backup portfolio to a private equity group and returned the newly formed group to its pre-Symantec corporate name of Veritas, adding "Technologies" to the name to become Veritas Technologies.

Dropped

As a result of the refocus of this Magic Quadrant to the upper-end midmarket and large-enterprise environments (see specific inclusion criteria below), the following vendors are not included in this iteration of the Magic Quadrant: Asigra, Barracuda Networks, Dell and Seagate (EVault server backup assets are now owned by Carbonite).

FalconStor now focuses on the software-defined storage (SDS) market; the backup solution has been fully integrated as a feature of the FreeStor SDS solution, rather than sold as a stand-alone product.

Inclusion and Exclusion Criteria

The inclusion criteria represent the specific attributes that analysts believe are necessary for inclusion in this research.

To qualify for inclusion, vendors need to meet the following 10 criteria at the time that initial research and survey work commences (January 2016). The criteria for inclusion in the 2016 "Magic Quadrant for Data Center Backup and Recovery Software" are:

  • Vendor's qualifying backup and recovery software product(s) must possess the capability to capture data directly, and not to solely rely on other third-party and/or partner means of data capture/ingestion. In short, the vendor must own heterogeneous backup software capabilities which meet the criteria below.

  • The qualifying solution must be focused on protecting data center workloads, such as file system, operating system, database, email, content management and customer relationship management (CRM) data.

  • More specifically, the solution must support files and multiple applications on Windows and either Linux or one or more Unix OS (AIX, HP-UX, Solaris) in a physical and/or a virtual deployment supporting both VMware and Microsoft Hyper-V hypervisors.

  • The solution must be predominantly (more than 50% of the time) deployed to protect upper-end midmarket and large-enterprise data center workloads of greater than 25TB of protected data.

  • The qualifying solution must be deployed at least one-third of the time as an on-premises solution and not predominantly be a cloud-services-required solution.

  • The vendor must have a disk-based backup and recovery software solution commercially available for at least one calendar year.

  • The vendor must be able to produce upon request 10 active references using the solution in a production environment that meets the above criteria. References must be data center customers, not value-added resellers (VARs), business partners or managed service providers.

  • The vendor must actively market its branded backup and recovery products in at least two major geographic regions (for example, North America, EMEA, or Japan and APAC).

  • The vendor must have generated greater than $30 million in 2014 for new license and maintenance revenue for its on-premises data center backup and recovery software solution.

  • The vendor must be the originator of the required capabilities and meet all of the above requirements via intellectual property that they own, and not rely exclusively on third-party and/or resold solutions to meet these criteria.

Gartner will continue to cover emerging vendors, as well as vendors and products that do not yet meet the above inclusion criteria.

Based upon the criteria above, the following vendors and products are believed to have qualified for inclusion in this Magic Quadrant:

  • Actifio — Actifio Enterprise and Actifio Sky

  • Arcserve — Arcserve UDP and Arcserve Tape Backup

  • Commvault — Commvault Software (formerly Simpana)

  • EMC — Avamar, Data Protection Suite (which includes Avamar, NetWorker and Data Protection Advisor) and NetWorker

  • HPE (formerly HP) — Data Protector

  • IBM — Spectrum Protect (formerly Tivoli Storage Manager [TSM])

  • Unitrends — Unitrends Enterprise Backup

  • Veeam — Veeam Backup and Replication

  • Veritas Technologies (formerly Symantec) — NetBackup

Evaluation Criteria

Ability to Execute

Gartner analysts evaluate technology providers on the quality and efficacy of the processes, systems, methods or procedures that enable IT provider performance to be competitive, efficient and effective, and to positively impact revenue, retention and reputation. Ultimately, technology providers are judged on their ability and success in capitalizing on their vision:

  • Product/Service — This is the evaluation of how well a vendor does in building and effectively delivering the solution that the market wants and perceives as being worthy of new investments — ideally resulting in a three- to five-year strategy based on the vendor's portfolio (versus tactical or point product usage). The solution must be easily configured and managed so that the capability of the product is easily exploited. The product's completeness of overall capability, as well as the breadth and depth of the specific key features, will be considered. The overall scalability of a single instance of the solution will be taken into account. Also tracked is the level of customer interest and positive feedback.

  • Overall Viability — Viability is important because backup solutions are considered strategic, and organizations do not want to change offerings frequently. Viability is in relation to commitment to the backup portfolio, not the overall vendor, unless the vendor sells only backup solutions. Company viability, which equates to risk for the buyer, is something that data center professionals tell Gartner is important to them.

  • Sales Execution/Pricing — This criterion also includes the transparency of pricing, including line item and list pricing in a bid.

  • Market Responsiveness/Record — This criterion heavily considers the provider's recent three-year history of responsiveness in meeting, or even being ahead of the market, and being adaptable enough to maintain or achieve competitive success as opportunities develop, competitors act, customer needs evolve, and market dynamics change.

  • Marketing Execution — Marketing execution directly leads to unaided awareness (that is, Gartner end users mentioned the vendor without being prompted) and a vendor's ability to be considered by the marketplace. Gartner's end-user client search analytics results are also factored in as a demonstration of vendor awareness and interest.

  • Customer Experience — Customer experience is a very heavily weighted criterion, as data center professionals tell Gartner that they are evaluating vendors more and more on this capability. Because many products can now satisfy technical requirements, differences in product support take on greater importance.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria              Weighting
Product/Service                  High
Overall Viability                High
Sales Execution/Pricing          High
Market Responsiveness/Record     High
Marketing Execution              High
Customer Experience              High
Operations                       No Rating

Source: Gartner (June 2016)

Completeness of Vision

Gartner analysts evaluate technology providers on their ability to convincingly articulate logical statements about current and future market direction, innovation, customer needs, and competitive forces, and how well they map to the Gartner position of the future of backup and recovery. Ultimately, technology providers are rated on their understanding of how market forces can be exploited to create opportunity for the provider:

  • Market Understanding — The more visionary vendors not only can observe the customers' wants, but also can enhance those wants with their added vision, and can potentially even shape or move the market in either a new direction or accelerate market activity and trends. Market understanding, like all Completeness of Vision evaluation criteria, is overweighted toward the recent past deliverables, as they are a clear demonstration of what the vendor has actually been able to achieve. This criterion is not overly influenced by the discussion of futures that a provider may offer, which may or may not come to pass in the time frame or with the amount of capability that was promised.

  • Marketing Strategy — This relates to what vendor and backup solution message is described, how that message is communicated, what vehicles are used to effectively deliver it, and how well the buying public resonates with and remembers the message. In a market where many vendors and/or products can sound the same, or sometimes not even be known, message differentiation and overall awareness are vital.

  • Sales Strategy — The ability for the sales team to effectively and clearly communicate the current capabilities is considered, along with the future vision and roadmap, while also positively differentiating the vendors' offerings from the competition and alternative approaches.

  • Offering (Product) Strategy — The vendor's offering needs to be capable of more than meeting the current and future tasks. The product's completeness of overall capability, as well as the breadth and depth of the specific key features, will also be considered. The overall scalability of a single instance of the solution will also be considered. Vendors that deliver function ahead of the market, or influence the industry, will be deemed to have a superior product offering. The product should also be extensible, such that today's investments can easily be leveraged in the future.

  • Innovation — This criterion looks in particular at the recent past (the last three years') track record for innovation and current customer production exploitation of new capabilities, as well as the near-term (less than 12 months) upcoming feature set and longer-term (three to five years) roadmap. Customer production exploitation of new capabilities is also factored in to validate the ability of the offering to withstand the demands of a data center environment.

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria              Weighting
Market Understanding             High
Marketing Strategy               High
Sales Strategy                   High
Offering (Product) Strategy      High
Business Model                   No Rating
Vertical/Industry Strategy       No Rating
Innovation                       High
Geographic Strategy              No Rating

Source: Gartner (June 2016)

Quadrant Descriptions

Leaders

Leaders have the highest combined measures of Ability to Execute and Completeness of Vision. They have the most comprehensive and scalable product portfolios. They have a proven track record of established market presence and financial performance. For vision, they are perceived in the industry as thought leaders, and have well-articulated plans for enhancing recovery capabilities, improving ease of deployment and administration, and increasing their scalability and product breadth. A fundamental sea change is occurring in the backup and recovery market. For vendors to have long-term success, they must plan to address the legacy requirements of traditional backup and recovery, while looking to expand their integration with and exploitation of emerging applications, hypervisors, snapshot and replication technologies, and public cloud capabilities. A cornerstone for Leaders is the ability to articulate how new requirements will be addressed as part of their vision for recovery management. As a group, Leaders can be expected to be considered part of most new purchase proposals and have high success rates in winning new business. This does not mean, however, that a large market share alone is a primary indicator of a Leader. Leaders are strategic vendors, well-positioned for the future, having established success in meeting the needs of upper midsize and large data centers.

Challengers

Challengers can execute today, but they have a more limited vision than Leaders, or they have yet to fully produce or market their vision. They have capable products and can perform well for many enterprises. These vendors have the financial and market resources and the capabilities to potentially become Leaders, but the important question is whether they understand the market trends and market requirements to succeed tomorrow, and whether they can sustain their momentum by executing at a high level over time. A Challenger may have a robust backup portfolio, but has not yet been able to fully leverage its opportunities, or does not have the same ability as Leaders to influence end-user expectations and/or be considered for substantially more or broader deployments. These vendors may not devote sufficient development resources to delivering products with broad industry appeal and differentiated features in a timely manner, or may not effectively market their capabilities and/or fully exploit enough field resources to result in a greater market presence.

Visionaries

Visionaries are forward-thinking, advancing their portfolio capabilities ahead, or well ahead, of the market, but their overall execution has not propelled them into being Challengers or possibly Leaders (often due to limited sales and marketing or elongated time to initially install and configure, but sometimes due to scalability or breadth of functionality and/or platform support). These vendors are predominantly differentiated by product innovation and perceived customer benefits, but they have not achieved solution completeness, or sustained broad sales, marketing and mind share success, or demonstrated continued successful large-enterprise deployments required to give them the higher visibility of Leaders. Some vendors move out of the Visionaries quadrant and into the Niche Players quadrant, because their technology is no longer visionary (the competition caught up to them) and/or they have not been able to establish a market presence that justifies moving to the Challengers or Leaders quadrants, or even remaining in the Visionaries quadrant.

Niche Players

It is important to note that Gartner does not recommend eliminating Niche Players from customer evaluations. Niche Players are specifically and consciously focused on a subsegment of the overall market, or they offer relatively broad capabilities without very large-enterprise scale, or the overall success of competitors in other quadrants. In several cases, Niche Players are very strong in the upper-midsize-enterprise segment, and they also opportunistically sell to the large enterprises, but with offerings and overall services that, at present, are not as complete as other vendors focused on the large-enterprise market. Niche Players may focus on specific geographies, vertical markets, or a focused backup deployment or use case service, or they may simply have modest horizons and/or lower overall capabilities, compared with competitors. Other Niche Player vendors are too new to the market or have fallen behind, and, although worth watching, have yet to fully develop complete functionality, or consistently demonstrate an expansive vision or the Ability to Execute.

Context

Backup and recovery is one of the oldest and most frequently performed operations in the data center. Despite the long timeline associated with backup, the practice has undergone a number of changes (such as new recovery techniques, new deployment options and pricing models, and a new, expanded set of vendors and approaches to consider) and challenges, such as how to protect server-virtualized environments, very large databases, emerging next-generation databases and big data applications, as well as how to integrate with the cloud. Gartner end-user inquiry call volume regarding backup has been rising by about 20% each year for the past eight years. Organizations worldwide are seeking ways to easily, quickly and cost-effectively ensure that their data is appropriately protected.

Organizations are also voicing the opinion that backup needs to improve a lot, not just a little. The ongoing frustration with backup implies that the data protection approaches of the past may no longer suffice in meeting current — much less future — recovery requirements. As such, many companies are willing to adopt new technologies and products from new vendors, and they have shown an increased willingness to augment or even completely switch backup and recovery providers to better meet their increasing service-level needs, overall cost requirements and ease of management needs.

Ransomware is on the rise, and backup remains the best protection against data loss. As a fail-safe, organizations should implement enterprise endpoint backup for laptops/workstations, and set recovery point objectives (RPOs) for each server deemed to be at greater risk from ransomware, based on the amount of data loss that is acceptable to the organization.
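One practical way to act on that guidance is to routinely verify that each protected server's newest successful backup still falls within its assigned RPO. The minimal Python sketch below shows such a check; the server names, RPO values and timestamps are invented for illustration.

```python
from datetime import datetime, timedelta

# Invented inventory: per-server RPO and the timestamp of the last good backup.
servers = {
    "file-server-01": {"rpo": timedelta(hours=4),
                       "last_backup": datetime(2016, 6, 8, 1, 30)},
    "erp-db-01":      {"rpo": timedelta(hours=1),
                       "last_backup": datetime(2016, 6, 8, 5, 30)},
}

def rpo_violations(servers, now):
    """Return the servers whose newest backup is older than their RPO allows."""
    return [name for name, info in servers.items()
            if now - info["last_backup"] > info["rpo"]]

if __name__ == "__main__":
    # At 06:00, file-server-01 is 4.5 hours past its last backup (RPO 4h): flagged.
    print(rpo_violations(servers, now=datetime(2016, 6, 8, 6, 0)))
```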

Market Overview

For years now, many organizations have continued to rearchitect their backups in an effort to modernize their approach to handle new data types and deployment models, and increased workload volumes, and to improve backup and restore times to meet rising SLAs. Disk-based solutions (including backup directly to disk and perhaps additionally to a cloud target, array-based snapshot and replication exploitation, server virtualization backup features, and leveraging compression, deduplication and other data management efficiency technologies) are among the key items being sought. Ease of deployment, with a rapid time to value, and a greater ease of daily administration are key requirements. Mission-critical workloads are predominantly being deployed in server-virtualized environments, making capable, scalable VM backup a mainstream requirement.

Gartner sees that many organizations are willing to deploy multiple backup solutions in an attempt to best match the needs of what is being protected (endpoints, remote office, VMs, SharePoint, etc.), to contain product costs, and/or to implement a solution that the staff will find easy to use. As a result, large vendors are no longer viewed as being safe choices, with many large vendors losing market share to emerging providers. Today, the market is willing to take on more risk with vendors and solutions than in the past.


Small & Midsize (<1000 Employees) Context

Market Differentiators

Just 10 years ago, two backup vendors dominated the midmarket. Today the market is fragmented, and IT staff in midsize organizations are faced with a broad range of data protection choices and options in a market segment that is up for grabs. The problem is further compounded by the lack of a majority approach, or a consensus between vendors and midmarket IT leaders alike, for how best to manage a midsize organization's backup requirements. It is even possible to observe a regular cadence of product entries from adjacent market segments by nontraditional backup vendors offering the next new thing to address the long-standing pain point of backup.

Scaling IT in a midsize organization frequently means that IT administrators use a multidisciplinary approach when delivering diverse IT services. Due to competing demands for time, midmarket organizations tend to revisit the backup problem only when the unreliability of the existing solution is so disruptive that backup modernization can no longer be ignored. Because resources are scarce, even organizations selecting a new solution rarely have the ability to conduct exhaustive proofs of concept with several vendors. The net result is that, frequently, older backup software versions are still in use, perhaps with an outdated backup policy and methodology, and often run on aged infrastructure. All of this leads to brittle, more error-prone backup practices that can yield lower backup success rates, and lower recovery success rates still.

The midmarket tends to be the most receptive to trying new technologies and approaches because there is little risk in departing from the status quo of past operational or technological decisions. Well-known names in backup today, such as Commvault and Veeam Software, were initially much more heavily adopted by midmarket organizations. The next round vying for attention from the midmarket formed with Nimble Storage and SimpliVity positioning hybrid storage arrays and hyperconverged integrated systems (HCIS) with embedded backup. More recently, vendors such as Cohesity and Rubrik are getting attention less than one year after launch for their scale-out backup and recovery platforms.

The average midmarket organization is highly virtualized and has a backup footprint of 65TB, with almost 60% of organizations protecting less than 50TB, which directly affects which backup products, methodologies and media are relevant and most appropriate. Midmarket I&O leaders should place greater weight on soft features such as (1) ease of administration and reporting, (2) product reliability and (3) product support, and de-emphasize potentially superfluous capabilities such as (1) enterprise data center scalability and (2) broad operating system and application support, which come with a commensurate price tag and additional complexity.

Considerations for Technology and Service Selection

Seek a solution-oriented vendor that offers the following:

  • Simplified pricing, maintenance, contracts and licensing or bundled solutions with an all-inclusive pricing scheme

  • Integrated appliances, which eliminate the need, and the potential complexity, of sizing and procuring separate hardware

  • Local postsales service partners and responsive support

  • Rapid hardware replacement SLAs

An intuitive interface and usability for the IT generalist should ideally include:

  • Minimal training required with wizard-based options for common tasks

  • Simple, ideally out-of-the-box, deployment with short time to value following nominal setup

  • A purpose-built admin console, not a trimmed-back version of an enterprise interface

  • An easy-to-integrate choice of traditional or deduplicating disk target devices

  • Quick definition of backup policies

  • Autodiscovery and backup of new virtual machines on protected hypervisor hosts (a conceptual sketch of this behavior follows this list)

  • An intuitive restore UI for the IT generalist, even for granular or redirected restores

  • Summary reports on service levels (for example, success/failure and critical alerts) to allow management by exception
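
To illustrate the autodiscovery bullet above, the following is a minimal Python sketch of how a backup tool might reconcile a protected hypervisor host's VM inventory against its backup policy. The hypervisor client, policy object and their methods (list_vms, protected_vm_uuids, add_vm) are hypothetical stand-ins for illustration, not any specific vendor's API.

def autodiscover_and_protect(hypervisor, backup_policy):
    """Add any VM on the protected host that is not yet covered by the policy."""
    discovered = {vm.uuid: vm for vm in hypervisor.list_vms()}   # current host inventory
    protected = set(backup_policy.protected_vm_uuids())          # UUIDs already in the policy

    for uuid, vm in discovered.items():
        if uuid not in protected:
            backup_policy.add_vm(vm)   # new VM inherits the policy's default schedule/retention
            print(f"Auto-added new VM to backup policy: {vm.name}")

    # Flag protected VMs that have disappeared from the host so stale jobs can be reviewed.
    for uuid in protected - set(discovered):
        print(f"Protected VM no longer present on host: {uuid}")

In practice, a product would run this reconciliation on a schedule or in response to hypervisor inventory events, so that newly created VMs are protected without administrator action.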

Cloud as a backup storage target, or at least as a future option, should include:

  • Clear TCO, including overview of exit costs to return or transfer data

  • Straightforward encryption key management

  • A local backup cache to overcome internet bandwidth constraints when performing restores

  • In-cloud restores for faster granular access to a smaller number of files and/or for disaster recovery of selected virtual machines

  • Deduplication prior to data transmission, for bandwidth reduction and cloud storage cost control (a minimal sketch follows this list)

  • Ability to pin selected data on-site so that only rehydration of deltas or nonpinned files and applications from the cloud is required
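
The deduplication bullet above can be made concrete with a minimal Python sketch of source-side deduplication, under simplifying assumptions: data is split into chunks, each chunk is identified by a content hash, and only previously unseen chunks are transmitted. The upload_chunk callable and the fixed chunk size are illustrative assumptions; commercial products typically use variable-size chunking and their own transport and index.

import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # fixed 4MB chunks, for illustration only

def backup_file(path, sent_chunk_hashes, upload_chunk):
    """Send only chunks whose content hash has not already been transmitted."""
    manifest = []  # ordered list of chunk hashes needed to reconstruct this file
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            manifest.append(digest)
            if digest not in sent_chunk_hashes:
                upload_chunk(digest, chunk)       # only new data crosses the wire
                sent_chunk_hashes.add(digest)
    return manifest  # duplicate chunks are referenced by hash, saving bandwidth and cloud storage

A restore then only needs the manifest plus access to the chunk store, which is why a local cache of frequently referenced chunks (see the bullet above) can shorten restore times over constrained internet links.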

Considerations for outsourcing backup as SaaS or to an MSP include the following:

  • SLAs that meet RTO/RPO objectives

  • A clear services pricing model that avoids anything that might distort TCO, and an overview of exit costs at the end of the relationship

  • A provider that can combine backup and disaster recovery (DR), as this is often a gap for midmarket organizations

Notable Vendors

Vendors included in this Magic Quadrant Perspective have customers that are successfully using their products and services. Selections are based on analyst opinion and references that validate IT provider claims; however, this is not an exhaustive list or analysis of vendors in this market. Use this perspective as a resource for evaluations, but explore the market further to gauge the ability of each vendor to address your unique business problems and technical concerns. Consider this research as part of your due diligence and in conjunction with discussions with Gartner analysts and other resources.

Barracuda Networks

Barracuda Networks offers a portfolio of application delivery, security and data protection appliances, with a focus on ease of ordering and delivery (it builds its own appliances), ease of daily administration, and backup capabilities for small to midsize Windows and Linux implementations running primarily Microsoft applications and VMware and Hyper-V virtualization. Barracuda Backup is delivered as a virtual appliance, or offered in physical appliances ranging from 250GB to 56TB of protected data, with backup data replicated to the Barracuda Cloud for disaster recovery and near-instant failover. Multiple appliances can be centrally reported on and managed from the Barracuda Cloud. Barracuda can also protect SaaS environments, such as Microsoft Office 365, with cloud-to-cloud backup.

Consider Barracuda when an all-encompassing backup software, appliance and cloud service is desired.

Dell

Dell's backup portfolio includes software and appliances that target small and midsize environments: the Dell Data Protection | Rapid Recovery, NetVault and vRanger backup applications, as well as a bundle of all of these products in the Dell Backup & Disaster Recovery Suite (BDRS). Rapid Recovery offers short RPOs (as frequent as five minutes) and RTOs (with near-instant Live Recovery) for Windows and Linux environments; NetVault can back up more heterogeneous and larger host environments, including network-attached storage (NAS), with strong physical tape backup capabilities; and vRanger focuses on agentless virtual server backup. Rapid Recovery is often deployed via the Dell DL series of appliances. Note that Dell is currently in the process of acquiring EMC, and significant changes to the backup portfolio may result at any time before, or shortly after, the companies become a single legal entity later in 2016.

Consider Dell when Windows or Linux physical or virtual machine protection software, potentially delivered as an appliance, is of interest.

Microsoft

Microsoft offers Data Protection Manager (DPM) as a standard part of its System Center suite (2012 R2 being the latest version) to back up Windows file systems and Microsoft enterprise applications on physical and virtual machines on both Hyper-V and VMware, as well as Linux VMs on Hyper-V. Due to DPM's tight integration with System Center and the absence of an additional license fee, it is widely used by System Center administrators in the midmarket. Over the past year or two, Microsoft has focused on integrating DPM with Azure Backup and on fixing bugs and deficiencies through its recent update rollups to stabilize the product. DPM can run on-premises or in the Azure cloud, and snapshots of DPM servers can be sent via the Azure Backup service to an Azure Backup vault as a long-term retention tier for backups.

Consider Microsoft DPM for organizations with small and midsize Windows environments looking for Microsoft-native data protection capabilities, often at no additional cost.

Veritas Technologies

After years of an uncertain future under Symantec, the recently independent Veritas has stepped up Backup Exec's release cadence, focusing on maintaining platform support currency, improving performance, and introducing or enhancing features between major releases. Backup Exec's two front-end, capacity-based pricing tiers offer foundational coverage for commonly deployed operating systems, applications and hypervisors, plus a la carte expansion for advanced features, including flexible options for a choice of backup targets and cloud. Physical hosts require agents, while for virtual environments Backup Exec integrates at the hypervisor level to autodiscover VMs, add them to the defined policy, and perform efficient changed-block-level backups for more frequent RPOs without requiring an agent in each guest (a conceptual sketch of the changed-block idea follows this entry).

Consider Veritas Backup Exec for midmarket organizations seeking a mature, scalable and feature-rich backup solution with capabilities ordinarily found only in enterprise products.
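
As a rough illustration of the changed-block idea mentioned above, and not a description of Backup Exec's actual implementation, the Python sketch below compares per-block hashes of a virtual disk image against those recorded at the previous backup and emits only the blocks that differ. Real hypervisor-integrated products avoid rereading the whole disk by having the hypervisor report changed extents directly; the hashing here merely stands in for that bookkeeping.

import hashlib

BLOCK_SIZE = 64 * 1024  # illustrative block size

def changed_blocks(disk_image_path, previous_block_hashes):
    """Yield (block_index, data) for blocks that differ from the previous backup.

    previous_block_hashes maps block index -> SHA-256 hex digest recorded last cycle;
    it is updated in place (once the generator is exhausted) to become the baseline
    for the next incremental backup.
    """
    new_hashes = {}
    with open(disk_image_path, "rb") as disk:
        index = 0
        while True:
            block = disk.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            new_hashes[index] = digest
            if previous_block_hashes.get(index) != digest:
                yield index, block   # only changed blocks go into the incremental backup
            index += 1
    previous_block_hashes.clear()
    previous_block_hashes.update(new_hashes)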

Honorable Mentions

Vendors that would also be appropriate on a midmarket organization's shortlist include: Acronis, Asigra, Axcient, Carbonite, CTERA Networks, Datto, Druva, Infrascale and StorageCraft.

Evidence

Placement on the Magic Quadrant is based on Gartner's view of a vendor's performance against the criteria noted in this research. Gartner's view regarding vendor placement on the Magic Quadrant is heavily influenced by more than 1,700 inquiries and conference one-on-one meetings conducted during the past 12 months with Gartner clients on the topic of backup/recovery software and integrated backup appliances. Gartner also utilizes worldwide end-user surveys, Gartner conference kiosk surveys and Gartner conference session polling data. The Magic Quadrant methodology includes the solicitation of references from each vendor; for this Magic Quadrant, Gartner conducted over 131 reference checks (via electronic survey and/or live interview) from a set of customers provided by each vendor. The included vendors submitted nearly 500 pages of responses to Gartner's Magic Quadrant survey on this topic, which were used as the basis for subsequent vendor briefings and follow-up meetings, demos, and correspondences.

Additional input comes from other Gartner analysts, industry contacts, and public sources such as U.S. Securities and Exchange Commission filings, articles, speeches, published papers and public domain videos.

Evaluation Criteria Definitions

Ability to Execute

Product/Service: Core goods and services offered by the vendor for the defined market. This includes current product/service capabilities, quality, feature sets, skills and so on, whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.

Overall Viability: Viability includes an assessment of the overall organization's financial health, the financial and practical success of the business unit, and the likelihood that the individual business unit will continue investing in the product, will continue offering the product and will advance the state of the art within the organization's portfolio of products.

Sales Execution/Pricing: The vendor's capabilities in all presales activities and the structure that supports them. This includes deal management, pricing and negotiation, presales support, and the overall effectiveness of the sales channel.

Market Responsiveness/Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor's history of responsiveness.

Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization's message to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional initiatives, thought leadership, word of mouth and sales activities.

Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements and so on.

Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure, including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.

Completeness of Vision

Market Understanding: Ability of the vendor to understand buyers' wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen to and understand buyers' wants and needs, and can shape or enhance those with their added vision.

Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.

Sales Strategy: The strategy for selling products that uses the appropriate network of direct and indirect sales, marketing, service, and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.

Offering (Product) Strategy: The vendor's approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature sets as they map to current and future requirements.

Business Model: The soundness and logic of the vendor's underlying business proposition.

Vertical/Industry Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including vertical markets.

Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the "home" or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.





Jul
2
FROM CTR TO IBM: SHAPING THE ENTERPRISE
Posted by Thang Le Toan on 02 July 2017 04:38 AM

Nothing can substitute for the ability to see the fundamentals clearly. It is worth noting that at the core of many great companies lies a business slogan that is both descriptive and inspiring. In two of the cases discussed here, this phenomenon is plain to see. At Kodak it was "You press the button, we do the rest." At Ford: "A car for the great multitude." Other firms from the 1920s onward put forward similar core values. At Coca-Cola, for example, it was "the pause that refreshes." At General Motors: "A car for every purse and purpose." At the American Telephone and Telegraph Company: "One policy, one system, universal service."

These slogans did not merely lodge themselves in consumers' minds; they also helped train the sales force and shape its thinking. They seemed to give salespeople the belief that they were doing something more meaningful than pressuring reluctant customers into buying products. They were serving humanity. The rhetoric of social service pervaded American companies, especially in the 1920s.

Watson did not craft a phrase like the slogans above, but he used a single word that became the company's identity, a banner raised behind its every step. That word was "THINK." The focus on this one word became a force that united the company.

Like much else at IBM, THINK had its origins at NCR. Watson was trying to concentrate everyone's attention on the sales message. On one occasion, standing on stage, he wrote the word THINK on one of the easels that stood everywhere in NCR's headquarters. Patterson happened to see it and liked the effect so much that he had THINK signs made and posted throughout the office building. THINK followed Watson to CTR, where it became something sacred. In the skits staged in the company's training building, it was inscribed on the top step of the "stairway of learning."

THINK was a puzzling battle cry. What did the word mean in the context of IBM's business? Business is action, not contemplation. THINK did not mean "think independently." When Watson said THINK, he did not mean "think for yourself." He meant "think as I think." Watson was not an "organization man" in a gray suit. He defined himself as a leader who had all the answers. What IBM employees were supposed to think about were the views and attitudes of their president, Thomas J. Watson. As one journalist put it in the Saturday Evening Post: "Thomas J. Watson is perhaps the best example of the day when one man dominates a business. Within his organization, members of the leadership say that President Watson is the International Business Machines Company." In IBM parlance, the company's president was not the president but the Leader. By the early 1940s, Watson placed great trust in three assistants who kept him up to date on the business because, as one reporter put it, "their entire careers had been under his patronage and they thought exactly as he thought."

Entirely by coincidence, the product that allowed Watson to achieve his great business success was itself a machine that could think. The word "think" here does not suggest a machine that plays with ideas. Rather, the machine assembled purposeful data and carried out calculations based on that data.

Of CTR's three components, it was the T -- the Tabulating Machine Company -- that had been folded into the combination almost as a last-minute afterthought. The C, standing for "computing" in the company's name, was no harbinger of the computer; rather, that business made and marketed scales for a variety of uses. The R, short for "recording," was the time-clock business.

The T, standing for "tabulating," was genuinely a field of great potential. The tabulating machine could deliver enormous benefits if developed further. It was a machine that could cut into the least productive cost that any business faces: the cost of calculation and record keeping. Hollerith had proved its value through its application to the census. If refined, what else might this marvelous machine be able to do?

The most penetrating insight of Watson's entire career concerned the potential of the tabulating machine. Like Carnegie, Eastman and Ford, and like other CEOs, Watson had the gift of not being blinded by the past or the present. He did not back into the future. Rather, he envisioned what could happen and poured his effort into it, placing his company on the crest of the era's wave. In 1920, tabulating was still the smallest of CTR's three businesses. Time-recording equipment was the most profitable, with scales in second place.

But that was the past and the present. Looking to the future, Watson saw that machines which cut office labor had unlimited growth potential. The most important decision any company makes is which market it chooses to serve, and with what kind of product. Watson's breakthrough insight lay in choosing to push the tabulating machine, to businesses and to government agencies as well. Without this strategic vision, Watson would certainly still have succeeded, but he would never have become a monument of business.

Nguyễn Huy Tuân – Faculty of Business Administration (QTKD)





