New Quantum backup appliance brings Veeam to tape
Posted by Thang Le Toan on 17 August 2018 12:16 AM

Integrated disk backup appliances are now common, but Quantum and Veeam have taken the converged architecture to tape backups to eliminate the need for a dedicated external server.

Quantum and Veeam today launched what they call a converged tape appliance, which integrates Veeam data protection software with Quantum backup.

Until now, Veeam Backup and Replication users could only create tape backups by connecting a dedicated external server to host the Veeam software. The new Quantum backup appliance removes that layer of complexity by installing a blade server in its Scalar i3 tape library. The server is preconfigured to be Veeam-ready, with Windows Server 2016 -- Veeam's server OS of choice -- preinstalled.

By installing a server in a tape device, Quantum has created the industry's first tape appliance with built-in compute power.

"In some ways, this is a new category of product," said Eric Bassier, senior director of product management and marketing at Quantum, based in Colorado Springs, Colo. "In Veeam environments, every other tape vendor's tape requires a dedicated external server that runs the Veeam tape server software ... We took that server, we built it into the tape library so that we eliminate that physical server, and we make it that much simpler and faster to create tape for offline protection."

Customers can buy a device with one IBM SAS LTO-8 tape drive for $17,000 or a two-drive version for $23,000.

Tape storage has been around for a long time and "still remains one of the lowest-cost, long-term ways to store your data," said Ken Ringdahl, vice president of global alliance architecture at Veeam, based in Baar, Switzerland. But Bassier stressed the new Quantum backup appliance's true role in the modern backup and recovery system is it protects against ransomware.

Render of Quantum Scalar i3 tape appliance


Quantum's Scalar converged tape appliance for Veeam

"It's offline. Data stored on tape is not connected to the network in any way. Because it's offline, it is the best and most effective protection against ransomware," Bassier said.

The ransomware threat has brought on a renewed interest in tape for backup.

Edwin Yuen, senior analyst at Enterprise Strategy Group in Milford, Mass., said ransomware has gotten more sophisticated over the past 12 to 18 months, and tape provides an offline backup method.

"Ransomware is not an acute event," Yuen said. "You're getting infected, and it's sitting there, waiting. Oftentimes, it's mapping out or finding other backups or other restores."

Storing data offline in a tape cartridge like the new Quantum backup option provides an air gap between live production systems and backed up data that is not possible to achieve with disk. That air gap can prevent ransomware from infecting live data.

If you really think about tape, it's one of those technologies that got dismissed, but never actually went away.
Edwin Yuen, senior analyst, Enterprise Strategy Group

"If you really think about tape, it's one of those technologies that got dismissed, but never actually went away. It was consistently used; it just wasn't in vogue, so to speak. But there's certainly been a renewed interest in new uses for tape," Yuen said. "This integration by Quantum and Veeam really makes it a lot easier to bring tape into this configuration, so you can take advantage of that air gap."

According to Yuen, thanks to market maturity and the age of magnetic tape technology, there are now only a few major companies that manufacture tape libraries. This is why Yuen said he finds the partnership between Quantum and Veeam especially noteworthy, as it demonstrates a relatively young company showing interest in tape.

"The fact that these two companies came together shows interest across the board," Yuen said. "It's not a 20-year standard industry company, but one that's been an up-and-comer now getting into the tape market through this appliance."

Read more »

Posted by Thang Le Toan on 24 May 2018 01:57 AM

Robojournalism is the use of software programs to generate articles, reports and other types of content. 

Sophisticated content generation programs rely upon a combination of artificial intelligence (AI), data analytics and machine learning to produce content that can be hard to differentiate from that written by a human.


When an earthquake struck Los Angeles in the early morning hours of February 1, 2014, a content generation algorithm created by programmer/journalist Ken Schwencke posted the story to the L.A. Times within eight minutes of the tremor, complete with a map pinpointing the epicenter. 

Schwencke's software is designed to receive structured data from the US Geological Survey (USGS) and to determine, based on an earthquake's magnitude and proximity to California, whether it is news. The content generation program assembles the details in a vocabulary specific to the subject matter, including typical journalistic terms and turns of phrase. 
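The pipeline Schwencke describes can be sketched in a few lines. Everything below is illustrative: the magnitude threshold, the 100 km radius, the field names and the template are assumptions, not the real Quakebot rules, which have not been published.

```python
import math

# Hypothetical newsworthiness thresholds -- assumed for illustration.
MIN_MAGNITUDE = 3.0
LA = (34.05, -118.24)  # downtown Los Angeles (lat, lon) in degrees

def distance_km(a, b):
    # Haversine great-circle distance between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def is_news(event):
    # Newsworthy if strong enough and close enough to the metro area.
    return (event["mag"] >= MIN_MAGNITUDE
            and distance_km((event["lat"], event["lon"]), LA) <= 100)

def write_story(event):
    # Assemble the structured details into subject-specific boilerplate.
    return (f"A magnitude {event['mag']:.1f} earthquake struck "
            f"{event['place']} on {event['time']}, according to the USGS.")

quake = {"mag": 4.4, "lat": 34.13, "lon": -118.49,
         "place": "Westwood, California", "time": "Monday morning"}
if is_news(quake):
    print(write_story(quake))
```

The editorial judgment lives entirely in `is_news`; the prose generation is just a template fill, which is why such stories read as competent but formulaic.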

Ben Walsh makes a presentation about what he calls "human-assisted journalism":

Read more »

VeloCloud-VMware acquisition will battle Cisco in the branch
Posted by Thang Le Toan on 06 November 2017 11:36 PM

The VeloCloud-VMware acquisition will mark the first time VMware will compete directly with Cisco in networking. Cisco, however, remains the 800-pound gorilla.

VMware plans to acquire SD-WAN vendor VeloCloud Networks, a move that would turn the branch office into a battleground for the virtualization provider and Cisco.

The VeloCloud-VMware acquisition, announced this week, is expected to close in early February. With VeloCloud, VMware would go head-to-head against Cisco's Viptela, IWAN and Meraki brands. SD-WAN, in general, intelligently routes branch traffic across multiple links, such as broadband, MPLS and LTE.

"This is the first time that Cisco and VMware will directly compete in the networking world," said Shamus McGillicuddy, an analyst at Enterprise Management Associates, based in Boulder, Colo.

Before, the closest Cisco and VMware came to competing in networking was with their software-defined networking platforms, ACI and NSX, respectively. The products, however, serve mostly different purposes in the data center. NSX provisions network services within VMware's virtualized computing environments, while ACI distributes application-centric policies to Cisco switches.

VMware SDN marches to the branch

The VeloCloud-VMware acquisition, however, marks the start of taking NSX to the branch, where Cisco is already offering ACI. Both vendors are also working on extending their respective SDN platforms to enterprise software running on public clouds.

In the branch, VMware plans to provide SD-WAN, security, routing and other services on an NSX-based network overlay that's hardware agnostic. Rather than supply branch appliances for VeloCloud software, VMware wants customers to buy certified hardware from different vendors.

This is the first time that Cisco and VMware will directly compete in the networking world.
Shamus McGillicuddy, analyst, Enterprise Management Associates

"That is certainly our longer-term vision for this. That it will be a pure software play," said Rajiv Ramaswami, COO of cloud services at VMware, during a conference call with reporters and analysts.

In the short term, VMware would support appliances sold by VeloCloud, Ramaswami said. VMware's parent company, Dell EMC, also sells hardware for VeloCloud software.

While VMware shies away from hardware, Cisco has delivered centralized software that provisions network services to the branch through a new line of routers, called the Catalyst 9000s. In the future, Cisco could also provide a software-only option through the Enterprise Network Functions Virtualization (ENFV) platform the company introduced last year. ENFV would run on Cisco servers or third-party certified hardware.

"Cisco is making multiple bets in SD-WAN," McGillicuddy said.

Cloud orchestration a key piece of VeloCloud-VMware acquisition

VMware is banking on VeloCloud's cloud-based network orchestration tools to evolve into a significant differentiator from Cisco and other WAN infrastructure providers. VMware could eventually use the technology to orchestrate network services in the branch and the cloud, Ramaswami said.



VMware's ambitions do not alter the fact that it has a difficult road ahead battling Cisco. The latter company dominates the networking market with more than 150,000 paying customers for its WAN products, according to Gartner. VMware is the largest supplier of data center virtualization, but is a newbie in networking.

VeloCloud's roughly 1,000 customers include service providers, as well as enterprises. AT&T, Deutsche Telekom, Sprint, Vonage and Windstream are examples of carriers that offer the company's SD-WAN product as a service.

VMware sells network virtualization software to service providers and expects VeloCloud to help grow that relatively small business. "VeloCloud and their deep relationship with the service provider community is a huge route-to-market accelerator," said Peder Ulander, a vice president of strategy at VMware.

Read more »

What is telehealth, and what is a telehealth platform?
Posted by Thang Le Toan on 06 October 2017 11:59 PM

Telehealth is the transmission of health-related services or information over the telecommunications infrastructure. The term covers both telemedicine, which includes remote patient monitoring, and non-clinical elements of the healthcare system, such as education.


Telehealth examinations can be performed by physicians, nurses or other healthcare professionals over a videoconference connection to answer a patient's specific question about their condition. A telehealth visit can also be a remote substitute for a regular physician exam or as a follow-up visit to a previous care episode.

Convenience, for both sides of the care equation, is one of the major benefits of telehealth. Patients can communicate with physicians from their homes, or the patient can travel to a nearby public telehealth kiosk where a physician can conduct a thorough inspection of the patient's well-being.

In the United States, differences in state telemedicine licensure laws complicate the practice of telehealth. Some states require physicians to have full medical licenses to be able to practice telemedicine, while other states mandate physicians have special telemedicine licenses. Medicare and Medicaid reimbursement for telehealth services, such as remote checkups, has slowly been catching up to the level of in-person healthcare and the majority of states provide some amount of financial reimbursement to providers who perform telehealth visits.

The American Medical Association is one of the major healthcare groups that called for standards to be applied to telehealth to give patients more access to remote care services. The American Telemedicine Association, established in 1993, promotes the delivery of care through remote means and hosts a yearly conference on the latest news and developments in telehealth. The U.S. Department of Veterans Affairs (VA) also supports the development of telehealth. A bill introduced in Congress in 2015 would allow qualified VA health professionals to treat U.S. veterans without requiring the patient and physician to be in the same state.


What is the difference between telehealth and telemedicine or are the terms always used interchangeably?

Read more »


Books are an invaluable source of human knowledge, so use them as the key that opens the door to success.


When Satya Nadella took over as CEO of Microsoft in early 2014, he declared that he would lift the entire company into a new era of breakthrough growth, and what he has accomplished so far shows that Nadella is on the right track.

The crux of Nadella's push is the "growth mindset," which stresses the importance of learning from others and from one's own past mistakes in order to keep strengthening, developing and improving oneself.

In an interview with Bloomberg's Dina Bass, Nadella shared that "Mindset: The New Psychology of Success," published in 2007 by Stanford psychology professor Carol Dweck, helped shape his instincts, his character and his growth-mindset principles, which he ultimately adopted as a core standard at Microsoft.

Satya Nadella talks with employees

Here is what Nadella shared:

"There's a fairly simple observation Professor Carol Dweck makes: if you take two people, one who is a constant learner and one who is a know-it-all, the learner will always come out ahead overall, even if they show less obvious ability at the start."

Dweck's book digs into the deeper aspects of this idea. Some people carry a preprogrammed "fixed mindset," believing their talent is innate and that further effort is wasted. Others have a "growth mindset," believing that any problem can be solved with dedication and hard work.

"It's not always the people who start out the smartest who end up the smartest," Dweck writes in her book.

Indeed, this philosophy has had a very positive effect on Microsoft's growth, going on to shape how the company runs its computer business after its failure in the smartphone market. Nadella has since weighed more options for the future, including the company's relationship with its awkward rival Linux and the challenges that come with it.

Professor Carol Dweck of Stanford

As Nadella told Bloomberg: "I need to clear my head and ask myself, 'Have I been too rigid at times, or have there been times when I failed to apply the standard I set?' If I can fully master that, I believe the company will achieve even more brilliant results, just as everyone expects."

Microsoft co-founder Bill Gates is also a devoted reader of Carol Dweck's philosophy in the "Mindset" book mentioned above.

Nadella's comments have helped push "Mindset" to the top of Amazon's best-seller lists, even though the book is six years old.

As for the book's content, Dweck explains why talent alone is not what secures success, but rather how we form our thinking and outlook around it. She stresses that overvaluing innate ability not only fails to encourage a growth mindset but can backfire and hinder achievement, much like raising children by playing only to their self-esteem. Her core message to readers is that a mindset grounded in willpower and patience, one practiced by many accomplished people around the world, can change the outcome of an entire journey.

Translated and condensed from TechInsider.

Read more »

Veeam revenue benefits from more $1 million deals
Posted by Thang Le Toan on 21 July 2017 02:26 AM

Veeam Software today reported a 27% year-over-year increase in total bookings and 53% year-over-year growth in deals of more than $100,000 during the second quarter of this year, fueled by larger customers and cloud backup.

The Veeam revenue spike came from adding 13,000 new customers last quarter; the company finished June with 255,000 customers worldwide and claims it is adding an average of 4,000 new customers each month.

“We are continuing to see growth in total customers,” CEO Peter McKay said. “All the markets are growing (but) for us the enterprise market is the one that is growing the fastest for us. Two years ago, we expanded our market into the enterprise. That is something we have paid a lot of attention to.


“The deals are quite big. We have done more million-dollar deals this year than we did all of last year.”

A privately held company, Veeam does not disclose a specific breakdown of its figures every quarter but last January it reported numbers for the overall 2016 year. Veeam hit $607.4 million in bookings in 2016, which included new license sales and maintenance revenue, compared to $474 million in 2015.

At the VeeamON user conference in May, McKay put the Veeam revenue goals at $1 billion by 2018 and $1.5 billion by 2020.

Veeam started out in 2006 as a virtual machine backup vendor, and now focuses on the cloud. McKay said it is finally seeing enterprises move their data protection to the cloud after a long reluctance to do so.

“Ten or 15 years ago, any opportunity was about virtualization,” McKay said. “Today, it’s about having the ability to move applications to the cloud. It’s definitely a hybrid cloud story (for enterprise customers). They want to be able to move applications to the cloud when they are ready. It’s more of a cloud readiness thing that we are seeing. They want to go at their own pace.”

The bulk of Veeam's growth last quarter came from its flagship Availability Suite. The suite handles backup, restore and replication through Veeam Backup and Replication, along with monitoring, reporting and capacity planning in Veeam ONE, for VMware vSphere and Microsoft Hyper-V deployments.

The company also reported that the Veeam Cloud and Service Provider (VCSP) program, which offers disaster recovery as a service (DRaaS) and backup as a service (BaaS), generated 79% year-over-year growth in 2016. License bookings grew 57% annually from enterprise-level customers.

This last quarter marked the first full quarter of Veeam’s HPE partnership. Veeam Software is integrated with HPE 3PAR StoreServ, HPE StoreVirtual and HPE StoreOnce for data and application availability and monitoring.

Veeam last month also said it would add support for Nutanix AHV, a KVM-based hypervisor, to the Veeam Availability Suite later in 2017. The company also has a strategic relationship with Pure Storage, integrating Veeam Backup and Replication with Pure's snapshots. Also, Veeam provides backup for the IBM Bluemix cloud computing platform.

In May, the vendor added Veeam Availability for Amazon Web Services (AWS) for enterprises that want to move to multi-cloud or hybrid cloud environments via agentless backup and recovery of AWS instances. The offering works with N2W Software's Cloud Protection Manager, so enterprises can copy data from AWS to a Veeam-hosted repository for backups and cross-platform disaster recovery.

Read more »

Evaluate data backup and restoration efforts to gain business value
Posted by Thang Le Toan on 21 July 2017 02:25 AM

In many organizations, data backups are performed at a much higher rate than restores. Fewer restores are never a bad thing, but is there a way to back up data more efficiently?

Backups are a necessary, if tedious, component of storing data. We back up data regularly -- daily or even multiple times per day -- knowing we can restore it when the inevitable happens. But how often do we restore? When it comes to data backup and restoration, what is the ratio of backup volume to restore volume?

Because data loss is a common concern, it's likely that organizations transfer hundreds of times more data to backup than they end up restoring. Surely there must be some way to get more business value from this effort. Alternatively, is there a way to reduce the effort to achieve the current business value?

Layers of backups

The usual approach to data backup and restoration is to back up an entire computer. However, this may not be the most efficient way to protect applications and their data.

Many applications in our data centers have their own recovery strategies. If we use these built-in features, then we may not need so many backups.

Microsoft Windows file shares have a previous version functionality that enables user self-service restores for deleted files. Database applications use logs to enable point-in-time recovery from a less recent backup. If we are aware of these layers of protection, we can adjust our backups to be more efficient and, potentially, less frequent.
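As a toy illustration of point-in-time recovery (not any particular database's log format), a restore starts from the last full backup and replays logged changes up to the requested moment:

```python
# A "base backup" is a dict snapshot; the transaction log is a list of
# timestamped changes. Both are made-up data for illustration.
base_backup = {"balance": 100}
log = [
    (1, "balance", 120),
    (2, "balance", 90),
    (3, "balance", 250),
]

def restore_to(point_in_time):
    # Start from the last full backup, then replay log entries
    # up to (and including) the requested timestamp.
    state = dict(base_backup)
    for ts, key, value in log:
        if ts > point_in_time:
            break
        state[key] = value
    return state

print(restore_to(2))  # state as of timestamp 2 -> {'balance': 90}
```

Because the log fills the gap between backups, the full backup itself can be taken less often without losing recovery granularity.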

How often and how much do you back up?

The more frequently you back up, the more granular your restore; however, you also transfer more data and require more space for these additional backups.

Most modern data backup applications do not create full backups every time; they make one full data copy and follow that with incremental backups to minimize the data transferred. The efficiency of moving from full copies to incremental has enabled more frequent backups, possibly without regard for the value of these backups. If we understand the business value of the application, we may be able to reduce the frequency of the backups based on the business risk of data loss.
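Back-of-the-envelope arithmetic shows why incrementals changed backup economics. The data-set size and daily change rate below are assumed figures for illustration:

```python
# Assumed figures: a 10 TB data set, 2% of it changing per day,
# protected daily for 30 days.
full_tb = 10.0
daily_change = 0.02
days = 30

# Strategy 1: a full backup every day.
full_every_day = full_tb * days

# Strategy 2: one full backup, then daily incrementals of changed data.
full_plus_incrementals = full_tb + full_tb * daily_change * (days - 1)

print(f"daily fulls:         {full_every_day:.1f} TB transferred")
print(f"full + incrementals: {full_plus_incrementals:.1f} TB transferred")
```

Under these assumptions the incremental schedule moves roughly 16 TB in a month instead of 300 TB, which is exactly the headroom that tempts teams into backing up more often than the business value of the data justifies.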

Backup vs. archive

One thing to consider is the difference between backup and archive. Backups are about returning data to a recent point in the past, and the restored data from a backup still has current business value. The need for backups is driven by the business risk of losing data. Data backup and restoration is a relatively frequent activity that needs to happen fast, as business operations are delayed until the restore is complete.

Archives are used to see the state of the business at some distant point in time. The restored data in an archive is no longer directly relevant to the business. The need for archives is driven by regulatory compliance. Restore from the archive is far less common, and can have longer lead times, as immediate business operation is not dependent on the restore.

The result of these different requirements is that backups are stored on disks and archives are most often stored on tape or on cloud-based object storage. Data in archives is trapped, while backup data is more available to deliver immediate business value.

Scanning your backups

It is also worth identifying which storage characteristics are beneficial for data backup and restoration. Backups are generally sequential and write-intensive. Restores are sequential and read-intensive.

Backup storage is usually optimized for storing a lot of data and for sequential access. Production primary storage is usually smaller and more optimized for random access. If we use backup storage for periodic scanning and sequential access tasks, then we can offload it from the primary storage. The result is better performance of the primary storage.

One example of scanning is locating personally identifiable information that is stored with specific compliance requirements, and checking that we aren't storing credit card numbers on systems that are not compliant with the Payment Card Industry Data Security Standard.

Remediation still needs to happen on the primary storage, but we can offload the scanning to secondary backup storage.
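A minimal sketch of that kind of scan, assuming card numbers appear as bare 16-digit strings (a real scanner would also handle separators and other card lengths): a regular expression finds candidates, and a Luhn checksum discards most false positives.

```python
import re

def luhn_ok(number):
    # Luhn checksum: doubles every second digit from the right and
    # checks the digit sum is divisible by 10.
    digits = [int(d) for d in number][::-1]
    total = sum(digits[::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

CARD_RE = re.compile(r"\b\d{16}\b")

def scan_for_cards(text):
    # Run this against backup copies, not primary storage, so scanning
    # I/O does not compete with production workloads.
    return [m for m in CARD_RE.findall(text) if luhn_ok(m)]

sample = "order 1234567812345678 paid with 4532015112830366"
print(scan_for_cards(sample))  # only the Luhn-valid candidate survives
```

The second number in the sample is a commonly used Luhn-valid test value; the first is a random 16-digit string that the checksum rejects.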

DevOps from backups

Over the last few years, a new generation of data backup and restoration products has appeared that uses solid-state, as well as hard disk drives. This hybrid backup storage offers excellent performance for random access to the data in solid-state.

The result is that these backup stores can be used for test and development activities. It may be as simple as standing up a copy of the production environment to test a new software version before deploying it to production. It may be integrated into a continuous integration and deployment pipeline so that new software versions developed in-house are tested with an exact and up-to-date copy of production before deployment. A full copy of production is a great place to do functional testing in a DevOps environment.

Next Steps

Is a cloud backup strategy right for you?

Flash-based backups become a serious contender

Emerging data backup technologies


Read more »

Before you back up to the cloud, ask these big questions
Posted by Thang Le Toan on 21 July 2017 02:16 AM

These five questions for providers will get you off on the right foot to backing up data to the cloud. Some considerations, like cost, may seem easy, but that's not the case.

Using cloud storage as a backup destination is increasingly popular, and it can be a great way to have infinite off-site backup capacity. But before you leap in and back up to the cloud -- resolving your immediate backup capacity problem -- it is worth asking a few questions.

We've used Amazon Web Services (AWS) when discussing pricing and other issues, as it is the largest and best understood cloud provider. There are many other cloud providers that have backup services built on top of AWS.

What will it cost?

In theory, the storage capacity associated with backing up to the cloud and to an archive is close to free. The reality is that capacity is nowhere near as close to free as we might like.

AWS Glacier -- the cheapest option -- currently lists at $0.004 per GB, per month, so every 100 TB of archive data you store will cost approximately $5,000 per year. Bear in mind that archives required for compliance reasons accumulate quickly. If you store a new 100 TB archive on Glacier every month for three years, you spend more than $250,000 just for storage. Keep it up, and in the sixth year, you will spend more than $300,000 to store your 7 petabytes of archives.

There will also be costs to transfer the data. Inbound data is usually free, but outbound data will cost you. On AWS, the transfer into Glacier is free, but when you restore one of your 100 TB archives back to your data center, the download will cost around another $10,000.
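The figures above follow directly from the listed price, as a quick check shows (decimal gigabytes assumed, free-tier allowances ignored):

```python
# Reproduce the article's arithmetic with AWS Glacier's listed price.
price_per_gb_month = 0.004   # USD per GB per month
archive_gb = 100_000         # one 100 TB archive, in decimal GB

# Storing one 100 TB archive for a year.
one_archive_year = archive_gb * price_per_gb_month * 12

def cumulative_storage_cost(months):
    # A new 100 TB archive lands every month, so month m pays for
    # m archives' worth of storage.
    return sum(m * archive_gb * price_per_gb_month for m in range(1, months + 1))

three_years = cumulative_storage_cost(36)
year_six = cumulative_storage_cost(72) - cumulative_storage_cost(60)

print(f"one archive, one year: ${one_archive_year:,.0f}")
print(f"first three years:     ${three_years:,.0f}")
print(f"year six alone:        ${year_six:,.0f}")
```

The cumulative bill grows quadratically while archives accumulate, which is why "close to free" storage quietly becomes a six-figure annual line item.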

How long will it take?

There are two parts to the how-long question: how long the transfers will take, and how long it will take for restores to start. The good news is that most cloud backup systems can transfer data at tens of gigabytes per second. Your throughput will be limited by your network connection. You will need to analyze your current backups to identify the volume of data you need to transfer, and then calculate the speed of the network connection required.

More significant is the latency factor from the time you request a restore until the data starts to flow. Glacier is a cheap place to store data, but it can take as long as 12 hours to start a restore. After the start time, you will need to wait for the actual data transfer, which will take as long as the backup took to transfer. If you have a recovery time objective of six hours, for example, using Glacier to back up to the cloud might not be the best option.
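A rough worked example, assuming a 100 TB restore over a dedicated 10 Gbit/s link plus Glacier's worst-case retrieval delay, shows how quickly a recovery time objective can be blown:

```python
# Assumed figures for illustration.
size_tb = 100            # data to restore
link_gbps = 10           # dedicated network link, gigabits per second
retrieval_delay_h = 12   # Glacier worst-case time before data starts flowing

# 1 TB = 8,000 gigabits (decimal); divide by link rate, convert to hours.
transfer_hours = (size_tb * 8_000) / (link_gbps * 3600)
total_hours = retrieval_delay_h + transfer_hours

rto_hours = 6
print(f"transfer alone:  {transfer_hours:.1f} h")
print(f"with retrieval:  {total_hours:.1f} h")
print(f"meets 6 h RTO:   {total_hours <= rto_hours}")
```

Even before the retrieval delay, the transfer alone exceeds the six-hour objective; the latency only makes a bad answer worse.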

Is it safe?

If you are using a smaller cloud provider to back up to the cloud, you should review its physical and network security.

Data security would be your next area of concern. Backup data is usually transferred over an encrypted tunnel, most often a Secure Sockets Layer connection.

The data is also encrypted at rest, but who holds the encryption keys? Anyone gaining access to the encrypted data and the encryption keys can decrypt your backups. Can you manage your own keys? If the provider manages the keys, how are they stored? Is the key store as highly available as the object store, where the encrypted data resides? The last thing you want is to be unable to restore because the data center with the key store is down.

Is it good for DR?

Do you plan to back up to the cloud for disaster recovery (DR) purposes? A DR event implies that your data center is not available and that you need to restore whole systems. Does the backup allow you to restore into virtual machines (VMs) on the cloud or onto new servers in a different data center? What do you need to have in place before the restore can start: bare metal or an installed OS?

You should also work through what happens if you restore into VMs on the cloud, and then want to move production back to your own data center.

What about my compliance?

Often, an archive is about regulatory compliance and requires a guaranteed unmodified state of your infrastructure at some point in the past. If you plan to use cloud backup as an archive store, check whether you can make certain backup points read-only. Also look at whether there is any automated lifecycle for these compliance archives. Having archive points deleted automatically when they are past their required retention time will help keep the storage bills under control and limit your e-discovery liability.
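On Amazon S3, for example, such automated expiration can be expressed as a bucket lifecycle rule. The prefix and the roughly seven-year retention below are hypothetical values for illustration:

```json
{
  "Rules": [
    {
      "ID": "expire-compliance-archives",
      "Filter": { "Prefix": "archives/" },
      "Status": "Enabled",
      "Expiration": { "Days": 2555 }
    }
  ]
}
```

Once the rule is attached to the bucket, objects under the prefix are deleted automatically when they age past the retention window, with no operator action required.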

Next Steps

Recognize the importance of a strategy for cloud backup

Guide to cloud backup and disaster recovery

Options and limitations of data backup in the cloud



Read more »

OCI 1.0 container image spec finds common ground among open source foes
Posted by Thang Le Toan on 21 July 2017 02:11 AM

Touted as the USB interface of container management, OCI 1.0 will ensure consistency at the lowest levels of infrastructure, and push the container wars battlefront up the stack.

The container wars rage on, but a ceasefire two years in the making will standardize the most basic container components.

Most enterprises don't muck around deep in Linux container plumbing. Still, agreement on a standard container image format and runtime among open source container management software vendors, such as IBM, Red Hat, Docker, Google and CoreOS, is crucial for the technology to be viable in the long run. Today, that consensus was finalized with version 1.0 of the Open Container Initiative (OCI) standard.


"Just the fact that vendors are agreeing is a good thing," said Fintan Ryan, analyst at RedMonk, based in Portland, Maine. "Without this agreement, containers wouldn't be usable by enterprises."

The specification began when Docker donated its runC utility to the Linux Foundation in 2015, which touched off the OCI project; the donation of Docker's image format followed. Over the past two years, the community's work has focused on standardizing those components across the Windows, Solaris and Linux operating systems, as well as multiple processor families, from x86 to IBM mainframes.
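The image format OCI standardizes is built on content addressing: a manifest references its configuration and layer blobs by the SHA-256 digest of their exact bytes, so any registry or runtime can verify what it fetched. A minimal sketch of that digest scheme follows; the config dict is illustrative and truncated, not a complete OCI image config:

```python
import hashlib
import json

# An illustrative, truncated image config blob; a real OCI config has more fields.
config = {"architecture": "amd64", "os": "linux"}

# The blob is serialized once, then named by the digest of those exact bytes.
blob = json.dumps(config, separators=(",", ":")).encode("utf-8")
descriptor = {
    "mediaType": "application/vnd.oci.image.config.v1+json",
    "digest": "sha256:" + hashlib.sha256(blob).hexdigest(),
    "size": len(blob),
}
print(descriptor["digest"])
```

Because the name is derived from the content, any modification to the blob changes its digest, which is what makes images verifiable across tools from different vendors.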

Docker expects OCI to be adopted by the Cloud Native Computing Foundation (CNCF), which also governs its containerd daemon and Docker container orchestration rival Kubernetes. Even CoreOS' rkt containers support the OCI standard for runtime and image format, after the company deprecated its appc utility in favor of the Linux Foundation standard.

OCI 1.0 also officially laid to rest last year's buzz around the possibility of a Docker fork.


The next step for OCI will be a battery of tests and a certification process that can be used to designate higher-level open source container management software products as "OCI Compatible." Currently, any such label claimed by a vendor is a misnomer, according to Docker officials. OCI also doesn't cover interoperability and portability between various systems, such as between Linux and Windows OSes or between multiple container orchestration tools. This standard's ratification does not mean containers are portable across operating systems, though it lays the groundwork for that development in the future.

Enterprise IT consultants compare OCI 1.0 to the USB standard in consumer technology.

"I can develop a platform against this spec and it should just work," said Chris Riley, director of solutions architecture at cPrime Inc., an Agile software development consulting firm in San Francisco. "This provides an interface everybody can agree on."

Container images and runtimes: A minor treaty amid the container battles

Version 1.0 of the Open Container Initiative standard is an important milestone in container maturity but covers less than 5% of the Docker codebase. Meanwhile, the open source container management software community remains divided along several other technical lines, such as Container Runtime Interface using Open Container Initiative runtimes (CRI-O) versus containerd, or Kubernetes versus Docker swarm mode. Industry watchers don't expect further détente akin to OCI in these areas, but say it probably won't be necessary, either.

"The marketplace will speak and define the next abstraction," Riley said. "There's also always going to be monitoring, orchestration and routing tools, but that's where the vendors will probably say, 'Let us figure out our own best way to do that.'"

While the Kubernetes 1.7 release lays the groundwork for the popular container orchestration platform to support CRI-O, readers of industry tea leaves predict containerd will establish itself as the de facto standard there.

"CRI-O seems to be something no one actually uses, but there's been community effort around it," said Gary Chen, an analyst at IDC, the Framingham, Mass., research firm. Its continued development, like that of CoreOS rkt, shows this is still a market where people like to have lots of alternatives, he said.

Once OCI testing and certification processes are established, work will progress more slowly. It's too soon to say what tack future OCI work will take, but container image distribution and signing could potentially be areas of focus, according to Docker reps.


A CIO's guide to cloud computing investments
Posted by Thang Le Toan on 13 July 2017 01:40 AM

More enterprises are reaching for the cloud to spur innovation and growth. In this SearchCIO Essential Guide, learn how to maximize the business benefits of your cloud computing investments.


With cloud computing in use by businesses across all industries and acting as a digital transformation enabler, CIOs must constantly review their cloud computing investments. They will indeed be spending more money on the cloud: Our annual IT priorities survey of 971 North American IT professionals found that cloud services will continue to attract substantial attention in 2017, with 64% of respondents planning to increase cloud spending. And according to market analyst firm IDC, worldwide spending on public cloud services and infrastructure is estimated to grow at a 21.5% compound annual growth rate -- nearly seven times the growth rate of total IT spending -- to reach $203.4 billion in 2020.
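IDC's projection is a compound annual growth rate calculation: spending after n years equals the base-year spend multiplied by (1 + rate)^n. A quick sketch with illustrative numbers (the $100 billion base here is hypothetical, not IDC's actual base figure):

```python
def project(base, cagr, years):
    """Project spending forward at a compound annual growth rate."""
    return base * (1 + cagr) ** years

# Illustrative: $100B growing at IDC's 21.5% CAGR over four years
# more than doubles, to roughly $217.9B.
print(round(project(100.0, 0.215, 4), 1))
# → 217.9
```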

This Essential Guide is designed to help delve into the many facets of cloud computing investments, including cloud innovation, security, storage, application development and service provider management.

Cloud and innovation

Driving innovation via the cloud

Cloud computing investments are no longer just about lowering costs and improving efficiency. Cloud-based technology fosters greater collaboration across the company, and cloud-based IT services call for fewer resources, freeing organizations to invest in other business processes and innovations. This section highlights the best practices to gain cloud visibility, the role that crowdsourcing could play in the cloud and why cloud robotics could be a game changer for artificial intelligence (AI).

Securing cloud data

Assessing key risks of cloud computing investments

Moving data to the cloud comes with its fair share of risks. How safe is your data in the cloud? Are you equipped to handle a cloud failure or breach? In this section, learn about the latest tools and strategies to counteract cloud threats and protect cloud data. Also, learn about new cloud security roles, risk management best practices and steps to cultivate a cloud-security culture.


Is it time to hire a chief cloud security officer?

With hackers running rampant, cloud providers are looking for a new breed of CISO -- the chief cloud security officer. Security experts sound off on the skills needed for this new cloud security role. Continue Reading


Chief cloud security officer: A new role focused on security

Cloud computing is giving rise to jobs that never before existed. Scott Weller, CTO at Boston startup SessionM, explains how one such new cloud role -- chief cloud security officer -- can help mitigate possible attack vectors in the cloud. Continue Reading

Blog Post

Custom applications in the cloud pose new security threat

Research shows that custom applications in the cloud are running more critical business functions than ever, but IT security is aware of only a fraction of them. Continue Reading


For businesses, cloud security still a concern

Security experts and executives provide insights on strategies and tools that can help counteract threats to cloud data security. Continue Reading


Corporate cloud customers on front lines of data loss prevention

In this tip, information security and risk management expert Jeff Jenkins discusses the steps cloud vendors and their corporate customers must take to protect cloud data and mitigate security concerns. Continue Reading


Compliance, data security top reasons for cloud encryption

Securosis founder Rich Mogull discusses different cloud computing models and the main reasons for cloud encryption. Continue Reading


Building a security culture is an essential first step to cloud strategy success

In an era where more data is being moved to the cloud, a report shows that engaging the CISO and cultivating a cloud security culture is imperative. Continue Reading


Cloud storage

Tips to maximize your cloud-based storage

Will your business benefit from using object storage systems rather than file storage systems when storing data in the cloud? Read the articles in this section to learn which cloud storage tools and applications are best for your organization, why hybrid cloud could be the future of cloud storage and why the OpenStack platform could be gaining ground.


Collier: The OpenStack platform has gained new life

At the OpenStack Summit in Boston, OpenStack Foundation COO Mark Collier explained why the nearly seven-year-old open source software platform is far from dead. Continue Reading


OpenStack public cloud adoption faster outside of U.S.

Providers in the Asian and European markets are turning to OpenStack to run their public clouds. Forrester analyst Lauren Nelson discusses why it's not happening stateside. Continue Reading


Cloud applications: Object storage vs. file storage

In this tip, Marc Staimer, founder and senior analyst at Dragon Slayer Consulting, discusses the various advantages of object storage systems over file systems when it comes to storing cloud data. Continue Reading


Cloud storage FAQs

In this video, storage analyst George Crump discusses where cloud storage is most useful, whether object storage is necessary to gain the benefits of the cloud, how to integrate cloud into your environment and the benefits of multicloud environments. Continue Reading


Why hybrid cloud is driving the future of cloud storage

The appeal of cloud storage services has forced storage hardware and software vendors to provide ways for customers to access data externally, in a hybrid cloud model. Find out about the options involved. Continue Reading


Maximizing your cloud-based storage

Analyst and consultant Mike Matchett provides tips to get the most from cloud-based storage devices -- and explains why a company's data storage manager can be of help. Continue Reading

Cloud data management

Cloud-based analytics: Understanding benefits and risks

Cloud data management provides benefits like accelerated technology deployments and reduced costs associated with both system maintenance and capital expenditure. Read the articles in this section to understand the benefits as well as the risks associated with cloud-based business analytics and how data center best practices translate to the cloud.


Cloud-based technologies to empower digital business

Law firm Mayer Brown predicts more companies will adopt cloud services to assist their digital business efforts and to reduce costs. Continue Reading


What CIOs can learn from the Amazon cloud outage

In the wake of the Amazon Web Services Simple Storage Service outage, industry watchers offer pointers on how CIOs should react to and prepare for future disruption. Continue Reading


Cloud-based BI risks and best practices

In this tip, data management consultant David Loshin identifies the risks associated with cloud-based data warehousing and business analytics and explains how to protect against them. Continue Reading


Adoption levels of managing and analyzing data in the cloud remain low

Find out why some businesses are enthusiastic, while some are hesitant when it comes to doing big data analytics and BI in the cloud. Continue Reading


Deployment models

Public vs. private vs. hybrid cloud: What's best for your company?

Are you cloud-ready? Which cloud model is best for you? Whether you choose public, private or hybrid cloud, knowing your options and business needs is vital to cloud success. This section highlights the good and the bad aspects of a multicloud strategy, some basic cloud considerations and a cloud comparison to help you decide which model best suits your organization's needs.


Integration key to multicloud management

What makes a company multicloud? Judith Hurwitz, CEO of Needham, Mass., consulting outfit Hurwitz & Associates, explains it's about integrating, connecting and optimizing your on-premises and cloud IT operations to perform as one unit. Continue Reading

Blog Post

The pros and cons of a multicloud strategy

With multicloud strategies quickly gaining popularity among organizations, analysts at market research outfits Gartner and IDC weigh in on its benefits and drawbacks. Continue Reading


When thinking of creating a private cloud, start with why

A private cloud requires lots of money and the staff skills to pull it off. Before you get too far along the road to private cloud, you need to seriously consider the value it would bring and whether public cloud could suffice. Continue Reading


Evaluating the cost of private cloud development

A Forrester Research study found that public cloud business is booming. But what does the future of private cloud look like? Find out the challenges and high costs associated with building a private cloud. Continue Reading


Strategic cloud decisions

How to maximize your cloud benefits

Among the many benefits of the cloud, inventory management helps reduce costs and security risks, and negotiating a sound cloud contract helps mitigate cloud-related risks. With more organizations making cloud computing investments, this section highlights best practices for cloud inventory management, metrics that help maximize the return on those investments, and the factors to consider before negotiating the terms and conditions of a cloud contract.


CIO's role in cloud inventory management

For effective cloud inventory management, a CIO's first step is to build better relationships with the company's decision-makers. Continue Reading


Employ better metrics to fine-tune your cloud-first strategy

Determining the benefits of a cloud-first strategy is difficult in large, complex IT environments. But there are metrics available to help CIOs. Continue Reading


Three cloud roles that can help CIOs drive business success

In a recent webinar, Gartner analyst Donna Scott detailed the three cloud roles that CIOs need to fill in order to gain cloud benefits: one for forming strategy, one for implementing that strategy and one for budgeting. Continue Reading


Software-as-a-service CIO seeks more than colocation from cloud and managed services

Read how one company weighed factors in the decision to move from colocation to cloud and managed services. Continue Reading


CIO peer review essential in cloud contract negotiations

CIOs and CTOs weigh in on what to consider during cloud contract negotiations, including cloud provider liabilities, cloud exit strategies and the importance of CIO peer review. Continue Reading




Expand your cloud vocabulary

Before you start making cloud computing investments, get a handle on basic service-related cloud computing terms. Let this glossary be your guide.

