News
Nov
6
VeloCloud-VMware acquisition will battle Cisco in the branch
Posted by Thang Le Toan on 06 November 2017 11:36 PM

The VeloCloud-VMware acquisition will mark the first time VMware will compete directly with Cisco in networking. Cisco, however, remains the 800-pound gorilla.

VMware plans to acquire SD-WAN vendor VeloCloud Networks, a move that would turn the branch office into a battleground for the virtualization provider and Cisco.

The VeloCloud-VMware acquisition, announced this week, is expected to close by early February. With VeloCloud, VMware would go head-to-head against Cisco's Viptela, IWAN and Meraki brands. SD-WAN, in general, intelligently routes branch traffic across multiple links, such as broadband, MPLS and LTE.

"This is the first time that Cisco and VMware will directly compete in the networking world," said Shamus McGillicuddy, an analyst at Enterprise Management Associates, based in Boulder, Colo.

Before, the closest Cisco and VMware came to competing in networking was with their software-defined networking platforms ACI and NSX, respectively. The products, however, serve mostly different purposes in the data center. NSX provisions network services within VMware's virtualized computing environments while ACI distributes application-centric policies to Cisco switches.

VMware SDN marches to the branch

The VeloCloud-VMware acquisition, however, marks the start of taking NSX to the branch, where Cisco is already offering ACI. Both vendors are also working on extending their respective SDN platforms to enterprise software running on public clouds.

In the branch, VMware plans to provide SD-WAN, security, routing and other services on an NSX-based network overlay that's hardware agnostic. Rather than supply branch appliances for VeloCloud software, VMware wants customers to buy certified hardware from different vendors.


"That is certainly our longer-term vision for this. That it will be a pure software play," said Rajiv Ramaswami, COO of cloud services at VMware, during a conference call with reporters and analysts.

In the short term, VMware would support appliances sold by VeloCloud, Ramaswami said. VMware's parent company, Dell EMC, also sells hardware for VeloCloud software.

While VMware shies away from hardware, Cisco has delivered centralized software that provisions network services to the branch through a new line of routers, called the Catalyst 9000s. In the future, Cisco could also provide a software-only option through the Enterprise Network Functions Virtualization (ENFV) platform the company introduced last year. ENFV would run on Cisco servers or third-party certified hardware.

"Cisco is making multiple bets in SD-WAN," McGillicuddy said.

Cloud orchestration a key piece of VeloCloud-VMware acquisition

VMware is banking on VeloCloud's cloud-based network orchestration tools to evolve into a significant differentiator from Cisco and other WAN infrastructure providers. VMware could eventually use the technology to orchestrate network services in the branch and the cloud, Ramaswami said.


VMware's ambitions do not alter the fact that it has a difficult road ahead battling Cisco. The latter company dominates the networking market with more than 150,000 paying customers for its WAN products, according to Gartner. VMware is the largest supplier of data center virtualization, but is a newbie in networking.

VeloCloud's roughly 1,000 customers include service providers, as well as enterprises. AT&T, Deutsche Telekom, Sprint, Vonage and Windstream are examples of carriers that offer the company's SD-WAN product as a service.

VMware sells network virtualization software to service providers and expects VeloCloud to help grow that relatively small business. "VeloCloud and their deep relationship with the service provider community is a huge route to a market accelerator," said Peder Ulander, a vice president of strategy at VMware.





Oct
6
What are Telehealth and a Telehealth Platform?
Posted by Thang Le Toan on 06 October 2017 11:59 PM

Telehealth is the transmission of health-related services or information over the telecommunications infrastructure. The term covers both telemedicine, which includes remote patient monitoring, and non-clinical elements of the healthcare system, such as education.

 

Telehealth examinations can be performed by physicians, nurses or other healthcare professionals over a videoconference connection to answer a patient's specific question about their condition. A telehealth visit can also be a remote substitute for a regular physician exam or as a follow-up visit to a previous care episode.

Convenience, for both sides of the care equation, is one of the major benefits of telehealth. Patients can communicate with physicians from their homes, or the patient can travel to a nearby public telehealth kiosk where a physician can conduct a thorough inspection of the patient's well-being.

In the United States, differences in state telemedicine licensure laws complicate the practice of telehealth. Some states require physicians to have full medical licenses to be able to practice telemedicine, while other states mandate physicians have special telemedicine licenses. Medicare and Medicaid reimbursement for telehealth services, such as remote checkups, has slowly been catching up to the level of in-person healthcare and the majority of states provide some amount of financial reimbursement to providers who perform telehealth visits.

The American Medical Association is one of the major healthcare groups that called for standards to be applied to telehealth to give patients more access to remote care services. The American Telemedicine Association, established in 1993, promotes the delivery of care through remote means and hosts a yearly conference on the latest news and developments in telehealth. The U.S. Department of Veterans Affairs (VA) also supports the development of telehealth. A bill introduced in Congress in 2015 would allow qualified VA health professionals to treat U.S. veterans without requiring the patient and physician to be in the same state.

 

What is the difference between telehealth and telemedicine or are the terms always used interchangeably?





Jul
29

Books are humanity's priceless store of knowledge. Make the most of them, then, as the key that opens the door to success.

 

When Satya Nadella took over as CEO of Microsoft in early 2014, he declared that he would lift the whole company into a new era of breakthrough growth, and what he has achieved so far shows that Nadella is on the right course along the path he laid out.

The crux of what Nadella is pushing is the "growth mindset," which emphasizes the importance of learning from one another and from one's own earlier mistakes, so as to keep strengthening, developing and perfecting oneself.

In an interview with Bloomberg's Dina Bass, Nadella shared that "Mindset: The New Psychology of Success," published in 2006 by Stanford psychology professor Carol Dweck, helped him greatly in shaping his instincts and character, as well as the growth-mindset principles he has ultimately adopted as a core standard at Microsoft.

Satya Nadella talking with employees

Here is what Nadella shared:

"There is a fairly simple point Professor Carol Dweck makes: if you take two people, one a constant learner and the other a know-it-all, the learner will always come out ahead of the other in the long run, even if they start out showing less obvious ability."

Dweck's book digs into the deeper aspects of this idea. Some people, it notes, carry a "fixed" mindset, believing their talent is innate, so that trying harder is seen as wasted effort. Others have a growth mindset, believing that anything can be worked out with dedication and hard work.

"It is not always the smartest people who end up the best," Dweck writes in the book.

This philosophy has indeed had a very positive effect on Microsoft's growth, going on to shape how the company runs its computer business after its failure in the smartphone market. Nadella has since weighed further options for the future, including angles involving the awkward rival Linux and the challenges that come with it.

Professor Carol Dweck (Stanford)

As Nadella told Bloomberg: "I need to clear my head and ask myself, 'Was there a moment when I was too rigid, or a moment when I did not live up to the standard I set?' If I can truly master that, I believe the company will achieve even more brilliant results, just as everyone expects."

Microsoft co-founder Bill Gates is also an admiring reader of Carol Dweck's philosophy in the "Mindset" book mentioned above.

Nadella's comments and endorsements helped push "Mindset" to the top of Amazon's best-seller lists, even though the book had been out for six years.

As for the book's content, Dweck explains why talent is not the only factor that determines success; what matters is how we form our thinking and outlook around it. She stresses that overweighting the role of initial ability does nothing to encourage a growth mindset and can even backfire, holding back achievement, much as when children are brought up on praise that merely feeds their self-esteem. With the right motivation and the right methods, positive results will follow. The core message Dweck wants to leave with readers is how much a single mindset, built on willpower and perseverance and long practiced by many famous people around the world, can change the outcome of an entire process.

Loosely translated from a TechInsider article.





Jul
21
Veeam revenue benefits from more $1 million deals
Posted by Thang Le Toan on 21 July 2017 02:26 AM

Veeam Software today reported 27% year-over-year growth in total bookings and 53% year-over-year growth in deals of more than $100,000 during the second quarter of this year, fueled by larger customers and cloud backup.

The Veeam revenue spike came from 13,000 new customers added last quarter; the company finished June with 255,000 customers worldwide. Veeam claims it is adding an average of 4,000 new customers each month.

“We are continuing to see growth in total customers,” CEO Peter McKay said. “All the markets are growing (but) for us the enterprise market is the one that is growing the fastest for us. Two years ago, we expanded our market into the enterprise. That is something we have paid a lot of attention to.

 

“The deals are quite big. We have done more million-dollar deals this year than we did all of last year.”

A privately held company, Veeam does not disclose a specific breakdown of its figures every quarter but last January it reported numbers for the overall 2016 year. Veeam hit $607.4 million in bookings in 2016, which included new license sales and maintenance revenue, compared to $474 million in 2015.

At the VeeamON user conference in May, McKay put the Veeam revenue goals at $1 billion by 2018 and $1.5 billion by 2020.

Veeam started out in 2006 as a virtual machine backup vendor and now focuses on the cloud. McKay said it is finally seeing enterprises move their data protection to the cloud after a long reluctance to do so.

“Ten or 15 years ago, any opportunity was about virtualization,” McKay said. “Today, it’s about having the ability to move applications to the cloud. It’s definitely a hybrid cloud story (for enterprise customers). They want to be able to move applications to the cloud when they are ready. It’s more of a cloud readiness thing that we are seeing. They want to go at their own pace.”

The bulk of Veeam's growth last quarter came from its flagship Availability Suite. The suite handles backup, restores and replication through Veeam Backup & Replication, along with monitoring, reporting and capacity planning in Veeam ONE, for VMware vSphere and Microsoft Hyper-V deployments.

The company also reported that its Veeam Cloud & Service Provider (VCSP) program, which offers disaster recovery as a service (DRaaS) and backup as a service (BaaS), generated 79% year-over-year growth in 2016. License bookings grew 57% annually from enterprise-level customers.

This last quarter marked the first full quarter of Veeam’s HPE partnership. Veeam Software is integrated with HPE 3PAR StoreServ, HPE StoreVirtual and HPE StoreOnce for data and application availability and monitoring.

Veeam last month also said it would add support for Nutanix AHV, a KVM-based hypervisor, to the Veeam Availability Suite later in 2017. The company also has a strategic relationship with Pure Storage, integrating Veeam Backup & Replication with Pure's snapshots, and it provides backup for the IBM Bluemix cloud computing platform.

In May, the vendor added Veeam Availability for Amazon Web Services (AWS) for enterprises that want to move to multi-cloud or hybrid cloud environments via agentless backup and recovery of AWS instances. The offering works with N2W Software's Cloud Protection Manager so enterprises can copy data from AWS to a Veeam-hosted repository for backups and cross-platform disaster recovery.





Jul
21
Evaluate data backup and restoration efforts to gain business value
Posted by Thang Le Toan on 21 July 2017 02:25 AM

In many organizations, data backups are performed at a much higher rate than restores. Fewer restores are never a bad thing, but is there a way to back up data more efficiently?

Backups are a necessary, if tedious, component of storing data. We back up data regularly -- daily or even multiple times per day -- knowing we can restore it when the inevitable happens. But how often do we restore? When it comes to data backup and restoration, what is the ratio of backup volume to restore volume?


Because data loss is a common concern, it's likely that organizations transfer hundreds of times more data to backup than they end up restoring. Surely there must be some way to get more business value from this effort. Alternatively, is there a way to reduce the effort to achieve the current business value?

Layers of backups

The usual approach to data backup and restoration is to back up an entire computer. However, this may not be the most efficient way to protect applications and their data.

Many applications in our data centers have their own recovery strategies. If we use these built-in features, then we may not need so many backups.

Microsoft Windows file shares have a previous version functionality that enables user self-service restores for deleted files. Database applications use logs to enable point-in-time recovery from a less recent backup. If we are aware of these layers of protection, we can adjust our backups to be more efficient and, potentially, less frequent.

How often and how much do you back up?

The more frequently you back up, the more granular your restore; however, you also transfer more data and require more space for these additional backups.

Most modern data backup applications do not create full backups every time; they make one full data copy and follow that with incremental backups to minimize the data transferred. The efficiency of moving from full copies to incremental has enabled more frequent backups, possibly without regard for the value of these backups. If we understand the business value of the application, we may be able to reduce the frequency of the backups based on the business risk of data loss.
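The arithmetic behind that trade-off is easy to sketch. The minimal model below, with a hypothetical dataset size and change rate, compares the data moved per week by a full-every-time schedule against one full copy followed by incrementals:

```python
# Rough model of weekly data transferred by two backup schedules.
# All figures are hypothetical, chosen only to illustrate the ratio.
DATASET_GB = 1000        # size of the protected dataset
CHANGE_RATE = 0.05       # fraction of data changed between backups
BACKUPS_PER_WEEK = 7     # one backup per day

# Schedule 1: a full copy every time.
full_every_time = DATASET_GB * BACKUPS_PER_WEEK

# Schedule 2: one full copy, then incrementals of only the changed data.
full_plus_incrementals = DATASET_GB + DATASET_GB * CHANGE_RATE * (BACKUPS_PER_WEEK - 1)

print(f"Full every time:     {full_every_time:,.0f} GB/week")
print(f"Full + incrementals: {full_plus_incrementals:,.0f} GB/week")
```

With these sample numbers the incremental schedule moves less than a fifth of the data, which is exactly the efficiency that tempts teams into backing up more often than the business value of the data justifies.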

Backup vs. archive

One thing to consider is the difference between backup and archive. Backups are about returning data to a recent point in the past, and the restored data from a backup still has current business value. The need for backups is driven by the business risk of losing data. Data backup and restoration is a relatively frequent activity that needs to happen fast, as business operations are delayed until the restore is complete.


Archives are used to see the state of the business at some distant point in time. The restored data in an archive is no longer directly relevant to the business. The need for archives is driven by regulatory compliance. Restore from the archive is far less common, and can have longer lead times, as immediate business operation is not dependent on the restore.

The result of these different requirements is that backups are stored on disks and archives are most often stored on tape or on cloud-based object storage. Data in archives is trapped, while backup data is more available to deliver immediate business value.

Scanning your backups

It is also worth identifying which storage characteristics are beneficial for data backup and restoration. Backups are generally sequential and write-intensive. Restores are sequential and read-intensive.

Backup storage is usually optimized for storing a lot of data and for sequential access. Production primary storage is usually smaller and more optimized for random access. If we use backup storage for periodic scanning and sequential access tasks, then we can offload it from the primary storage. The result is better performance of the primary storage.

One example of scanning is locating personally identifiable information that is stored with specific compliance requirements, and checking that we aren't storing credit card numbers on systems that are not compliant with the Payment Card Industry Data Security Standard.

Remediation still needs to happen on the primary storage, but we can offload the scanning to secondary backup storage.
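As an illustration of the kind of scan that can be offloaded to backup storage, the sketch below looks for card-number-like strings and filters them with the Luhn checksum. It is a minimal example, not a PCI DSS-compliant scanner, and the sample input uses a well-known test card number:

```python
import re

def luhn_ok(digits: str) -> bool:
    """Luhn checksum, used to filter out random digit runs."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 13-16 digits, optionally separated by spaces or dashes.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def find_card_numbers(text: str) -> list:
    """Return candidate card numbers found in text, digits only."""
    hits = []
    for m in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            hits.append(digits)
    return hits

print(find_card_numbers("order ref 4111 1111 1111 1111, qty 12345"))
```

In practice the scan would walk files restored or mounted from the backup store, leaving primary storage untouched; only the remediation step touches production.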

DevOps from backups

Over the last few years, a new generation of data backup and restoration products has appeared that uses solid-state, as well as hard disk drives. This hybrid backup storage offers excellent performance for random access to the data in solid-state.


The result is that these backup stores can be used for test and development activities. It may be as simple as standing up a copy of the production environment to test a new software version before deploying it to production. It may be integrated into a continuous integration and deployment pipeline so that new software versions developed in-house are tested with an exact and up-to-date copy of production before deployment. A full copy of production is a great place to do functional testing in a DevOps environment.






Jul
21
Before you back up to the cloud, ask these big questions
Posted by Thang Le Toan on 21 July 2017 02:16 AM

These five questions for providers will get you off on the right foot to backing up data to the cloud. Some considerations, like cost, may seem easy, but that's not the case.

Using cloud storage as a backup destination is increasingly popular, and it can be a great way to have infinite off-site backup capacity. But before you leap in and back up to the cloud -- resolving your immediate backup capacity problem -- it is worth asking a few questions.


We've used Amazon Web Services (AWS) when discussing pricing and other issues, as it is the largest and best understood cloud provider. There are many other cloud providers that have backup services built on top of AWS.

What will it cost?

In theory, the storage capacity associated with backing up to the cloud and to an archive is close to free. The reality is that capacity is nowhere near as close to free as we might like.


AWS Glacier -- the cheapest option -- currently lists at $0.004 per GB, per month, so every 100 TB of archive data you store will cost approximately $5,000 per year. Bear in mind that archives required for compliance reasons accumulate quickly. If you store a new 100 TB archive on Glacier every month for three years, you spend more than $250,000 just for storage. Keep it up, and in the sixth year, you will spend more than $300,000 to store your 7 petabytes of archives.
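That arithmetic can be reproduced directly, treating 100 TB as 100,000 GB at the $0.004 per GB, per month list price cited above:

```python
PRICE_PER_GB_MONTH = 0.004   # AWS Glacier list price cited in the article
ARCHIVE_GB = 100_000         # one 100 TB archive, approximated as 100,000 GB

monthly_cost = ARCHIVE_GB * PRICE_PER_GB_MONTH          # $400/month per archive
print(f"One archive: ${monthly_cost * 12:,.0f}/year")

# Add a new 100 TB archive every month: in month m you are paying for m archives.
def cumulative_cost(months):
    return sum(m * monthly_cost for m in range(1, months + 1))

print(f"First 3 years: ${cumulative_cost(36):,.0f}")
year6 = cumulative_cost(72) - cumulative_cost(60)
print(f"Year 6 alone:  ${year6:,.0f}")
```

The three-year total comes out around $266,000 and year six alone around $319,000, matching the "more than $250,000" and "more than $300,000" figures in the text.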

There will also be costs to transfer the data. Inbound data is usually free, but outbound data will cost you. On AWS, the transfer into Glacier is free, but when you restore one of your 100 TB archives back to your data center, the download will cost around another $10,000.

How long will it take?

There are two parts to the how long question: How long the transfers will take and how long it will take for the restores to start. The good news is that most cloud backup systems can transfer data at tens of gigabytes per second. Your throughput will be limited by your network connection. You will need to analyze your current backups to identify the volume of data you need to transfer, and then calculate the speed of the network connection required.

More significant is the latency factor from the time you request a restore until the data starts to flow. Glacier is a cheap place to store data, but it can take as long as 12 hours to start a restore. After the start time, you will need to wait for the actual data transfer, which will take as long as the backup took to transfer. If you have a recovery time objective of six hours, for example, using Glacier to back up to the cloud might not be the best option.
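A quick way to test an option against a recovery time objective is to add the retrieval delay to the raw transfer time. The 20 TB restore volume and 1 Gbps link below are hypothetical figures for illustration:

```python
def restore_hours(data_tb, link_gbps, retrieval_delay_h=0.0):
    """Hours from restore request to last byte: retrieval delay + transfer time."""
    data_bits = data_tb * 1e12 * 8            # decimal TB -> bits
    transfer_s = data_bits / (link_gbps * 1e9)
    return retrieval_delay_h + transfer_s / 3600

# 20 TB restored over a 1 Gbps link, with up to 12 hours before Glacier starts.
hours = restore_hours(20, 1.0, retrieval_delay_h=12)
print(f"{hours:.1f} hours end to end")  # the transfer alone is ~44 hours
```

Even before Glacier's start-up delay, the transfer alone blows through a six-hour RTO, which is the point of running this calculation before committing to an archive tier.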

Is it safe?

If you are using a smaller cloud provider to back up to the cloud, you should review its physical and network security.

Data security would be your next area of concern. Backup data is usually transferred over an encrypted tunnel, most often a Secure Sockets Layer connection.

The data is also encrypted at rest, but who holds the encryption keys? Anyone gaining access to the encrypted data and the encryption keys can decrypt your backups. Can you manage your own keys? If the provider manages the keys, how are they stored? Is the key store as highly available as the object store, where the encrypted data resides? The last thing you want is to be unable to restore because the data center with the key store is down.

Is it good for DR?

Do you plan to back up to the cloud for disaster recovery (DR) purposes? A DR event implies that your data center is not available and that you need to restore whole systems. Does the backup allow you to restore into virtual machines (VMs) on the cloud or onto new servers in a different data center? What do you need to have in place before the restore can start: bare metal or an installed OS?

You should also work through what happens if you restore into VMs on the cloud, and then want to move production back to your own data center.

What about my compliance?

Often, an archive is about regulatory compliance and requires a guaranteed unmodified state of your infrastructure at some point in the past. If you plan to use cloud backup as an archive store, check whether you can make certain backup points read-only. Also look at whether there is any automated lifecycle for these compliance archives. Having archive points deleted automatically when they are past their required retention time will help keep the storage bills under control and limit your e-discovery liability.







Jul
21
OCI 1.0 container image spec finds common ground among open source foes
Posted by Thang Le Toan on 21 July 2017 02:11 AM

Touted as the USB interface of container management, OCI 1.0 will ensure consistency at the lowest levels of infrastructure, and push the container wars battlefront up the stack.

The container wars rage on, but a ceasefire two years in the making will standardize the most basic container components.

Most enterprises don't muck around deep in Linux container plumbing. Still, agreement on a standard container image format and runtime among open source container management software vendors, such as IBM, Red Hat, Docker, Google and CoreOS, is crucial for the technology to be viable in the long run. Today, that consensus was finalized with version 1.0 of the Open Container Initiative (OCI) standard.


"Just the fact that vendors are agreeing is a good thing," said Fintan Ryan, analyst at RedMonk, based in Portland, Maine. "Without this agreement, containers wouldn't be usable by enterprises."

The specification began when Docker donated its runC utility to the Linux Foundation in 2015, which kicked off the OCI project; Docker's image format followed. The community's work over the last two years has focused on standardizing those components for the Windows, Solaris and Linux operating systems, as well as multiple processor families, from x86 to IBM mainframes.
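To make the image-format side of that concrete, here is a minimal sketch of the kind of image configuration document the OCI image spec defines. The field names follow the spec, but the values, including the zeroed layer digest, are placeholders:

```python
import json

# Minimal OCI image configuration (media type
# application/vnd.oci.image.config.v1+json). The diff_id digest below
# is a zeroed placeholder, not a real layer hash.
image_config = {
    "architecture": "amd64",
    "os": "linux",
    "config": {
        "Env": ["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"],
        "Entrypoint": ["/app/server"],
    },
    "rootfs": {
        "type": "layers",
        "diff_ids": ["sha256:" + "0" * 64],
    },
}

print(json.dumps(image_config, indent=2))
```

Because every OCI-conformant runtime and registry agrees on this shape, an image built by one tool can be unpacked and run by another, which is the interoperability the 1.0 release locks in.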

Docker expects OCI to be adopted by the Cloud Native Computing Foundation (CNCF), which also governs its containerd daemon and Docker container orchestration rival Kubernetes. Even CoreOS' rkt containers support the OCI standard for runtime and image format, after the company deprecated its appc utility in favor of the Linux Foundation standard.

OCI 1.0 also officially laid to rest last year's buzz around the possibility of a Docker fork.


The next step for OCI will be a battery of tests and a certification process that can be used to designate higher-level open source container management software products as "OCI Compatible." Currently, any such label claimed by a vendor is a misnomer, according to Docker officials. OCI also doesn't cover interoperability and portability between various systems, such as between Linux and Windows OSes or between multiple container orchestration tools. This standard's ratification does not mean containers are portable across operating systems, though it lays the groundwork for that development in the future.

Enterprise IT consultants compare OCI 1.0 to the USB standard in consumer technology.

"I can develop a platform against this spec and it should just work," said Chris Riley, director of solutions architecture at cPrime Inc., an Agile software development consulting firm in San Francisco. "This provides an interface everybody can agree on."

Container images and runtimes: A minor treaty amid the container battles

Version 1.0 of the Open Container Initiative standard is an important milestone in container maturity but covers less than 5% of the Docker codebase. Meanwhile, the open source container management software community remains divided along several other technical lines, such as Container Runtime Interface using Open Container Initiative runtimes (CRI-O) versus containerd, or Kubernetes versus Docker swarm mode. Industry watchers don't expect further détente akin to OCI in these areas, but say it probably won't be necessary, either.

"The marketplace will speak and define the next abstraction," Riley said. "There's also always going to be monitoring, orchestration and routing tools, but that's where the vendors will probably say, 'Let us figure out our own best way to do that.'"

While the Kubernetes 1.7 release lays the groundwork for the popular container orchestration platform to support CRI-O, readers of industry tea leaves predict containerd will establish itself as the de facto standard there.

"CRI-O seems to be something no one actually uses, but there's been community effort around it," said Gary Chen, an analyst at IDC, the Framingham, Mass., research firm. Its continued development, like that of CoreOS rkt, shows this is still a market where people like to have lots of alternatives, he said.

Once OCI testing and certification processes are established, work will progress more slowly. It's too soon to say what tack future OCI work will take, but container image distribution and signing could potentially be areas of focus, according to Docker reps.


Read more »



Jul
13
A CIO's guide to cloud computing investments
Posted by Thang Le Toan on 13 July 2017 01:40 AM

More enterprises are reaching for the cloud to spur innovation and growth. In this SearchCIO Essential Guide, learn how to maximize the business benefits of your cloud computing investments.

Introduction

With cloud computing being used by businesses across all industries and acting as a digital transformation enabler, CIOs must constantly review their cloud computing investments. They will indeed be spending more money on it: Our annual IT priorities survey of 971 North American IT professionals found that cloud services will continue to attract substantial attention in 2017, with 64% planning to increase cloud spending. And according to market analyst firm IDC, worldwide spending on public cloud services and infrastructure is estimated to grow at a 21.5% compound annual growth rate -- nearly seven times the rate of total IT spending growth -- to reach $203.4 billion in 2020.
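The $203.4 billion figure is straightforward compound-growth arithmetic. As a back-of-envelope check (assuming four compounding years, 2016 to 2020, which the article does not state; the implied base is an illustration, not IDC's published baseline):

```python
# Back-of-envelope CAGR arithmetic for the IDC projection cited above.
# Assumes four compounding years (2016 -> 2020); the implied base is
# illustrative, not IDC's published 2016 figure.
cagr = 0.215
final_spend_bn = 203.4  # projected 2020 spend, $bn
years = 4

implied_base_bn = final_spend_bn / (1 + cagr) ** years
print(f"Implied base-year spend: ${implied_base_bn:.1f}bn")
```

Working backward this way is a quick sanity check on any CAGR claim: divide the end value by (1 + rate) raised to the number of compounding periods.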

This Essential Guide is designed to help you delve into the many facets of cloud computing investments, including cloud innovation, security, storage, application development and service provider management.

Cloud and innovation

Driving innovation via the cloud

Cloud computing investments are no longer just about lowering costs and improving efficiency. Cloud-based technology fosters greater collaboration across the company, and cloud-based IT services call for fewer resources, giving organizations the opportunity to invest in other business processes and innovations. This section highlights best practices to gain cloud visibility, the role crowdsourcing could play in the cloud and why cloud robotics could be a game changer for artificial intelligence (AI).

Securing cloud data

Assessing key risks of cloud computing investments

Moving data to the cloud comes with its fair share of risks. How safe is your data in the cloud? Are you equipped to handle a cloud failure or breach? In this section, learn about the latest tools and strategies to counteract cloud threats and protect cloud data. Also, learn about new cloud security roles, risk management best practices and steps to cultivate a cloud-security culture.

Feature

Is it time to hire a chief cloud security officer?

With hackers running rampant, cloud providers are looking for a new breed of CISO -- the chief cloud security officer. Security experts sound off on the skills needed for this new cloud security role. Continue Reading

Feature

Chief cloud security officer: A new role focused on security

Cloud computing is giving rise to jobs that never before existed. Scott Weller, CTO at Boston startup SessionM, explains how one such new cloud role -- chief cloud security officer -- can help mitigate possible attack vectors in the cloud. Continue Reading

Blog Post

Custom applications in the cloud pose new security threat

Research shows that custom applications in the cloud are running more critical business functions than ever, but IT security is aware of only a fraction of them. Continue Reading

Feature

For businesses, cloud security still a concern

Security experts and executives provide insights on strategies and tools that can help counteract threats to cloud data security. Continue Reading

Tip

Corporate cloud customers on front lines of data loss prevention

In this tip, information security and risk management expert Jeff Jenkins discusses the steps cloud vendors and their corporate customers must take to protect cloud data and mitigate security concerns. Continue Reading

Video

Compliance, data security top reasons for cloud encryption

Securosis founder Rich Mogull discusses different cloud computing models and the main reasons for cloud encryption. Continue Reading

News

Building a security culture is an essential first step to cloud strategy success

In an era where more data is being moved to the cloud, a report shows that engaging the CISO and cultivating a cloud security culture is imperative. Continue Reading

 

Cloud storage

Tips to maximize your cloud-based storage

Will your business benefit from using object storage systems rather than file storage systems when storing data in the cloud? Read the articles in this section to learn which cloud storage tools and applications are best for your organization, why hybrid cloud could be the future of cloud storage and why the OpenStack platform could be gaining ground.

News

Collier: The OpenStack platform has gained new life

At the OpenStack Summit in Boston, OpenStack Foundation COO Mark Collier explained why the nearly seven-year-old open source software platform is far from dead. Continue Reading

News

OpenStack public cloud adoption faster outside of U.S.

Providers in the Asian and European markets are turning to OpenStack to run their public clouds. Forrester analyst Lauren Nelson discusses why it's not happening stateside. Continue Reading

Tip

Cloud applications: Object storage vs. file storage

In this tip, Marc Staimer, founder and senior analyst at Dragon Slayer Consulting, discusses the various advantages of object storage systems over file systems when it comes to storing cloud data. Continue Reading

Video

Cloud storage FAQs

In this video, storage analyst George Crump discusses where cloud storage is most useful, whether object storage is necessary to gain the benefits of the cloud, how to integrate cloud into your environment and the benefits of multicloud environments. Continue Reading

Feature

Why hybrid cloud is driving the future of cloud storage

The appeal of cloud storage services has forced storage hardware and software vendors to provide ways for customers to access data externally, in a hybrid cloud model. Find out about the options involved. Continue Reading

Opinion

Maximizing your cloud-based storage

Analyst and consultant Mike Matchett provides tips to get the most from cloud-based storage devices -- and explains why a company's data storage manager can be of help. Continue Reading

Cloud data management

Cloud-based analytics: Understanding benefits and risks

Cloud data management provides benefits like accelerated technology deployments and reduced costs associated with both system maintenance and capital expenditure. Read the articles in this section to understand the benefits as well as the risks associated with cloud-based business analytics and how data center best practices translate to the cloud.

News

Cloud-based technologies to empower digital business

Law firm Mayer Brown predicts more companies will adopt cloud services to assist their digital business efforts and to reduce costs. Continue Reading

News

What CIOs can learn from the Amazon cloud outage

In the wake of the Amazon Web Services Simple Storage Service outage, industry watchers offer pointers on how CIOs should react to and prepare for future disruption. Continue Reading

Tip

Cloud-based BI risks and best practices

In this tip, data management consultant David Loshin identifies the risks associated with cloud-based data warehousing and business analytics and explains how to protect against them. Continue Reading

Feature

Adoption levels of managing and analyzing data in the cloud remain low

Find out why some businesses are enthusiastic, while some are hesitant when it comes to doing big data analytics and BI in the cloud. Continue Reading

 

Deployment models

Public vs. private vs. hybrid cloud: What's best for your company?

Are you cloud-ready? Which cloud model is best for you? Whether you choose public, private or hybrid cloud, knowing your options and business needs is vital to cloud success. This section highlights the benefits and drawbacks of a multicloud strategy, some basic cloud considerations and a cloud comparison to help you decide which model best suits your organization's needs.

News

Integration key to multicloud management

What makes a company multicloud? Judith Hurwitz, CEO of Needham, Mass., consulting outfit Hurwitz & Associates, explains it's about integrating, connecting and optimizing your on-premises and cloud IT operations to perform as one unit. Continue Reading

Blog Post

The pros and cons of a multicloud strategy

With multicloud strategies quickly gaining popularity among organizations, analysts at market research outfits Gartner and IDC weigh in on its benefits and drawbacks. Continue Reading

Feature

When thinking of creating a private cloud, start with why

A private cloud requires lots of money and the staff skills to pull it off. Before you get too far along the road to private cloud, you need to seriously consider the value it would bring and whether public cloud could suffice. Continue Reading

Feature

Evaluating the cost of private cloud development

A Forrester Research study found that public cloud business is booming. But what does the future of private cloud look like? Find out the challenges and high costs associated with building a private cloud. Continue Reading

 

Strategic cloud decisions

How to maximize your cloud benefits

Among the many benefits of the cloud, inventory management helps reduce costs and security risks, and negotiating a sound cloud contract helps mitigate cloud-related risks. With more organizations making cloud computing investments, this section highlights best practices for cloud inventory management, metrics that help reap maximum benefit from those investments, and the factors to consider before negotiating the terms and conditions of a cloud contract.

Feature

CIO's role in cloud inventory management

For effective cloud inventory management, a CIO's first step is to build better relationships with the company's decision-makers. Continue Reading

Feature

Employ better metrics to fine-tune your cloud-first strategy

Determining the benefits of a cloud-first strategy is difficult in large, complex IT environments. But there are metrics available to help CIOs. Continue Reading

Tip

Three cloud roles that can help CIOs drive business success

In a recent webinar, Gartner analyst Donna Scott detailed the three cloud roles that CIOs need to fill in order to gain cloud benefits: one for forming strategy, one for implementing that strategy and one for budgeting. Continue Reading

News

Software-as-a-service CIO seeks more than colocation from cloud and managed services

Read how one company weighed factors in the decision to move from colocation to cloud and managed services. Continue Reading

Feature

CIO peer review essential in cloud contract negotiations

CIOs and CTOs weigh in on what to consider during cloud contract negotiations, including cloud provider liabilities, cloud exit strategies and the importance of CIO peer review. Continue Reading

 

 

Glossary

Expand your cloud vocabulary

Before you start making cloud computing investments, get a handle on basic service-related cloud computing terms. Let this glossary be your guide.


Read more »



Jun
24
predictive maintenance (PdM)
Posted by Thang Le Toan on 24 June 2017 03:35 AM

Predictive maintenance (PdM) is the servicing of equipment when it is estimated that service is required, within a certain tolerance. PdM is used on railroads and industrial equipment, and in manufacturing plants and oil and gas processing.

 

Maintaining machinery and electronics is most cost-effective if done when it is needed. To that end, predictive maintenance systems are designed to ensure that servicing doesn't happen too soon, wasting money on unnecessary work, or too late, after wear and time have caused undue deterioration. PdM systems can also help plan an inventory for replacement parts and provide input on systems that need a design upgrade because of unacceptable performance.

PdM is enabled by advances in sensor and communication technologies that are part of the ongoing trends of automation and the Internet of Things (IoT), particularly the Industrial IoT. Those advances enable continuous monitoring and data analytics for mechanical and electrical conditions, operational efficiency and other performance indicators.

PdM works through the use of sensors, often tied together with PdM software on a wireless sensor and actuator network (WSAN). The software takes into account time, mileage or usage and the measured, sometimes minute changes from sensors that can indicate a need for service before reduced performance or degradation of the equipment occurs. These measurements are often performed on machines while they are running to reduce the impact on production.

Many measurements are taken to estimate the need for service, including:

  • Oil and oil contaminant analysis.
  • Sound and ultrasound analysis.
  • Vibration analysis.
  • Infrared analysis.
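The threshold logic behind measurements like these can be sketched simply. A minimal vibration-analysis check, where the RMS threshold and units are illustrative values, not industry severity limits:

```python
import math

def needs_service(vibration_samples, rms_threshold=0.35):
    """Flag equipment for service when vibration RMS exceeds a threshold.

    The threshold and implied units (e.g. mm/s velocity) are illustrative,
    not standardized severity values; real PdM software calibrates them
    per machine class and trends them over time.
    """
    # Root-mean-square amplitude of the sampled vibration readings.
    rms = math.sqrt(sum(v * v for v in vibration_samples) / len(vibration_samples))
    return rms > rms_threshold

# A spike in one reading pushes the RMS over the threshold:
print(needs_service([0.2, 0.25, 0.22, 0.9]))   # flagged for service
print(needs_service([0.1, 0.12, 0.11]))        # within tolerance
```

In practice the software compares such statistics against a per-machine baseline and its trend, rather than a single fixed cutoff, so service is scheduled before degradation rather than after.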

Read more »



Jun
22
kiosk
Posted by Thang Le Toan on 22 June 2017 06:56 AM

A kiosk (pronounced KEE-ahsk) is a small, free-standing physical structure that displays information or provides a service. Kiosks can be manned or unmanned, and unmanned kiosks can be digital or non-digital. The word kiosk is of French, Turkish and Persian origin and means pavilion or portico.

 

In business, kiosks are often used in locations with high foot traffic. In a shopping mall, for example, an unmanned, non-digital kiosk can be placed near entrances to provide people passing by with directions or promotional messaging. Manned kiosks temporarily set up in aisles can provide businesses that have seasonal sales cycles with a cost-effective way to display wares, and digital kiosks placed near movie theaters can provide online banking or ticket sales services.

Unmanned digital kiosks that provide customers with self-service capabilities are increasingly being used for such things as hotel check-in, retail sales check-out and healthcare screenings in pharmacies for vital signs such as blood pressure. Amazon and Walmart are currently experimenting with how to optimize the click-and-mortar experience, testing kiosks that dispense merchandise previously ordered online.

When an unmanned kiosk is programmed with software that incorporates artificial intelligence, it can provide customers with an experience that, in some cases, is quite similar to that of a manned kiosk. For example, an intelligent check-in kiosk at an airport can monitor a variety of data sources, including passenger check-in flow, and programmatically request additional kiosks to be activated in real time during busy periods.
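The airport scenario above reduces to a capacity rule: activate enough kiosks that the current queue clears within a target wait. A minimal sketch; the function name, service time and wait target are hypothetical illustration values, not any vendor's API:

```python
import math

def kiosks_needed(passengers_in_queue, avg_service_secs=90, target_wait_secs=300):
    """Estimate how many self-service kiosks should be active so the
    current check-in queue clears within the target wait time.

    All names and parameter values here are hypothetical illustrations;
    a production system would also factor in arrival-rate forecasts.
    """
    # Passengers one kiosk can serve within the target window.
    per_kiosk = max(1, target_wait_secs // avg_service_secs)
    return math.ceil(passengers_in_queue / per_kiosk)

# With 90s service and a 5-minute target, each kiosk absorbs 3 passengers:
print(kiosks_needed(10))  # 4 kiosks requested
```

Monitoring software would run a rule like this against live check-in flow data and request additional kiosks when the result exceeds the number currently active.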

 

Continue Reading About kiosk


Read more »



