cloud application
Posted by Thang Le Toan on 15 November 2017 12:13 PM

A cloud application, or cloud app, is a software program where cloud-based and local components work together. This model relies on remote servers for processing logic that is accessed through a web browser with a continual internet connection.


Cloud application servers typically are located in a remote data center operated by a third-party cloud services infrastructure provider. Cloud-based application tasks may encompass email, file storage and sharing, order entry, inventory management, word processing, customer relationship management (CRM), data collection, or financial accounting features.


Benefits of cloud apps

Fast response to business needs. Cloud applications can be updated, tested and deployed quickly, providing enterprises with fast time to market and agility. This speed can lead to culture shifts in business operations.

Simplified operation. Infrastructure management can be outsourced to third-party cloud providers.

Instant scalability. As demand rises or falls, available capacity can be adjusted.

API use. Third-party data sources and storage services can be accessed with an application programming interface (API). Cloud applications can be kept smaller by using APIs to hand data to applications or API-based back-end services for processing or analytics computations, with the results handed back to the cloud application. Vetted APIs impose passive consistency that can speed development and yield predictable results.
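The hand-off pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration: `analytics_api` stands in for a remote, API-based back-end service (in a real cloud app it would be an HTTPS call), and all names and fields are invented for the example.

```python
import json

# Stand-in for a remote, API-based back-end service; in a real cloud app
# this would be an HTTPS API call rather than a local function.
def analytics_api(payload: str) -> str:
    records = json.loads(payload)
    return json.dumps({"count": len(records),
                       "total": sum(r["amount"] for r in records)})

def cloud_app_summary(orders: list) -> str:
    # The cloud app stays small: it serializes the data, hands it to the
    # back-end service for computation, and renders the returned result.
    result = json.loads(analytics_api(json.dumps(orders)))
    return f"{result['count']} orders totaling ${result['total']}"

print(cloud_app_summary([{"amount": 20}, {"amount": 15}]))  # 2 orders totaling $35
```

The cloud application itself never implements the analytics logic; it only marshals data to and from the API.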

Gradual adoption. Refactoring legacy, on-premises applications to a cloud architecture in steps allows components to be implemented on a gradual basis.

Reduced costs. The size and scale of data centers run by major cloud infrastructure and service providers, along with competition among providers, has led to lower prices. Cloud-based applications can be less expensive to operate and maintain than equivalent on-premises installations.

Improved data sharing and security. Data stored on cloud services is instantly available to authorized users. Due to their massive scale, cloud providers can hire world-class security experts and implement infrastructure security measures that typically only large enterprises can obtain. Centralized data managed by IT operations personnel is more easily backed up on a regular schedule and restored should disaster recovery become necessary.


How cloud apps work

Data is stored and compute cycles occur in a remote data center typically operated by a third-party company. A back end ensures uptime, security and integration and supports multiple access methods.

Cloud applications provide quick responsiveness and don't need to permanently reside on the local device. They can function offline, but can be updated online.

While remaining under the provider's constant control, cloud applications don't always consume storage space on a computer or communications device. Assuming a reasonably fast internet connection, a well-written cloud application offers all the interactivity of a desktop application, along with the portability of a web application.

Cloud apps vs. web apps

As remote computing technology has advanced, the clear lines between cloud and web applications have blurred. The term cloud application has gained cachet, sometimes leading vendors to brand any application with an online aspect as a cloud application.

Cloud and web applications access data residing on distant storage. Both use server processing power that may be located on premises or in a distant data center.

A key difference between cloud and web applications is architecture. A web application or web-based application must have a continuous internet connection to function. Conversely, a cloud application or cloud-based application performs processing tasks on a local computer or workstation. An internet connection is required primarily for downloading or uploading data.

A web application is unusable if the remote server is unavailable. If the remote server becomes unavailable in a cloud application, the software installed on the local user device can still operate, although it cannot upload and download data until service at the remote server is restored.
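The offline behavior described above can be sketched as a small state machine. This is a hypothetical illustration, not any vendor's implementation: local saves are queued while the remote server is unreachable and synced once the connection is restored.

```python
# Sketch of the offline-capable cloud-app pattern described above
# (class and attribute names are invented for illustration).
class CloudApp:
    def __init__(self):
        self.online = False
        self.pending = []   # work saved while the server is unreachable
        self.server = []    # stand-in for remote cloud storage

    def save(self, doc: str):
        # Local processing always succeeds; the upload is deferred if offline.
        if self.online:
            self.server.append(doc)
        else:
            self.pending.append(doc)

    def reconnect(self):
        # Service restored: upload everything queued while offline.
        self.online = True
        self.server.extend(self.pending)
        self.pending.clear()

app = CloudApp()
app.save("draft-1")            # offline: queued locally, app keeps working
assert app.server == []
app.reconnect()                # connection restored: queued work is uploaded
assert app.server == ["draft-1"]
```

A web application has no equivalent of the `pending` queue; with no server, there is nothing to run.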

The difference between cloud and web applications can be illustrated with two common productivity tools, email and word processing. Gmail, for example, is a web application that requires only a browser and internet connection. Through the browser, it's possible to open, write and organize messages using search and sort capabilities. All processing logic occurs on the servers of the service provider (Google, in this example) via either the internet's HTTP or HTTPS protocols.

A CRM application accessed through a browser under a fee-based software as a service (SaaS) arrangement is a web application. Online banking and daily crossword puzzles are also considered web applications that don't install software locally.

An example of a word-processing cloud application installed on a workstation is Microsoft Word in Office 365. The application performs tasks locally on a machine without an internet connection. The cloud aspect comes into play when users save work to an Office 365 cloud server.

Cloud apps vs. desktop apps

Desktop applications are platform-dependent and require a separate version for each operating system. The need for multiple versions increases development time and cost, and complicates testing, version control and support. Conversely, cloud applications can be accessed through a variety of devices and operating systems and are platform-independent, which typically leads to significant cost savings.

Every device running a desktop application requires its own installation. Because it's not possible to enforce an upgrade whenever a new version is available, it's tricky to keep all users on the same one. The need to support multiple versions simultaneously can become a burden on tech support. Cloud applications don't face version control issues, since users can access and run only the version available on the cloud.

Testing of cloud apps

Testing cloud applications prior to deployment is essential to ensure security and optimal performance.

A cloud application must consider internet communications with numerous clouds and a likelihood of accessing data from multiple sources simultaneously. Using API calls, a cloud application may rely on other cloud services for specialized processing. Automated testing can help in this multicloud, multisource and multiprovider ecosystem.
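One common automated-testing technique for this multisource ecosystem is to isolate each remote dependency behind a mock. The sketch below is illustrative only: `enrich` and `geo_lookup` are invented names standing in for a cloud-app step and the third-party API it calls.

```python
from unittest import mock

def enrich(record: dict, geo_lookup) -> dict:
    """A cloud-app step that relies on another cloud service via an API call.
    geo_lookup stands in for a remote geolocation API (hypothetical)."""
    out = dict(record)
    out["region"] = geo_lookup(record["ip"])
    return out

# Automated test: swap the remote dependency for a mock so the test is fast,
# repeatable and independent of any one provider's availability.
geo = mock.Mock(return_value="eu-west")
result = enrich({"ip": "10.0.0.1"}, geo)
assert result["region"] == "eu-west"
geo.assert_called_once_with("10.0.0.1")
```

The same pattern scales to multiple providers: each external service gets its own mock, so a test run never depends on live internet communications.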


The maturation of container and microservices technologies has introduced additional layers of testing and potential points of failure and communication. While containers can simplify application development and provide portability, a proliferation of containers introduces additional complexity. Containers must be managed, cataloged and secured, with each tested for its own performance, security and accuracy. Similarly, as legacy monolithic applications that perform numerous, disparate tasks are refactored into many single-task microservices that must interoperate seamlessly and efficiently, test scripts and processes grow correspondingly complex and time-consuming.

Testing cloud application security includes penetration and data testing. Potential attack vectors, including advanced persistent threats, distributed denial-of-service (DDoS) attacks, phishing and social engineering, must also be examined.

Cloud applications must be tested to ensure processing logic is error-free. Test procedures may be required to conform to rules established by a given third-party provider.




IBM Cloud Private pulls from Big Blue's roots
Posted by Thang Le Toan on 10 November 2017 02:04 AM

IBM sticks close to its roots with IBM Cloud Private, which taps Big Blue's enterprise and middleware strengths to move customers from the data center to private cloud.

Despite continually working to reinvent itself, IBM never strays far from its roots, as evidenced by its move to bring cloud-native technology to the enterprise data center to accelerate digital transformation efforts.

Last week, IBM launched IBM Cloud Private, which enables enterprises to bring modern development technologies such as containers, microservices and APIs -- all attributes of public cloud environments -- to private clouds in the data center, where IBM has tenure as a leading technology provider.

Big Blue dominant in the data center

IBM has long held a dominant position in the data center, with its mainframe, database and middleware technology. Now, the company is launching off that base to help its enterprise customers in regulated industries or that have sensitive data -- such as healthcare, government and finance -- gain the benefits of cloud-native computing development tools and processes, portability and integration.

"As part of its private cloud offering, IBM's been enhancing its developer services in the form of an integrated DevOps tool chain via a service catalog featuring a range of runtimes, development frameworks, tools, middleware, OSS and other services," Charlotte Dunlap, an analyst with GlobalData, said. "This plays into IBM's intent to provide developers with the tools, languages and frameworks they're accustomed to using, e.g., extending services to Node.js or Swift developers."

Indeed, the new offering provides developers with access to a variety of management and DevOps tools, including application performance management, Netcool, UrbanCode and Cloud Brokerage. It also includes support for popular tools such as Jenkins, Prometheus, Grafana, and ElasticSearch.

Kubernetes at its core

Yet, it all starts with Kubernetes: IBM Cloud Private is built on the container orchestration platform and supports both Docker containers and Cloud Foundry.

Steve Robinson, general manager of IBM Hybrid Cloud, said that after several entries into the private cloud space with offerings such as Bluemix Local and others, Big Blue "took a clean sheet of paper and took a look at modern development technologies" and decided to base IBM Cloud Private on Kubernetes. "Then, we decided to bring our DevOps stack and middleware stack forward," he said.

IBM introduced container-optimized versions of its core middleware -- IBM WebSphere Liberty, Db2 and MQ messaging middleware -- to complement the new product.
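To make the idea of running containerized middleware under Kubernetes concrete, here is a minimal sketch that builds a Deployment manifest programmatically. The image name, labels and replica count are hypothetical, not IBM's actual packaging.

```python
import json

# Build a Kubernetes Deployment manifest for a containerized middleware
# image. All names here (app name, image, port) are invented for the example.
def liberty_deployment(name: str, image: str, replicas: int = 2) -> dict:
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name,
                                         "image": image,
                                         "ports": [{"containerPort": 9080}]}]},
            },
        },
    }

manifest = liberty_deployment("liberty-app", "example/websphere-liberty:latest")
print(json.dumps(manifest, indent=2))
```

Once such a manifest is applied, Kubernetes handles scheduling, scaling and restarts, which is the operational value a platform like IBM Cloud Private layers its tooling on top of.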


Positioning vs. competition


Meanwhile, some observers view IBM Cloud Private as IBM's answer to competing offerings such as Microsoft Azure Stack, which provides similar on-premises capabilities. However, IBM said that its strength in middleware and its foundation in enterprise systems set it apart.


"This better positions IBM against primary rivals which are Microsoft Azure Stack and VMware/Pivotal, with a cloud strategy that has evolved up the stack from [infrastructure as a service] to [platform as a service] and now to what they call 'enterprise transformation' -- meaning more personalized customer engagement capabilities fulfilled through technologies supporting multi-cloud, cognitive and API, and blockchain," Dunlap said of the new product. "IBM says 71% of its customers today use three or more clouds including public, private and departmental. Private remains their largest customer opportunity with complex requirements and latency issues."




Based on its own data, IBM estimated that customers will spend more than $50 billion annually on private cloud infrastructure beginning in 2017 and growing at 15% to 20% each year through 2020.


Microsoft's one big advantage in the segment is being able to do both public and private cloud almost seamlessly, said Rob Enderle, an industry expert and founder of the Enderle Group.


"Recently, Cisco and Google partnered to provide the same capability, and now IBM is moving at the same opportunity," he said. "IBM, like Cisco, should be particularly strong on the on-premises side of this and their execution with SoftLayer has been very strong of late resulting in what should be a very competitive offering. This should expand the available market for IBM's now hybrid solution significantly."


In a statement, Tyler Best, CTO of car rental giant Hertz, said, "Private cloud is a must for many enterprises, such as ours, working to reduce or eliminate their dependence on internal data centers." He added that a strategy of public, private and hybrid cloud is "essential" for large enterprises transitioning from legacy systems to the cloud.


With such a big opportunity at stake, every cloud vendor is positioning itself to capture as much of the wave of enterprise interest in Kubernetes as possible onto its own platform, said Rhett Dillingham, an analyst at Moor Insights & Strategy. And with IBM Cloud Private, IBM is providing its Kubernetes-based platform for use on private infrastructure with the integrated value of its investment in complementary management and developer tooling.


"As part of this, IBM is offering new containerized versions of its software and development frameworks, because it has a big opportunity to help its existing software customers transition to cloud by modernizing their management of IBM WebSphere Liberty-, Db2- and MQ-based applications using containers via Kubernetes," Dillingham said. "This is a key opportunity for IBM in bridging from leading provider for traditional enterprise applications to leading provider for cloud-modernized and cloud-native applications on its IBM Cloud Private and IBM Public Cloud offerings."

Sticking to its knitting

So, with IBM Cloud Private, IBM is sticking to its knitting while helping to advance its enterprise customers with modern development tools.

"IBM Cloud Private extends the value of customers' existing IBM investments rather than being a new, on-premises cloud platform, like Microsoft's Azure Stack," said Charles King, principal analyst at Pund-IT.

The primary benefit of this offering is it enables enterprises to take advantage of the investments they've already made in existing systems, applications and data by bringing them into an elastic cloud platform.

"This will help accelerate application development, more easily expose these applications to new public cloud services and even provide the option of moving applications to the public cloud," said Michael Elder, distinguished engineer for the IBM Cloud Private platform. "We also think it sets an enterprise up with a powerful new tool for workload portability from their datacenter to the public cloud."

The platform provides tools to help bootstrap new applications into containers and enable existing applications for the cloud, he noted.

"We also build IBM Microservice Builder into the platform, which offers a preconfigured Jenkins CI service that builds container images and publishes them to the built-in image registry right out of the box," Elder said.

The system also includes other management and security features, such as multi-cloud management automation, a security vulnerability advisor, data encryption and privileged access, and more.

Moreover, IBM Cloud Private supports Intel-based hardware from Cisco, Dell EMC, Lenovo and NetApp, and it can be deployed via VMware, Canonical and other OpenStack distributions.



VeloCloud-VMware acquisition will battle Cisco in the branch
Posted by Thang Le Toan on 06 November 2017 11:36 PM

The VeloCloud-VMware acquisition marks the first time VMware will compete directly with Cisco in networking. Cisco, however, remains the 800-pound gorilla.

VMware plans to acquire SD-WAN vendor VeloCloud Networks, a move that would turn the branch office into a battleground for the virtualization provider and Cisco.

The VeloCloud-VMware acquisition, announced this week, is expected to close by early February. With VeloCloud, VMware would go head-to-head against Cisco's Viptela, IWAN and Meraki brands. SD-WAN, in general, intelligently routes branch traffic across multiple links, such as broadband, MPLS and LTE.
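The routing idea behind SD-WAN can be illustrated with a toy policy engine. This is a deliberately simplified sketch: the link names, metrics and scoring rules are invented, and real SD-WAN products measure link quality continuously rather than from a static table.

```python
# Toy SD-WAN path selection across multiple branch links
# (link inventory and weights are invented for the example).
LINKS = {
    "mpls":      {"latency_ms": 20, "cost": 10, "up": True},
    "broadband": {"latency_ms": 35, "cost": 2,  "up": True},
    "lte":       {"latency_ms": 60, "cost": 8,  "up": True},
}

def pick_link(realtime: bool) -> str:
    """Prefer low latency for real-time traffic, low cost otherwise."""
    up = {name: m for name, m in LINKS.items() if m["up"]}
    key = "latency_ms" if realtime else "cost"
    return min(up, key=lambda name: up[name][key])

assert pick_link(realtime=True) == "mpls"        # voice/video: lowest latency
assert pick_link(realtime=False) == "broadband"  # bulk traffic: cheapest link
```

If a link goes down, it simply drops out of the candidate set and traffic shifts to the next-best path, which is the resilience argument for running branches over multiple links.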

"This is the first time that Cisco and VMware will directly compete in the networking world," said Shamus McGillicuddy, an analyst at Enterprise Management Associates, based in Boulder, Colo.

Before, the closest Cisco and VMware came to competing in networking was with their software-defined networking platforms ACI and NSX, respectively. The products, however, serve mostly different purposes in the data center. NSX provisions network services within VMware's virtualized computing environments while ACI distributes application-centric policies to Cisco switches.

VMware SDN marches to the branch

The VeloCloud-VMware acquisition, however, marks the start of taking NSX to the branch, where Cisco is already offering ACI. Both vendors are also working on extending their respective SDN platforms to enterprise software running on public clouds.

In the branch, VMware plans to provide SD-WAN, security, routing and other services on an NSX-based network overlay that's hardware agnostic. Rather than supply branch appliances for VeloCloud software, VMware wants customers to buy certified hardware from different vendors.


"That is certainly our longer-term vision for this. That it will be a pure software play," said Rajiv Ramaswami, COO of cloud services at VMware, during a conference call with reporters and analysts.

In the short term, VMware would support appliances sold by VeloCloud, Ramaswami said. VMware's parent company, Dell EMC, also sells hardware for VeloCloud software.

While VMware shies away from hardware, Cisco has delivered centralized software that provisions network services to the branch through its new Catalyst 9000 product line. In the future, Cisco could also provide a software-only option through the Enterprise Network Functions Virtualization (ENFV) platform the company introduced last year. ENFV would run on Cisco servers or third-party certified hardware.

"Cisco is making multiple bets in SD-WAN," McGillicuddy said.

Cloud orchestration a key piece of VeloCloud-VMware acquisition

VMware is banking on VeloCloud's cloud-based network orchestration tools to evolve into a significant differentiator from Cisco and other WAN infrastructure providers. VMware could eventually use the technology to orchestrate network services in the branch and the cloud, Ramaswami said.




VMware's ambitions do not alter the fact that it has a difficult road ahead battling Cisco. The latter company dominates the networking market with more than 150,000 paying customers for its WAN products, according to Gartner. VMware is the largest supplier of data center virtualization, but is a newbie in networking.

VeloCloud's roughly 1,000 customers include service providers, as well as enterprises. AT&T, Deutsche Telekom, Sprint, Vonage and Windstream are examples of carriers that offer the company's SD-WAN product as a service.

VMware sells network virtualization software to service providers and expects VeloCloud to help grow that relatively small business. "VeloCloud and their deep relationship with the service provider community is a huge route-to-market accelerator," said Peder Ulander, a vice president of strategy at VMware.


Docker-supported OS list expands with Enterprise Edition update
Posted by Thang Le Toan on 20 August 2017 11:20 PM

Docker Enterprise Edition fired back at Kubernetes with new support for mixed clusters and applications, as well as advanced security features that target large enterprises.

Docker Enterprise Edition has strengthened its case for large IT buyers of container orchestration tools, with new OS support, security and policy-based automation features.

Docker-supported OS types now include IBM z Systems mainframe OSes and Microsoft Windows Server 2016, as well as mixed clusters and applications that run on mainframes, Windows and Linux. Fine-grained, role-based access control and policy-based automation for container images through a DevOps pipeline also are part of this August Docker Enterprise Edition release.

With the addition of these Docker-supported OS features, Windows and Linux containers, as well as mainframe-based ones, can share a cluster of hosts. With this release, mixed OS containers can also be stacked, using a newly developed overlay network, into hybrid applications that may mix, for example, Apache Tomcat servers with Microsoft SQL Server databases.

This will be a key feature for enterprise IT shops that plan to move to container orchestration in the next year or two and use it to modernize legacy applications, said Chris Riley, director of solutions architecture at cPrime, an Agile software development consulting firm in Foster City, Calif.

"Deep container adoption within traditional enterprises is in its formative stages," Riley said. "The addition of z Systems and Windows [Server] native support will show benefits in the next couple of years, as companies upgrade their Windows infrastructure and coordinate that with their mainframe systems."

Mainstream enterprises aren't yet demanding hybrid clusters and applications, according to analysts. However, Docker officials have said HR software giant ADP -- one of the primary beta testers of this Docker Enterprise Edition release -- already mixes and matches Docker-supported OS workloads.

"Typically, these applications are managed separately, but as enterprises move to microservices and DevOps, the ability to manage applications with the same process, regardless of operating system, will be desirable," said Jay Lyman, analyst at 451 Research.

Enterprises also want to run hybrid cloud infrastructures; this portends a future in which such infrastructures are much more flexible and container portability means apps can run anywhere. Docker seems attuned to this with the features it's chosen for this release, Lyman said.

Enterprises that want these abilities from Docker Enterprise Edition should be prepared to open their wallets. Some of the most advanced features introduced in the August 2017 release -- such as node-based security isolation for multi-tenant environments, policy-based container image promotion in DevOps pipelines and continuous security vulnerability scanning -- require Docker Enterprise Edition Advanced licenses, which are priced at $3,500 per node, per year. Advanced licenses also must be purchased separately for Windows and Linux servers.

The pricing makes it clear that Docker is going after "big fish" customers, Lyman said. "They're clearly looking to drive larger deal sizes, as is the Kubernetes community of vendors -- and that's driving intense competition, as well as innovation."

Kubernetes complexity makes IT shops look twice at Docker

The Docker Enterprise Edition update comes weeks after rival container orchestration platform Kubernetes made its own appeal to enterprise IT shops with granular network security and stateful application support in June's version 1.7.

"These two are increasingly competing and evolving together," 451's Lyman noted. "To some extent, you see [the Kubernetes community and Docker] making moves responsive to what the other is doing."

Kubernetes and the many commercial container orchestration packages that bundle it for enterprises, such as CoreOS's Tectonic and Red Hat's OpenShift, boast reference customers that include Experian, Deutsche Bank, BMW and T-Systems. But big companies also came out in favor of Docker's container orchestration this year, from ADP to Hyatt Hotels and The Northern Trust Company. While Kubernetes was an early mover in the container orchestration space and is backed by the experience of web-scale companies such as Google, Docker has made advanced security features generally available in its products, while many in the Kubernetes community remain in beta.

For some enterprises, Docker swarm mode appeals in contrast to the reputation Kubernetes has for management complexity. One such firm is Rosetta Stone, which evaluated Docker swarm mode against Kubernetes and concluded that Kubernetes would be "overkill" for its container orchestration needs.

"Each of our microservices is crazy simple -- just web apps," said Kevin Burnett, DevOps lead for the global education software company in Arlington, Va. "We want to use the simplest possible orchestration tool that supports our use case."

Docker container orchestration also appeals to enterprises because it comes from the same vendor that popularized Linux containers. Adding swarm mode to Docker Engine means that much of Docker's container orchestration is already installed with the infrastructure Rosetta Stone runs.

However, the company is not inclined to pay the price for the advanced features in Enterprise Edition, and it likely would adopt the open source Community Edition, Burnett said.

"The features they're adding in this release were not for customers like us, in my estimation," Burnett said. Rosetta Stone has some Windows infrastructure it acquired with another company, but is moving away from that and doesn't have mainframe workloads.

"The security stuff seems nice, but it doesn't seem like they've added major features and wouldn't tip the scales," Burnett said.



Hadoop data governance takes hold in companies as data gets 'bigger'
Posted by Thang Le Toan on 21 July 2017 02:08 AM

LinkedIn, Cleveland Clinic and fitness company Beachbody are examples of organizations that have increased their data governance efforts in connection with big data applications.

When LinkedIn Corp. was a smaller company, it didn't matter so much internally how data captured from its social networking website for analysis was formatted and structured.

"You could really log anything and access it later," said Yael Garten, LinkedIn's director of data science. That let data scientists work quickly on analytics applications, she added, without having to worry about any data inconsistencies that might result.

But things changed, as the company and the amount of data it generated grew rapidly. Now, people see the wisdom of better governing the data in LinkedIn's Hadoop environment so it's standardized throughout the analytics cycle, Garten explained. Otherwise, "it becomes a nightmare when you have hundreds of teams emitting data and hundreds of teams consuming data," she noted. That's particularly true, she said, if data is stored schema-free -- a lesson that LinkedIn learned early on.
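The lesson about schema-free data can be illustrated with a small validator that rejects events before they are emitted. This is a hypothetical sketch in the spirit of standardizing logged data across teams, not LinkedIn's actual Unified Metrics Platform; the field names are invented.

```python
# Enforce a shared event schema at the point where data is emitted,
# instead of logging schema-free records (field names are hypothetical).
REQUIRED = {"event": str, "user_id": int, "ts": float}

def validate(event: dict) -> dict:
    """Raise ValueError unless every required field is present and typed."""
    for field, typ in REQUIRED.items():
        if not isinstance(event.get(field), typ):
            raise ValueError(f"bad or missing field: {field}")
    return event

validate({"event": "page_view", "user_id": 42, "ts": 1510000000.0})  # passes
```

A producer that must pass validation before logging gives every consuming team a guaranteed shape to code against, which is exactly what becomes impossible once hundreds of teams emit data in ad hoc formats.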

Tools of the data governance trade


LinkedIn's Hadoop data governance process includes an internally developed system called the Unified Metrics Platform, which facilitates development of consistent metrics data for reporting uses. Garten also pointed to a data model review committee that evaluates whether models will successfully produce the specified data. And she cited another homegrown technology called Dali that provides a common API into Hadoop data sets for both data producers and users at the Mountain View, Calif., company, now owned by Microsoft.


Cleveland Clinic has made data governance a bigger priority in connection with a big data deployment as well. Eric Hixson, its senior program administrator for business intelligence, said the Cleveland-based health system created a formal data governance program last year after expanding from a conventional data warehouse architecture to one that also includes Hadoop, advanced analytics software, self-service BI tools and other technologies.

The new architecture, modeled after a logical data warehouse concept outlined by Gartner, was accompanied by a change in Cleveland Clinic's internal culture to make the health system more data-driven and position it to use analytics as a competitive differentiator, Hixson said during a presentation at the 2017 TDWI Leadership Summit in Las Vegas. To support that premise, the data governance initiative is aimed at upgrading risk management capabilities and improving data quality and usability, he added.

All pumped up for data governance

The deployment of a cloud-based data lake last December also prompted new Hadoop data governance processes at Beachbody LLC, a maker of fitness and nutrition products based in Santa Monica, Calif.

The big data system runs in the Amazon Web Services cloud and includes Hive and the Spark processing engine in addition to Hadoop, said Eric Anderson, Beachbody's executive director of data. It gives the company's data scientists and analysts self-service access to more types of data than they could get from its existing Oracle data warehouse, including website activity data, workout-video streaming logs and call-center records. They can also access more sensitive data than before, and at a more granular level. "Those are all governance challenges for us," Anderson said.


Data governance and usage policies have been documented for users of the data lake platform, he noted. Anderson's team has also created a data inventory that lists what's available in the system, along with a data dictionary and another document with data lineage information. That's all posted on a web portal to make the system "more transparent" to the users, Anderson explained. He added that the documentation "is more of an intermediate step than we maybe would have done before" in the data warehouse environment since there's less data to deal with there.

More and more organizations may well find themselves taking similar intermediate steps on big data and Hadoop data governance in the years ahead. William McKnight, president of McKnight Consulting Group, compared data to an abundant natural resource in a keynote speech at the Enterprise Data World 2017 conference in Atlanta. "We're not going to run out of data, but we might get overwhelmed by it," McKnight said, pointing to the growing importance of effective data management processes.




Unified communications (UC) is the integration of communication technologies to help employees exchange ideas and do their jobs more effectively. An effective UC plan can help tie a variety of interoperable communication tools to business processes and applications.

Popular UC components include:

Voice - Most UC offerings are voice-centric because the leading vendors have deep roots in telephony.

Conferencing and collaboration - In addition to audio, video and Web conferencing, these components include collaboration features such as shared virtual workspaces, whiteboarding, file sharing and document sharing.

Presence technology - Presence servers gather presence information from various sources and provide unified presence information to end users or applications.

Instant messaging - Enterprise IM systems offer security and privacy that public IM services cannot.

Speech access and virtual assistants - Virtual assistants provide intelligent screening and allow end users to filter messages and access calendars, contacts, voice and video through voice command.

Mobility - Integrating mobile users' voice and real-time communications services with core enterprise communications lets them do their jobs regardless of location.

Unified messaging - Unified messaging (UM) integrates voice, fax and email messages and message notification. Most UM products add a variety of advanced call and message management functions, including desktop call screening of inbound calls, find me/follow me, live reply or call return, and cross-media messaging.
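To illustrate the presence component above, here is a minimal sketch of how a presence server might merge reports from several sources into one unified state for subscribers. The source names and the priority ordering are assumptions for the example, not any vendor's scheme.

```python
# Minimal sketch of presence aggregation: a presence server collects
# per-source states and publishes a single unified state per user.
# Sources and the precedence ordering are illustrative assumptions.

# Higher number = higher precedence when sources disagree.
PRIORITY = {"offline": 0, "available": 1, "away": 2, "busy": 3, "in-call": 4}

def unified_presence(source_states):
    """Combine presence reports from several sources (desk phone,
    IM client, calendar, mobile) into one state for subscribers."""
    if not source_states:
        return "offline"
    return max(source_states.values(), key=lambda s: PRIORITY[s])

reports = {"desk-phone": "in-call", "im-client": "available", "calendar": "busy"}
print(unified_presence(reports))  # "in-call" wins: the user is on a call
```

A real presence server would also push updates to watchers as states change; the point here is only the aggregation step, where conflicting reports resolve to the most restrictive state.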

An important aspect of every successful UC deployment is the integration of communication and collaboration technology with business processes and workflow applications. In addition to reducing the need for employees to use shadow IT, a unified communication plan can reduce bottlenecks that occur when a person or program must wait for human input. For example, UC technologies can be used to automate contact with the next person in a sequence of steps or facilitate setup for an ad hoc meeting with geographically dispersed attendees.
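The "contact the next person in a sequence" pattern can be sketched as a small workflow that uses presence to skip past unavailable people. Everything here is hypothetical: the names, the presence states, and the `notify()` stub standing in for a real UC platform API.

```python
# Hypothetical sketch: a UC-integrated workflow that contacts the next
# approver in a chain, skipping anyone whose presence shows they are
# unavailable. Names, states, and notify() are assumptions.

APPROVERS = ["alice", "bob", "carol"]          # ordered approval chain
PRESENCE = {"alice": "offline", "bob": "in-call", "carol": "available"}

def next_reachable_approver(chain, presence):
    """Return the first approver whose presence suggests they can respond."""
    for person in chain:
        if presence.get(person) == "available":
            return person
    return None  # nobody reachable; queue the request instead

def notify(person, message):
    # In a real deployment this would call the UC platform's API
    # (IM, voice, or email, depending on the user's preferences).
    return f"IM to {person}: {message}"

target = next_reachable_approver(APPROVERS, PRESENCE)
if target:
    print(notify(target, "A purchase order awaits your approval"))
```

This is the bottleneck reduction described above: instead of a request sitting in one person's queue, the workflow consults presence and routes the interruption to someone who can act now.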

Unified communication systems can be deployed in-house, in the cloud or as hybrid services. In a unified communications as a service (UCaaS) delivery model, communication and collaboration applications and services are outsourced to a third-party provider and delivered over an IP network, usually the internet. UCaaS is known for providing high availability (HA) as well as flexibility and scalability for core business tasks.



Hyperconnectivity is a state of unified communications (UC) in which the traffic-handling capacity and bandwidth of a network always exceed the demand. The number of communications pathways and nodes is much greater than the number of subscribers. All devices that could conceivably benefit from being connected to a network are in fact connected.
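The definition above amounts to two testable conditions: capacity exceeds demand, and pathways/nodes greatly outnumber subscribers. The sketch below expresses that as a simple check; the numbers and the 10:1 pathway ratio are arbitrary assumptions chosen for illustration.

```python
# Illustrative check of the hyperconnectivity conditions described above:
# aggregate capacity must exceed aggregate demand, and pathways must
# greatly outnumber subscribers. The threshold ratio is an assumption.

def is_hyperconnected(capacity_mbps, peak_demand_mbps, pathways, subscribers,
                      pathway_ratio=10):
    """True if the network satisfies a simple hyperconnectivity test."""
    capacity_ok = capacity_mbps > peak_demand_mbps
    pathways_ok = pathways >= pathway_ratio * subscribers
    return capacity_ok and pathways_ok

print(is_hyperconnected(capacity_mbps=10_000, peak_demand_mbps=4_000,
                        pathways=50_000, subscribers=1_000))  # True
```

In practice operators track these ratios over time rather than as a one-shot test, since demand growth is what erodes the "always exceeds" guarantee.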

In the ultimate hyperconnected infrastructure, electronic and computer devices of all kinds can communicate among each other to whatever extent each individual user desires. Such devices can include:

  • Personal computers (PCs)
  • Personal digital assistants (PDAs)
  • MP3 devices
  • Cellular telephones
  • Television receivers
  • Personal radios
  • Global positioning system (GPS) receivers
  • Digital cameras and other video devices
  • Virtual reality (VR) and telepresence systems
  • Fixed and mobile robots
  • Radio-frequency identification (RFID) tags
  • Motor vehicles, boats and aircraft
  • Medical devices
  • Industrial, farm and ranch equipment
  • Common home appliances

The term hyperconnectivity was first used by Barry Wellman and Anabel Quan-Haase to describe the evolving state of communications in society. Nortel Networks has embraced the hyperconnectivity paradigm to define its person-to-person, person-to-machine and machine-to-machine communications systems.


For reference, the 2017 trends:

2017 Trends in Unified Communications & Collaboration

Irwin Lazar of Nemertes Research predicts the six collaboration trends we will see this year, and then we expand on some of those in this exclusive e-guide. Don't miss out on key UCC opportunities: become a member now and get this complimentary download.

Read more »

Certification Central - CISSP®
Posted by Thang Le Toan on 04 October 2016 10:44 PM

Studying for, obtaining and maintaining your CISSP® certification has now become more convenient with SearchSecurity.

Earn your CISSP® with (ISC)2®

CISSP certification is globally recognized as a standard of achievement for security professionals. Today, many large corporations and government agencies require the certification for certain positions, giving CISSPs higher earning potential and greatly expanded career opportunities. SearchSecurity, in partnership with global information security educator and certification leader (ISC)2, now provides information security professionals with the tools and resources to earn and maintain CISSP certification.

Step 1: Test your knowledge

First, test your knowledge by taking the CISSP practice test.  This is a free benefit to members. We encourage you to come back often while you are studying for the CISSP.

Step 2: Study for the CISSP test

Step 3: Practice the CISSP test

This practice session will offer you a preview of 20 questions pulled straight from previous CISSP exams to give you a sneak peek of what the certification exam entails.

Come back daily for a new batch of questions and check out our related study resources to help boost your score.


As a member of SearchSecurity, you have free access to our database of CISSP practice test questions, presented in cooperation with (ISC)2.


Step 4: Schedule your exam date

You can schedule your exam with Pearson VUE, (ISC)2's testing partner.

Already a CISSP? SearchSecurity has the resources you need to earn your CPEs -- including these (ISC)2-approved methods:

  • Attend a local, live seminar with industry experts.
  • Go online and participate in one of our Virtual Events.
  • Stay informed with Information Security magazine.
  • Use our free online training courses.

Learn more about SearchSecurity's CPE options.

What is CISSP?

  • To learn more about CISSP certification check out their website.
  • And for more information, please email (ISC)2 Education or call +1.866.462.4777 (toll-free in North America only) or +1.703.891.6781 outside the United States.
This was last published in January 2013.

Read more »
