News
Mar 20
Sage adds Intacct financial management software to its ERP
Posted by Thang Le Toan on 20 March 2018 12:52 AM

Sage says the move will boost its cloud financial management software and U.S. presence. Analysts think it's a good technology move but are unsure about the market impact.

Sage Software intends to expand both its cloud offerings and its customer base in North America.

Sage, an ERP vendor based in Newcastle upon Tyne, U.K., is acquiring Intacct, a San Jose-based vendor of financial management software, for $850 million, according to the company.

Sage's core products include the Sage X3 ERP system, the Sage One accounting and invoicing application and Sage Live real-time accounting software. The company's products are aimed primarily at SMBs, and Sage claims that it has just over 6 million users worldwide, with the majority of these in Europe.

Intacct provides SaaS financial management software to SMBs, with most of its customer base in North America, according to the company.

The move to acquire Intacct demonstrates Sage's determination to "win the cloud" and expand its U.S. customer base, according to a Sage press release announcing the deal.

"Today we take another major step forward in delivering our strategy and we are thrilled to welcome Intacct into the Sage family," Stephen Kelly, Sage CEO, said in the press release. "The acquisition of Intacct supports our ambitions for accelerating growth by winning new customers at scale and builds on our other cloud-first acquisitions, strengthening the Sage Business Cloud. Intacct opens up huge opportunities in the North American market, representing over half of our total addressable market."

Combining forces makes sense for Intacct because the company shares the same goals as Sage, according to Intacct CEO Robert Reid.

"We are excited to become part of Sage because we are relentlessly focused on the same goal -- to deliver the most innovative cloud solutions for our customers," Reid said in the press release. "Intacct is growing rapidly in our market and we are proud to be a recognized customer satisfaction leader across midsize, large and global enterprise businesses. By combining our strengths with those of Sage, we can jointly accelerate success for our customers."

Intacct brings real cloud DNA to financial management software

Intacct's specialty in cloud financial management software should complement Sage's relatively weak financial functionality, according to Cindy Jutras, president of the ERP consulting firm Mint Jutras.

"[Intacct] certainly brings real cloud DNA, and a financial management solution that would be a lot harder to grow out of than the solutions they had under the Sage One brand," Jutras said. "It also has stronger accounting than would be embedded within Sage X3. I would expect X3 to still be the go-to solution for midsize manufacturers since that was never Intacct's target, but Intacct may very well become the go-to ERP for service companies, like professional services."

Jutras also mentioned that Intacct was one of the first applications to address the new ASC 606 revenue recognition rules, something that Sage has not done yet. Sage's cloud strategy has been murky up to this point, and Jutras was unsure whether this move would clarify it.

"It doesn't seem any of its existing products -- except their new Sage Live developed on the Salesforce platform -- are multi-tenant SaaS and up until recently they seemed to be going the hybrid route by leaving ERP on premises and surrounding it with cloud services," she said.

The deal should strengthen Sage's position in the SMB market, according to Chris Devault, manager of software selection at Panorama Consulting Solutions.

"This is a very good move for Sage, as it will bring a different platform and much needed technology to help Sage round out their small to mid-market offerings," Devault said.

Getting into the U.S. market

Overall it appears to be a positive move for Sage, both from a technology and market perspective, according to Holger Mueller, vice president and principal analyst at Constellation Research Inc.

"It's a good move by Sage to finally tackle finance in the cloud and get more exposure to the largest software market in the world, the U.S.," Mueller said. "But we see more than finance moving to the cloud, as customers are starting to look for or demand a complete suite to be available on the same platform. Sage will have to move fast to integrate Intacct and get to a compelling cloud suite roadmap."

Time will also tell if this move will position Sage better in the SMB ERP landscape.

"It's early to say, but it puts them in the SMB category with Oracle NetSuite, FinancialForce, Epicor and Acumatica at the lower end," Mueller said.

Mar 20
The risk analytics software your company really needs
Posted by Thang Le Toan on 20 March 2018 12:49 AM

Risk analytics tools are more and more critical for CFOs seeking to improve operational efficiency. Just one problem: It can be hard to figure out just what those tools are.

With the use of big data now a documented phenomenon across corporate America, risk analytics -- that is, using analytics to collect, analyze and measure real-time data to predict risk and make better business decisions -- is also becoming more popular.

That's according to Sanjaya Krishna, U.S. digital risk consulting leader at KPMG in Washington, D.C.

By using risk analytics software, CFOs can improve operational efficiency and keep their companies' risk exposure at acceptable levels. But where exactly does a CFO go to "get" risk analytics tools?

The search for risk analytics software

"Risk analytics is a fairly broad term, so there are a number of things that come to mind when we talk about risk analytics," Krishna said. "There are a number of specialized risk analytics products. There are also broader analytic packages that can … 'check the risk analytics box' to a certain extent, though the package isn't built to be a risk analytics solution."

There are products, such as KPMG Risk Front, that focus on providing customized risk analytics based on public internet commentary, Krishna said. And KPMG's Continuous Monitoring product provides for customized risk analytics based on internal transactional data.


There are also a number of established enterprise governance, risk and compliance packages that provide companies a way of housing and analyzing all sorts of identified risks at the enterprise level or within certain business areas, he said.

Finally, there are highly specialized, industry-specific risk analytic tools, especially in the financial services industry, according to Krishna.

Risk analytics tools, regardless of the industry, have been around for a while, said Danny Baker, vice president of market strategy, financial and risk management solutions at Fiserv Inc., a provider of financial services technology based in Brookfield, Wis.

"They have historically been purposed for less strategic items -- they were seen as just a checkbox to please the regulators," he said.

Now, though, risk analytics software has transitioned and evolved from tactical, point solutions to helping organizations optimize their strategic futures.

"Especially for banks and credit unions, risk analytics tools are focused more on strategy and the need to integrate with other departments, like finance," Baker said. "The integration across departments is key."

But it's not just the tools that are important.

Sometimes a company may even use a database as a risk analytics tool, said Ken Krupa, enterprise CTO at MarkLogic Corp., an Enterprise NoSQL database provider in San Carlos, Calif.

Taking the broad approach to the data quality issue

"There are, indeed, specialized products, as well as packages that play a role in risk analytics," Krupa said. "These third-party suites of tools do a lot of the math on where there are risks, but if the math is based on bad or incomplete data, risk cannot be adequately addressed."

What's more, oftentimes, a company doesn't have a clear picture of the quality of the data that it's working with because making that data available from upstream systems depends on complex extract, transform and load (ETL) processes supported by a large team of developers of varying skill sets, he said.

Therefore, there's actually an inherent risk in not having transparent access to a 360-degree view of the data -- mainly caused by data in silos. However, leveraging a database that can successfully integrate the many silos of data can go a long way toward minimizing data quality risks, according to Krupa.

"You may not initially think of a database as a risk analytics tool, but the right kind of database serves a critical role in organizing all of the inputs that risk analytics tools use," he said. "The right type of database -- one that minimizes ETL dependency and provides a clear view of all kinds of data, like that offered by MarkLogic -- can make risk analytics better, faster and with less cost."

Anand Venugopal, head of StreamAnalytix product management and go-to-market at Impetus Technologies Inc. in Los Gatos, Calif., concurred with Krupa that bringing all a company's data into one place is critical to enabling better risk-based business decisions.

Since many organizations are in the process of modernizing their infrastructures -- particularly around analytics platforms -- they are moving away from point solutions if they can, he said.

The new paradigm is to bring together all the relevant information -- or at least to have the mechanisms to bring it together on demand -- and then do the analytics in one place, Venugopal said.

"So, what is beyond proven is that analytics and decision-making [are] more accurate not with more advanced algorithms, but with more data, i.e., diverse data, and more data sources, i.e., 25 different data sources as opposed to five different data sources," he said.

It all points to the fact that even with moderate algorithms, more data gives organizations better results than trying to use "rocket science algorithms" with limited data, Venugopal said.

"What that means to enterprise technology is that they are building risk platforms on top of the modern data warehouses, which combines a variety of internal and external data sets, and trying to combine real-time data feeds -- real-time triggers, real-time market factors, currency risk, etc. -- which was not part of the previous generation's capabilities," he said.

Single-point products can only address limited portions of this because that's how they're designed; enterprise risk can only be covered with a broader approach, according to Venugopal.

"I think the trend [in enterprises] is more toward building sophisticated risk strategies and applications, and they're building out those and they're using core big data technology components like the Hadoop stack, like the Spark stack and tools like Impetus' Extreme Analytics," he said.

Custom risk analytics software and other considerations

Organizations looking to implement technology to mitigate risk have to consider a few additional things, including the usability and feature set, according to Rajiv Shah, senior solutions architect for GigaSpaces Technologies Inc. in New York City.


"For instance, high-volume traders need a solution that won't interfere with the data sync that is critical to being up to the microsecond," he said.

A product that offers multilevel dashboarding is also key, according to Shah.

For example, the data a CFO needs to know is far different than, say, what a risk or compliance officer needs to know, he said.

"Enterprises should consider a solution that takes these differences into account, making sure that a dashboard can become detailed and granular, while also offering a 50,000-foot view," Shah said. "And a strong risk mitigation strategy and tool set should be able to identify and simulate a wide range of scenarios."

According to Fiserv's Baker, it's important that a risk mitigation technology doesn't hinder a company's regular operations.

"For larger organizations, it often becomes critical to build your own solution to meet the needs," he said.

Mike Juchno, partner at Ernst & Young Advisory Services, agreed that there is a custom tool component to risk analytics.

"Many of our clients already have these tools -- they're some sort of predictive analytics tool like SPSS, like SAS, like R, and some visualization on top of them, like Tableau or Power BI," he said. "So, we are able to build something custom to deal with a risk that may be unique to them or their industry or their particular situation. So, we typically find that it's a custom approach."

When it comes to looking for an off-the-shelf product, CFOs often hear about risk analytics tools from peer organizations -- groups that come together to share information about tools.

"Of course, you're going to also look toward other companies or competitors that are doing risk management and performance management well and see what tools they have in place," Baker said. "The most high-performing clients I see embed their tools into not only solving current risk, but also expecting and forecasting future risk."

Although an organization can go to Fiserv and ask for a menu of risk analytics tools, it's more successful if both the company and Fiserv drill down into what the organization is trying to accomplish and customize the tools from there, according to Baker.

Most organizations want to make better strategic decisions and improve their forward-looking strategic discipline and processes, as the challenges of growth are greater now, he said.

The focus has shifted to agility and efficiency when implementing risk analytics tools, Baker said.

"The high-performing Fiserv clients I work with have integrated risk analytics tools into finance operations," he said. "These advanced solutions offer an integrative solution that also forecasts and plans for the strategic future."

Organizations are increasingly being thoughtful with their risk processes, he said. And in recent years questions to vendors have evolved from "what are your risk tools?" to "how do I get better information to make decisions for the future?"


Mar 20

Regulatory compliance, loan covenants and currency risk are common targets, as organizations sift through ERP and other data looking for patterns that might give early warning.

As CFO of TIBCO Software Inc., Tom Berquist spends a lot of time working on risks, such as the failure to live up to loan covenants. Berquist uses risk analytics software to stay on top of things.

"As a private equity-backed company -- we're owned by Vista Equity Partners -- we carry a large amount of debt," he said. "We have covenants associated with that and they're tied to a number of our financial metrics." Consequently, a major part of Berquist's risk-management process is to stay in front of what's going on with the business. If there's going to be softness in TIBCO's top-line revenue, he has to make sure to manage the company's cost structure so it doesn't violate any of the covenants. Berquist said he has a lot of risk analytics tied to that business problem.

The intent of risk analytics is to give CFOs and others in the C-suite a complete, up-to-date risk profile "as of now," said Thomas Frénéhard, director of solution management, governance, risk and compliance at software vendor SAP.

"There's no need to wait for people to compile information at the end of the quarter and send you [information] that's outdated," Frénéhard said. "What CFOs want now is their financial exposure today."

Looking for patterns in corporate data

Risk analytics involves the use of data analysis to obtain insights into various risks in financial, operational and business processes, as well as to monitor risks in ways that can't be achieved through more traditional approaches to risk management, financial controls and compliance management, said John Verver, a strategic advisor to ACL Services, a maker of governance, risk and compliance software based in Vancouver, B.C.

Some of the most common uses of risk analytics are in core financial processes and core ERP application areas, including the purchase-to-pay and order-to-cash cycles, revenue and payroll -- "analyzing and testing the detailed transactions, for example, to look for indications of fraud [and] indications of noncompliance with regulatory requirements and controls," Verver said.


Using advanced risk management -- i.e., risk analytics software -- will allow CFOs to access data from complex systems, including ERP environments, and easily identify key areas of risk, said Dan Zitting, chief product officer of ACL Services.

"The technology can be set up to pull data from the HR, sales and billing departments, for example, and cross-reference the information within the program's interface," Zitting said in an email. "Once the data is in one place, CFOs should be able to easily visualize the data in a risk dashboard that summarizes activity and flags changes in risk."

Berquist also uses risk analytics to manage foreign currency risk for TIBCO, which is an international company, as well as risks connected to managing cash.

"Every month I close the books, I get all my actuals and I export them all into my data warehouse and I load up my dashboards. I happen to use TIBCO Spotfire [business intelligence software], but you can load them up in any risk analytics tool," he said. "Then I review where we stand on everything that has happened so far. Are expenses in line? Where does our revenue stand? What happened with currency? What happened with cash? How does the balance sheet look? That's the first part of the problem."

The second part is forecasting what will happen with TIBCO's expenses, which helps Berquist ensure that the company is going to generate sufficient cash to avoid violating covenants and mitigate the effects of offshore currency fluctuations.

Berquist said there are general-purpose risk management technologies, some of which are tied to such things as identifying corporate fraud, but there is also company- or industry-specific risk analytics software.

"My big concern is financial risk, so most of my [use of risk analytics] is around those types of measures," he said.

Risk analytics software helps CFOs make better decisions for the future because without an approach that allows them to run different scenarios and determine potential outcomes, they end up making gut instinct-oriented or seat-of-the-pants decisions, according to Berquist.
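One simple way to move beyond gut instinct is to simulate a range of scenarios and look at the distribution of outcomes. The sketch below runs a rough Monte Carlo over currency moves; the exchange rate, volatility and revenue figures are invented for illustration, not drawn from any company cited here.

```python
import random
import statistics

random.seed(7)

EUR_REVENUE = 40.0      # hypothetical quarterly revenue booked in EUR, millions
BASE_RATE = 1.10        # assumed EUR/USD spot rate
QUARTERLY_VOL = 0.04    # assumed standard deviation of the quarterly FX move

def one_scenario() -> float:
    """USD value of the EUR revenue under one simulated exchange-rate move."""
    simulated_rate = BASE_RATE * (1 + random.gauss(0, QUARTERLY_VOL))
    return EUR_REVENUE * simulated_rate

outcomes = sorted(one_scenario() for _ in range(10_000))
downside = outcomes[int(0.05 * len(outcomes))]   # 5th-percentile outcome
print(f"median ${statistics.median(outcomes):.1f}M, 5% worst case ${downside:.1f}M")
```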

Sharing a similar view is Tom Kimner, head of global product marketing and operations for risk management at SAS Institute Inc., a provider of analytics software, based in Cary, N.C.

"What makes risk analytics a little bit different, in some cases, is that risk generally deals with the future and uncertainty," Kimner said.

Cristina Silingardi, a former CFO and treasurer at HamaTech USA Inc., a manufacturer of equipment for the semiconductor industry, concurred with Berquist that risk assessment can no longer be done the way it used to be -- based on individuals' knowledge of their businesses, their instincts and a few key data points.

"There is so much data right now, and the biggest change I see is that now this data encompasses structured internal company data as well as unstructured external data," said Silingardi, now managing director of vcfo Holdings, a consulting firm based in Austin, Texas, that specializes in finance, recruiting and human resources.

CFOs started getting more involved with risk analytics when they needed better revenue metrics to understand predictability and trends, she said. Risk analytics software went beyond traditional risk-management tools by adding real-time reporting that puts key metrics right in front of CFOs and updates them all day long. Such data can help CFOs keep an eye on regulatory and contractual noncompliance from vendors, according to Silingardi.

"It helps them with pattern recognition, but only if [they] can translate that to really good visual dashboards that are looking at this data. [CFOs] used to focus only on a few things. Now, [they're] using all this data to get a much better picture," she said.

Forward-thinking mindset is key

Historically, risk analysis and assessment have tended to be reactive and subjective processes, according to Daniel Smith, director of data science and innovation at Syntelli Solutions Inc., a data analytics company based in Charlotte, N.C. After something bad happens, the tendency is for people to say, "Let's investigate it," or, "Let's all huddle up and think about what could happen and create a bunch of speculative scenarios," he said.

That's exactly the way many of SAP's customers still look at risk: through the rear-view mirror, said Bruce McCuaig, director of governance, risk and compliance solution marketing at SAP.

"Once or twice a year they report to the board and they look backwards, but what I think we're seeing now is the ability to look forward and report frequently online and in real time," McCuaig said.

In modern analytics and modern business, companies want to focus more on proactive, predictive and objective risk, Smith said. While focusing on risk in this manner gives CFOs visibility into the future, many don't have the pipeline of data and a single source of consolidated data to enable them to do that.

"They need a system, a way to collect that data and be able to analyze it," he said. "From a strategic point of view, it's more of a data initiative."

The goal is to give people the skills and applications to view highly interactive and multidimensional data as opposed to a traditional, two-dimensional tabular view in a spreadsheet, Smith said.

When it comes to risk analytics, CFOs should be thinking about techniques, not specific tools. Risk analysis is more about understanding ways to mine data better than about which platform can do it, according to Smith.

"Risk analytics is part of something larger. At SAP, we don't have a category of solutions called 'risk analytics,'" McCuaig said. "There are a variety of analytics tools that will serve the purpose."


Mar 20
Risk mapping key to security, business integration
Posted by Thang Le Toan on 20 March 2018 12:43 AM

It’s no secret that data protection has become integral to bottom line success for digital businesses. As a result, it’s time for InfoSec professionals to crawl out of their caves and start communicating with the rest of the business, Tom Kartanowicz, head of information security at Natixis, North America, told the audience at the recent CDM Media CISO Summit.

To facilitate this communication, the language these pros will use is the language of security risk, Kartanowicz said.

“As security professionals, if we want to be taken seriously we need to put what we do into the risk lens to talk to the business so they understand the impact and how we’re trying to reduce the impact of the types of threats we’re seeing,” Kartanowicz said.

For example, even though the chief information security officer and chief risk officer may appear to be two different islands in an organization, they are part of the same team, he reminded the audience.

 

Business is the bridge that links them together, so instead of working in silos, security professionals should carve out what Kartanowicz calls a “friends and family plan” that forms alliances with other departments in their organization. The human resources department can help discipline somebody who might be an internal threat, corporate communications can help talk to the media and customers when there are incidents like DDoS and malware attacks, and the legal department can be a valuable ally when it is time to take action against bad actors, he explained.

“As the CISO or as the head of InfoSec, you are missing out on a lot of valuable intelligence if you are not talking to all these different teams,” he stressed.

Risk mapping — a data visualization tool that outlines an organization’s specific risks — is an effective way to identify threats and vulnerabilities and then communicate them to the business, he said. Risk mapping helps an organization identify where it’s going to spend its security budget and how to implement solutions and, most importantly, helps identify specific instances of risk reduction, he said.

Kartanowicz said there are two things to consider when evaluating and determining the likelihood of a risk: how easy it is to exploit and how often it occurs.

“If the vulnerabilities require technical skills held by 1% of the population, it’s going to be pretty difficult to exploit,” he said. “If on the other hand, anybody on the street can exploit it, it’s going to be pretty easy.”

It is then time to address the specific risks, he said.

“In the enterprise risk management world, the business can accept the risks, avoid the risks or [work to] mitigate the risks — this is where InfoSec comes in — or transfer the risks,” he said.

Using tools such as the NIST cybersecurity framework can help InfoSec reduce the risks, he said. It’s important that organizations tie their disaster recovery, backup strategy, business continuity and crisis management into whatever framework they choose, he added. Organizations should also ensure they have baseline controls in place to help minimize the risk of a data breach, he added.

But as threats evolve and vulnerabilities change, he suggested that the risk map be re-evaluated annually. Business requirements are constantly evolving and organizations are always entering different markets, so companies need to be constantly aware of the threat landscape, he added.

“Incidents will always occur; risk is not going away,” he said.


Mar 20
risk map (risk heat map)
Posted by Thang Le Toan on 20 March 2018 12:42 AM

A risk map, also known as a risk heat map, is a data visualization tool for communicating specific risks an organization faces. A risk map helps companies identify and prioritize the risks associated with their business.

The goal of a risk map is to improve an organization's understanding of its risk profile and appetite, clarify thinking on the nature and impact of risks, and improve the organization's risk assessment model. In the enterprise, a risk map is often presented as a two-dimensional matrix. For example, the likelihood a risk will occur may be plotted on the x-axis, while the impact of the same risk is plotted on the y-axis.

A risk map is considered a critical component of enterprise risk management because it helps identify risks that need more attention. Identified risks that fall in the high-frequency and high-severity section can then be made a priority by organizations. If the organization is dispersed geographically and certain risks are associated with certain geographical areas, risks might be illustrated with a heat map, using color to illustrate the levels of risk to which individual branch offices are exposed.

Figure: A risk matrix that includes natural disasters and human risk factors.
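As an illustration of the two-dimensional matrix described above, the short sketch below plots a handful of invented risks by likelihood and impact; real risk registers and scoring scales will differ.

```python
import matplotlib.pyplot as plt

# Hypothetical risks scored on 1-5 scales for likelihood and impact.
risks = {
    "Data breach": (4, 5),
    "Key supplier failure": (3, 4),
    "Flood at branch office": (2, 4),
    "Regulatory fine": (2, 3),
}

fig, ax = plt.subplots()
for name, (likelihood, impact) in risks.items():
    ax.scatter(likelihood, impact, s=120)
    ax.annotate(name, (likelihood, impact), xytext=(5, 5), textcoords="offset points")

ax.set_xlim(0, 5.5)
ax.set_ylim(0, 5.5)
ax.set_xlabel("Likelihood (1-5)")
ax.set_ylabel("Impact (1-5)")
ax.set_title("Risk heat map (illustrative)")
plt.show()
```

A production heat map would typically shade the quadrants by severity, but the likelihood and impact axes are the essential structure.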

How to create a risk map

Identification of inherent risks is the first step in creating a risk map. Risks can be broadly categorized into strategic risk, compliance risk, operational risk, financial risk and reputational risk, but organizations should aim to chart their own lists by taking into consideration specific factors that might affect them financially. Once the risks have been identified, it is necessary to understand what kind of internal or external events are driving the risks.

The next step in risk mapping is evaluating the risks: estimating the frequency, the potential impact and possible control processes to offset the risks. The risks should then be prioritized. The most impactful risks can be managed by applying control processes to help lessen their potential occurrence.
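One rough way to carry out the evaluation and prioritization steps is to score each identified risk on frequency and impact and rank by the product, as in this hypothetical sketch (the register entries and 1-5 scales are invented):

```python
# Hypothetical risk register: frequency and impact scored on 1-5 scales.
risk_register = [
    {"risk": "Data breach",          "frequency": 4, "impact": 5},
    {"risk": "Regulatory fine",      "frequency": 2, "impact": 3},
    {"risk": "Key supplier failure", "frequency": 3, "impact": 4},
]

for entry in risk_register:
    entry["score"] = entry["frequency"] * entry["impact"]

# Highest scores first: these are the risks to tackle with control processes.
for entry in sorted(risk_register, key=lambda e: e["score"], reverse=True):
    print(f'{entry["risk"]:<25} score={entry["score"]}')
```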

As threats evolve and vulnerabilities change, a risk map must be re-evaluated periodically. Organizations also must review their risk maps regularly to ensure key risks are being managed effectively.

Why it's important to create a risk map

A risk map offers a visualized, comprehensive view of the likelihood and impact of an organization's risks. This helps the organization improve risk management and risk governance by prioritizing risk management efforts. This risk prioritization enables it to focus time and money on the most potentially damaging risks identified in a heat map chart.

A risk map also facilitates interdepartmental dialogues about an organization's inherent risks and promotes communication about risks throughout the organization. It helps organizations visualize risks in relation to each other, and it guides the development of a control assessment of how to deal with the risks and the consequence of those risks.

The map can help the company visualize how risks in one part of the organization can affect operations of another business unit within the organization.


A risk map also adds precision to an organization's risk assessment strategy and identifies gaps in an organization's risk management processes.

 



Nov 10
IBM Cloud Private pulls from Big Blue's roots
Posted by Thang Le Toan on 10 November 2017 02:04 AM

IBM sticks close to its roots with IBM Cloud Private, which taps Big Blue's enterprise and middleware strengths to move customers from the data center to private cloud.

Despite continually working to reinvent itself, IBM never strays far from its roots, as evidenced by its move to bring cloud-native technology to the enterprise data center to accelerate digital transformation efforts.

Earlier last week, IBM launched IBM Cloud Private, which enables enterprises to bring modern development technologies such as containers, microservices and APIs -- all attributes of public cloud environments -- to private clouds in the data center, where IBM has tenure as a leading technology provider.

Big Blue dominant in the data center

IBM has long held a dominant position in the data center, with its mainframe, database and middleware technology. Now, the company is launching off that base to help its enterprise customers in regulated industries or that have sensitive data -- such as healthcare, government and finance -- gain the benefits of cloud-native computing development tools and processes, portability and integration.

"As part of its private cloud offering, IBM's been enhancing its developer services in the form of an integrated DevOps tool chain via a service catalog featuring a range of runtimes, development frameworks, tools, middleware, OSS and other services," Charlotte Dunlap, an analyst with GlobalData, said. "This plays into IBM's intent to provide developers with the tools, languages and frameworks they're accustomed to using, e.g., extending services to Node.js or Swift developers."

Indeed, the new offering provides developers with access to a variety of management and DevOps tools, including application performance management, Netcool, UrbanCode and Cloud Brokerage. It also includes support for popular tools such as Jenkins, Prometheus, Grafana, and ElasticSearch.

Kubernetes at its core

Yet it all starts with the Kubernetes container orchestration platform at the core; IBM Cloud Private also supports both Docker and Cloud Foundry.

Steve Robinson, general manager of IBM Hybrid Cloud, said that after several entries into the private cloud space with offerings such as Bluemix Local and others, Big Blue "took a clean sheet of paper and took a look at modern development technologies" and decided to base IBM Cloud Private on Kubernetes. "Then, we decided to bring our DevOps stack and middleware stack forward," he said.

IBM introduced container-optimized versions of its core middleware -- IBM WebSphere Liberty, Db2 and MQ messaging middleware -- to complement the new product.
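Because the platform is Kubernetes at its core, one of those containerized middleware images can be described and deployed like any other Kubernetes workload. The sketch below uses the official Kubernetes Python client against a generic cluster; the image tag, names and replica count are placeholders for illustration, not IBM Cloud Private specifics.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes kubectl is already configured for the target cluster

container = client.V1Container(
    name="liberty",
    image="websphere-liberty:latest",  # placeholder image tag
    ports=[client.V1ContainerPort(container_port=9080)],
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="liberty-demo"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "liberty-demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "liberty-demo"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Create the deployment; Kubernetes then schedules and scales the containers.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```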

 

Positioning vs. competition

 

Meanwhile, some observers view IBM Cloud Private as IBM's answer to competing offerings such as Microsoft Azure Stack, which provides similar on-premises capabilities. However, IBM said that its strength in middleware and its foundation in enterprise systems set it apart.

 

"This better positions IBM against primary rivals which are Microsoft Azure Stack and VMware/Pivotal, with a cloud strategy that has evolved up the stack from [infrastructure as a service] to [platform as a service] and now to what they call 'enterprise transformation' -- meaning more personalized customer engagement capabilities fulfilled through technologies supporting multi-cloud, cognitive and API, and blockchain," Dunlap said of the new product. "IBM says 71% of its customers today use three or more clouds including public, private and departmental. Private remains their largest customer opportunity with complex requirements and latency issues."

 

Based on its own data, IBM estimated that customers will spend more than $50 billion annually on private cloud infrastructure beginning in 2017 and growing at 15% to 20% each year through 2020.

 

Microsoft's one big advantage in the segment is being able to do both public and private cloud almost seamlessly, said Rob Enderle, an industry expert and founder of the Enderle Group.

 

"Recently, Cisco and Google partnered to provide the same capability, and now IBM is moving at the same opportunity," he said. "IBM, like Cisco, should be particularly strong on the on-premises side of this and their execution with SoftLayer has been very strong of late resulting in what should be a very competitive offering. This should expand the available market for IBM's now hybrid solution significantly."

 

In a statement, Tyler Best, CTO of car rental giant Hertz, said, "Private cloud is a must for many enterprises, such as ours, working to reduce or eliminate their dependence on internal data centers." He added that a strategy of public, private and hybrid cloud is "essential" for large enterprises transitioning from legacy systems to the cloud.

 

With such a big opportunity at stake, every cloud vendor is positioning itself to capture as much of the wave of enterprise interest in Kubernetes as possible onto its own platform, said Rhett Dillingham, an analyst at Moor Insights & Strategy. And with IBM Cloud Private, IBM is providing its Kubernetes-based platform for use on private infrastructure with the integrated value of its investment in complementary management and developer tooling.

 

"As part of this, IBM is offering new containerized versions of its software and development frameworks, because it has a big opportunity to help its existing software customers transition to cloud by modernizing their management of IBM WebSphere Liberty-, Db2- and MQ-based applications using containers via Kubernetes," Dillingham said. "This is a key opportunity for IBM in bridging from leading provider for traditional enterprise applications to leading provider for cloud-modernized and cloud-native applications on its IBM Cloud Private and IBM Public Cloud offerings."

Sticking to its knitting

So, with IBM Cloud Private, IBM is sticking to its knitting while helping to advance its enterprise customers with modern development tools.

"IBM Cloud Private extends the value of customers' existing IBM investments rather than being a new, on-premises cloud platform, like Microsoft's Azure Stack," said Charles King, principal analyst at Pund-IT.

The primary benefit of this offering is it enables enterprises to take advantage of the investments they've already made in existing systems, applications and data by bringing them into an elastic cloud platform.

"This will help accelerate application development, more easily expose these applications to new public cloud services and even provide the option of moving applications to the public cloud," said Michael Elder, distinguished engineer for the IBM Cloud Private platform. "We also think it sets an enterprise up with a powerful new tool for workload portability from their datacenter to the public cloud."

The platform provides tools to help bootstrap new applications into containers and enable existing applications for the cloud, he noted.

"We also build IBM Microservice Builder into the platform, which offers preconfigured Jenkins CI service build container images and publishes them to the built-in image registry right out of the box," Elder said.

The system also includes other management and security features, such as multi-cloud management automation, a security vulnerability advisor, data encryption and privileged access, and more.

Moreover, IBM Cloud Private supports Intel-based hardware from Cisco, Dell EMC, Lenovo and NetApp, and it can be deployed via VMware, Canonical and other OpenStack distributions.

 





Aug 20
Docker-supported OS list expands with Enterprise Edition update
Posted by Thang Le Toan on 20 August 2017 11:20 PM

Docker Enterprise Edition fired back at Kubernetes with new support for mixed clusters and applications, as well as advanced security features that target large enterprises.

Docker Enterprise Edition has strengthened its case for large IT buyers of container orchestration tools, with new OS support, security and policy-based automation features.

Docker-supported OS types now include IBM z Systems mainframe OSes and Microsoft Windows Server 2016, as well as mixed clusters and applications that run on mainframes, Windows and Linux. Fine-grained, role-based access control and policy-based automation for container images through a DevOps pipeline also are part of this August Docker Enterprise Edition release.

With the addition of these Docker-supported OS features, Windows and Linux containers, as well as mainframe-based ones, can share a cluster of hosts. With this release, mixed OS containers can also be stacked, using a newly developed overlay network, into hybrid applications that may mix, for example, Apache Tomcat servers with Microsoft SQL Server databases.
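With a mixed cluster, scheduling each tier onto the right operating system comes down to placement constraints. The sketch below uses the Docker SDK for Python against a swarm manager; the image names are illustrative stand-ins rather than a tested hybrid application.

```python
import docker

client = docker.from_env()  # assumes this host is a swarm manager

# Linux tier: an Apache Tomcat web front end pinned to Linux nodes.
client.services.create(
    "tomcat:9",                         # illustrative image
    name="web-tier",
    constraints=["node.platform.os == linux"],
)

# Windows tier: a SQL Server back end pinned to Windows nodes.
client.services.create(
    "example.com/sqlserver-windows:2016",  # hypothetical Windows container image
    name="db-tier",
    constraints=["node.platform.os == windows"],
)
```

The same placement-constraint mechanism is what lets Windows, Linux and mainframe services share a single cluster while still landing on compatible hosts.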

This will be a key feature for enterprise IT shops that plan to move to container orchestration in the next year or two and use it to modernize legacy applications, said Chris Riley, director of solutions architecture at cPrime, an Agile software development consulting firm in Foster City, Calif.

"Deep container adoption within traditional enterprises is in its formative stages," Riley said. "The addition of z Systems and Windows [Server] native support will show benefits in the next couple of years, as companies upgrade their Windows infrastructure and coordinate that with their mainframe systems."

Mainstream enterprises aren't yet demanding hybrid clusters and applications, according to analysts. However, Docker officials have said HR software giant ADP -- one of the primary beta testers of this Docker Enterprise Edition release -- already mixes and matches Docker-supported OS workloads.

"Typically, these applications are managed separately, but as enterprises move to microservices and DevOps, the ability to manage applications with the same process, regardless of operating system, will be desirable," said Jay Lyman, analyst at 451 Research.

Enterprises also want to run hybrid cloud infrastructures; this portends a future in which such infrastructures are much more flexible and container portability means apps can run anywhere. Docker seems attuned to this with the features it's chosen for this release, Lyman said.

Enterprises that want these abilities from Docker Enterprise Edition should be prepared to open their wallets. Some of the most advanced features introduced in the August 2017 release -- such as node-based security isolation for multi-tenant environments, policy-based container image promotion in DevOps pipelines and continuous security vulnerability scanning -- require Docker Enterprise Edition Advanced licenses, which are priced at $3,500 per node, per year. Advanced licenses also must be purchased separately for Windows and Linux servers.

The pricing makes it clear that Docker is going after "big fish" customers, Lyman said. "They're clearly looking to drive larger deal sizes, as is the Kubernetes community of vendors -- and that's driving intense competition, as well as innovation."

Kubernetes complexity makes IT shops look twice at Docker

The Docker Enterprise Edition update comes weeks after rival container orchestration platform Kubernetes made its appeal to enterprise IT shops with granular network security and stateful application support in June's version 1.7.

"These two are increasingly competing and evolving together," 451's Lyman noted. "To some extent, you see [the Kubernetes community and Docker] making moves responsive to what the other is doing."

Kubernetes and the many commercial container orchestration packages that bundle it for enterprises, such as CoreOS's Tectonic and Red Hat's OpenShift, boast reference customers that include Experian, Deutsche Bank, BMW and T-Systems. But big companies also came out in favor of Docker's container orchestration this year, from ADP to Hyatt Hotels and The Northern Trust Company. While Kubernetes was an early mover in the container orchestration space and is backed by the experience of web-scale companies such as Google, Docker has made advanced security features generally available in its products, while many in the Kubernetes community remain in beta.

For some enterprises, Docker swarm mode appeals in contrast to the reputation that Kubernetes has for management complexity. One such firm is Rosetta Stone, which has evaluated Docker swarm mode for its container orchestration against Kubernetes and concluded that Kubernetes would be "overkill" for its container orchestration needs.

"Each of our microservices is crazy simple -- just web apps," said Kevin Burnett, DevOps lead for the global education software company in Arlington, Va. "We want to use the simplest possible orchestration tool that supports our use case."

Docker container orchestration also appeals to enterprises because it comes from the same vendor that popularized Linux containers. Adding Docker swarm mode to Docker Engine means that much of Docker's container orchestration comes installed with the infrastructure Rosetta Stone already runs.

However, the company is not inclined to pay the price for the advanced features in Enterprise Edition, and it likely would adopt the open source Community Edition, Burnett said.

"The features they're adding in this release were not for customers like us, in my estimation," Burnett said. Rosetta Stone has some Windows infrastructure it acquired with another company, but is moving away from that and doesn't have mainframe workloads.

"The security stuff seems nice, but it doesn't seem like they've added major features and wouldn't tip the scales," Burnett said.

 





Aug 20
How did a Moodle security vulnerability enable remote code execution?
Posted by Thang Le Toan on 20 August 2017 11:11 PM

A series of logic flaws in Moodle enabled attackers to remotely execute code on servers. Expert Michael Cobb explains how the Moodle security vulnerability can be exploited.

A vulnerability found in Moodle, an open source, PHP-based learning management system used by tens of thousands of universities internationally, left servers and their data open to compromise. According to the researcher who discovered the issue, the Moodle security vulnerability is actually made up of several small flaws, and it can enable attackers to execute PHP code on related servers. What does this vulnerability entail, and what can be done about it?

Netanel Rubin, security researcher and CEO of Vaultra, found that by exploiting a series of minor vulnerabilities, he could chain them together to remotely execute code on a server running Moodle.

Moodle is an open source learning management system that stores a lot of sensitive information, like students' grades, tests and private data, making it an attractive target for hackers. The Moodle security vulnerability is tracked as CVE-2017-2641 and Moodle Tracker issue MDL-58010.

The attack works on almost all Moodle versions, so administrators should move to the latest version, version 3.2.2, to fix the problem as soon as possible. Besides updating to the latest version, administrators should also check for any new administrators, plug-ins or templates within Moodle, and search for any new files in the file system in case the server has been compromised.

The coding and logic flaws that contributed to this Moodle security vulnerability are a consequence of the size and complexity of the Moodle system; it contains thousands of files, hundreds of different components and around two million lines of PHP code, written and updated by various developers at different times.

A new function, update_user_preferences, was added to Moodle to replace the update_users function. It implemented a privilege check, so even if an attacker could change settings using user preferences, it would only work on their own privileges.

While the new function removed the possibility of changing every user attribute, the code failed to check which preference was being changed. The previous function used the setuserpref.php file to check that the preference that needed to be updated was listed in the ajax_updatable_user_prefs array, which defines the preferences that can be changed via Ajax to ensure no critical values can be altered.
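The essence of the missing check is an allow-list lookup before any preference is written. The sketch below is an illustration of that pattern in Python -- not Moodle's actual PHP code -- with invented preference names standing in for the real ajax_updatable_user_prefs entries.

```python
# Hypothetical allow-list playing the role of ajax_updatable_user_prefs.
AJAX_UPDATABLE_USER_PREFS = {"theme", "editor", "email_digest"}  # illustrative names

def update_user_preference(preferences: dict, name: str, value: str) -> None:
    # Reject anything not explicitly allow-listed so critical values
    # (site configuration, roles, passwords) can never be reached this way.
    if name not in AJAX_UPDATABLE_USER_PREFS:
        raise PermissionError(f"preference '{name}' may not be updated via Ajax")
    preferences[name] = value
```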

Ironically, in an attempt to reduce any potential abuse of the user attribute update function, the new privilege check actually introduced this Moodle security vulnerability. It's possible the developer thought that user preferences could not be exploited to mount a full-scale attack, as they only affect the graphical user interface part of the system.

However, the lack of containment enables an object injection attack to update any row in the entire database, such as administrator accounts, passwords and site configuration. Rubin discovered that this and other false assumptions made during code development could be leveraged to eventually execute PHP code on the server.

Logic flaws can and will occur in any system featuring a large code base, particularly when it's developed over a long period of time by a changing team of developers.

According to Steve McConnell, author of Code Complete, software projects that reach 512,000 lines of code or more can see four to 100 coding errors per thousand lines of code. A typical web application utilizes multiple languages, such as Java, HTML, PHP, Python, CSS, third-party libraries and components, and so on, and there are very few developers that know or understand how to use and integrate each of them without introducing any security vulnerabilities.

To reduce the chances of developers introducing logic flaws or omitting security and validation checks, it should be a requirement that they add a minimum level of in-code comments using an agreed-upon comment style, along with more verbose supporting documentation. Wikipedia has a comprehensive list of comment styles.

Although time spent on commenting and documenting code will slow down development, it will ensure developers making changes in the future can fully understand what a function does, how it does it and what checks are required on the data it handles. It is important that functions receiving data passed by other functions don't carry the assumption that the data has already been validated, as the previous function may have validated it against a different set of requirements or rules.

A good example is a telephone number. A function to retrieve and display a user's telephone number from a database may well accept + and () symbols, but if that function then passes the data to a function that actually calls the number, these characters could cause the function to fail if they are not removed before being processed.
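A minimal sketch of that hand-off, with a hypothetical dialing function: the display-oriented format is stripped down to digits before the number is passed along, rather than assuming the upstream function already validated it for this purpose.

```python
import re

def sanitize_for_dialing(display_number: str) -> str:
    """Strip formatting such as '+', '(' and ')' that a display function accepts."""
    return re.sub(r"\D", "", display_number)

def dial(number: str) -> None:          # hypothetical downstream function
    assert number.isdigit(), "dialer expects digits only"
    print(f"dialing {number}...")

dial(sanitize_for_dialing("+1 (555) 010-4477"))   # prints: dialing 15550104477...
```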




Aug 20
How can Google's CAPTCHA challenge be bypassed using Google tools?
Posted by Thang Le Toan on 20 August 2017 11:09 PM

The ReBreakCaptcha exploit can bypass Google's reCAPTCHA verification system using flaws in Google's own API. Expert Michael Cobb explains how the attack works.

Researchers at East-Ee Security demonstrated a proof-of-concept bypass of Google's reCAPTCHA V2 verification system that uses different image, audio or text prompts to verify that a person, as opposed to a bot, is attempting to log in. Their exploit technique, called ReBreakCaptcha, makes use of web-based Google tools to break through Google's system. What are the flaws in Google's API that make this attack possible? What is the threat of bots being able to bypass this measure?

 

A CAPTCHA, or a Completely Automated Public Turing Test to Tell Computers and Humans Apart, is used to protect forms on websites from being abused by bots and other nonhuman interactions, the idea being that it poses a test that humans can pass, but that an automated computer program can't.

CAPTCHA challenge tests include image and text challenges, as well as an audio test option to ensure that users with visual impairments can respond. ReCAPTCHA is a free CAPTCHA service provided by Google that enables developers to easily incorporate CAPTCHA functionality into a website.

A post on the East-Ee Security website explained how a proof-of-concept Python script could automate the breaking of reCAPTCHA challenges by using Google's Speech Recognition API.

The blog explains how to force a site to present an audio CAPTCHA challenge and then convert the audio to the correct WAV file format, before sending it to Google's Speech Recognition API. The API response is a string version of the correct answer that can then be used to answer the CAPTCHA challenge. The script automates the various tasks, and then answers the CAPTCHA in an acceptable period of time without any user intervention. However, according to an update from East-Ee, many users who downloaded the script complained that it failed to correctly solve harder CAPTCHA challenges.
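The pivotal step is ordinary speech-to-text. The fragment below shows just that conversion using the SpeechRecognition Python package's Google recognizer, assuming the audio challenge has already been saved as a WAV file; it is an illustration of the mechanism the researchers described, not their full script.

```python
import speech_recognition as sr  # the SpeechRecognition package

recognizer = sr.Recognizer()
with sr.AudioFile("challenge.wav") as source:   # hypothetical downloaded challenge file
    audio = recognizer.record(source)

# Sends the clip to Google's speech recognition service and returns a string,
# which the proof of concept then submits as the CAPTCHA answer.
transcript = recognizer.recognize_google(audio)
print(transcript)
```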

The script may work on a simple challenge, but if Google suspects a nonhuman interaction, or if the answer to a CAPTCHA comes from a public proxy or IP address that Google has flagged as suspicious, then the reCAPTCHA service presents the user with a harder version of the CAPTCHA challenge. The harder audio challenges include background noise and an overlapping voice.

In an apparent effort to patch the vulnerability, Google has also raised the minimum number of digits used in a challenge from four or five to between 10 and 12, and it immediately switches to more complex challenges when a high-volume attack is identified. Even an updated version of the attack doesn't appear to have fully overcome these harder challenges; some of the harder audio challenges are even difficult for humans to decipher due to the constant hissing noises and overlapping voices.

Attempts to beat Google's CAPTCHA have been published before -- by Stiltwalker in 2012 and AppSec Labs in 2016 -- and there are various paid-for services that offer to automate the process, like Captcha Solutions, but the success rate of these tools is not known.




