Latest Updates
Mar 28
Are computing devices drifting further and further away from Microsoft Windows?

Multi-user Windows 10 could replace RDSH, which isn't in the first Windows Server 2019 preview. The move has ramifications for app compatibility, XenApp and Windows licensing.

If RDSH will not be available in Windows Server 2019, IT pros are left wondering where that leaves their remote application delivery strategies.

The first preview of Windows Server 2019, which Microsoft released last week, does not include the Remote Desktop Session Host (RDSH) role. Several IT consultants and analysts said they expect Microsoft to allow for multi-user Windows 10 sessions as a replacement for server-based RDSH -- a possibility that drew mixed reactions.

 

"If they indeed take the session-hosted capabilities out of 2019 … it has to still work somewhere, and the only way to do that is with a multi-user thing -- that has to be from Microsoft," said Cláudio Rodrigues, CEO of WTSLabs consultancy in Nepean, Ont. "At the end of the day, it might be just re-skinning the cat. People will buy into that, I can bet."

Where's RDSH?

Microsoft declined to comment on the future of RDSH and the possibility of multi-user Windows 10. The company will disclose more information soon, and Remote Desktop Services (RDS) is "not gone," said Jeff Woolsey, principal program manager for Windows Server, on Twitter. RDS refers to the group of technologies that provide access to remote desktops and apps. RDSH is the component of RDS that allows multiple remote users to connect to session-based desktops and published apps.

It's still possible that Microsoft might make the RDSH role available when Windows Server 2019 becomes generally available in the second half of this year, but it's not likely, experts said. Microsoft had already removed the option for using a GUI to manage RDSH in Windows Server 2016 for the Semi-Annual Channel, leaving that capability available only to the Long-Term Servicing Channel (LTSC). Still, Microsoft could keep RDSH in Windows Server 2019 on the LTSC just for enterprise customers, said Jeff Wilhelm, CTO at Envision Technology Advisors, a solutions provider in Pawtucket, R.I. 

"There's going to have to be some point in time where Microsoft provides some clarity, not just from a technical perspective but also from a roadmap perspective," Wilhelm said. "I find it extremely hard to believe they would just unceremoniously cut that feature. I believe there will be a version that will allow RDS."

Even if Windows Server 2019 does not support RDSH, this issue isn't likely to affect a lot of organizations right away. Thirty-two percent of respondents in the TechTarget 2018 IT Priorities Survey said they're just moving to Windows Server 2016 this year, and 15% said they plan to move to Windows Server 2012.

"Probably 90% of the market will not even touch [Windows Server] 2019 from an application hosting perspective for minimum a year or two," Rodrigues said.

Potential for multi-user Windows 10

VDI takes a single-user Windows approach. There's one VM running a desktop operating system per user. Multi-user Windows 10 would instead allow multiple user sessions to run on one VM directly on the client OS.

If multi-user Windows 10 indeed sees the light of day, it would be a similar approach to RDSH on Windows Server -- which enables multiple user sessions to run on one server operating system -- so customers wouldn't experience a major change, experts said.

"It's semantics," Wilhelm said. "A multi-user OS is a server OS. There's a lot of similarities between the Windows Server kernel and the Windows [desktop] kernel."

If people are expecting Windows 10 multi-user to magically solve application issues, they are going to be disappointed.
Cláudio Rodrigues, CEO, WTSLabs

Desktops delivered from RDSH on shared Windows servers can help IT optimize resources and support more workers. Application compatibility can be an issue, however, if an app update prevents another app on the server from functioning properly, or if a legacy app isn't supported. Multi-user Windows 10 could address some of those compatibility issues, because applications would run directly on a Windows client VM. IT would also see similar benefits as far as resource optimization. But legacy apps could remain a problem if they're not supported on the latest version of Windows 10.

"A lot of those issues will still exist," Rodrigues said. "If people are expecting Windows 10 multi-user to magically solve application issues, they are going to be disappointed."

Plus, a lot of apps are built to detect when they're connected to RDSH and Windows Server, to help ensure they can work with the server OS. It would be critical for Microsoft to allow applications to do something similar with Windows 10, Rodrigues said.
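
To make that concrete, the sketch below shows one way an application might perform such an environment check on Windows, using the documented SM_REMOTESESSION metric and the InstallationType registry value. The decision logic is purely illustrative and is not taken from any particular vendor's product; it assumes a Windows host and Python 3 with only the standard library.

```python
# Minimal sketch of an app-side environment check: am I in a remote session,
# and is the underlying OS a client or server edition?
import ctypes
import winreg

SM_REMOTESESSION = 0x1000  # GetSystemMetrics index: process runs inside a remote (RDP/RDSH) session

def is_remote_session() -> bool:
    # Nonzero when the process is running inside a remote session.
    return bool(ctypes.windll.user32.GetSystemMetrics(SM_REMOTESESSION))

def installation_type() -> str:
    # "Client" on desktop Windows, "Server" on Windows Server editions.
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        r"SOFTWARE\Microsoft\Windows NT\CurrentVersion") as key:
        value, _ = winreg.QueryValueEx(key, "InstallationType")
    return value

if __name__ == "__main__":
    print("Remote session:", is_remote_session())
    print("Installation type:", installation_type())
```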

In the past, Microsoft licensing restrictions prevented Windows shops from running remote multi-user desktop and application sessions directly from the client OS. In July 2017, Microsoft changed its rules and allowed virtualization rights for Windows 10 VMs on Azure. Some observers pointed to that shift as a sign that Microsoft is trying to make multi-user Windows 10 remote sessions possible.

Another option for Microsoft is instead moving customers to Remote Desktop modern infrastructure, which offers RDS running as a service on Azure.

"Most companies now with virtualized apps, they are cloud based," said Jack Gold, founder of J. Gold Associates LLC, a mobile analyst firm in Northborough, Mass. "Microsoft is probably seeing less demand for RDSH on local servers, and they are also trying to push their customers to use Azure."

Multi-user Windows 10 questions remain

Citrix XenApp runs on RDSH, so it's possible Citrix shops could see some changes with multi-user Windows 10 as well.

"Does this mean XenApp will die and XenDesktop will take over?" said James Rankin, solutions architect at Howell Technology Group, an IT consultancy in the U.K. "From a Citrix perspective, it might be a boon to them."


Citrix could offer a multi-user XenDesktop capability that works the same way XenApp did, for example, but organizations would need to test that a multi-user desktop behaves with its applications the same way it did with the server-based capability, Rodrigues said.

How to license multi-user Windows 10 also remains a question.

"Would you pay more if you activated the multi-user version?" Rankin said. "Or is it simply part of Enterprise [licensing]?"

Microsoft could use the existing RDS licensing server infrastructure and change the pricing and naming to create a model for multi-user client OS sessions, Rodrigues said.

"It's a chance for them to simplify and in a way unify the licensing message," he said.

That could help address IT's calls for improved Microsoft licensing over the years, Gold said.

"Customers are saying, 'Look, we just can't deal with this anymore,'" he said.





Mar 20
Sage adds Intacct financial management software to its ERP
Posted by Thang Le Toan on 20 March 2018 12:52 AM

Sage says the move will boost its cloud financial management software and U.S. presence. Analysts think it's a good technology move but are unsure about the market impact.

Sage Software intends to expand both its cloud offerings and its customer base in North America.

Sage, an ERP vendor based in Newcastle upon Tyne, U.K., is acquiring Intacct, a San Jose-based vendor of financial management software, for $850 million, according to the company.

Sage's core products include the Sage X3 ERP system, the Sage One accounting and invoicing application and Sage Live real-time accounting software. The company's products are aimed primarily at SMBs, and Sage claims that it has just over 6 million users worldwide, with the majority of these in Europe.

Intacct provides SaaS financial management software to SMBs, with most of its customer base in North America, according to the company.

The move to acquire Intacct demonstrates Sage's determination to "win the cloud" and expand its U.S. customer base, according to a Sage press release announcing the deal.

"Today we take another major step forward in delivering our strategy and we are thrilled to welcome Intacct into the Sage family," Stephen Kelly, Sage CEO, said in the press release. "The acquisition of Intacct supports our ambitions for accelerating growth by winning new customers at scale and builds on our other cloud-first acquisitions, strengthening the Sage Business Cloud. Intacct opens up huge opportunities in the North American market, representing over half of our total addressable market."

Combining forces makes sense for Intacct because the company shares the same goals as Sage, according to Intacct CEO Robert Reid.

"We are excited to become part of Sage because we are relentlessly focused on the same goal -- to deliver the most innovative cloud solutions for our customers," Reid said in the press release. "Intacct is growing rapidly in our market and we are proud to be a recognized customer satisfaction leader across midsize, large and global enterprise businesses. By combining our strengths with those of Sage, we can jointly accelerate success for our customers."

Intacct brings real cloud DNA to financial management software

Intacct's specialty in cloud financial management software should complement Sage's relatively weak financial functionality, according to Cindy Jutras, president of the ERP consulting firm Mint Jutras.

"[Intacct] certainly brings real cloud DNA, and a financial management solution that would be a lot harder to grow out of than the solutions they had under the Sage One brand," Jutras said. "It also has stronger accounting than would be embedded within Sage X3. I would expect X3 to still be the go-to solution for midsize manufacturers since that was never Intacct's target, but Intacct may very well become the go-to ERP for service companies, like professional services."

Jutras also mentioned that Intacct was one of the first applications to address the new ASC 606 revenue recognition rules, something that Sage has not done yet. Sage's cloud strategy has been murky up to this point, and Jutras was unsure whether this move would clarify it.

"It doesn't seem any of its existing products -- except their new Sage Live developed on the Salesforce platform -- are multi-tenant SaaS and up until recently they seemed to be going the hybrid route by leaving ERP on premises and surrounding it with cloud services," she said.

The deal should strengthen Sage's position in the SMB market, according to Chris Devault, manager of software selection at Panorama Consulting Solutions.

"This is a very good move for Sage, as it will bring a different platform and much needed technology to help Sage round out their small to mid-market offerings," Devault said.

Getting into the U.S. market

Overall it appears to be a positive move for Sage, both from a technology and market perspective, according to Holger Mueller, vice president and principal analyst at Constellation Research Inc.

"It's a good move by Sage to finally tackle finance in the cloud and get more exposure to the largest software market in the world, the U.S.," Mueller said. "But we see more than finance moving to the cloud, as customers are starting to look for or demand a complete suite to be available on the same platform. Sage will have to move fast to integrate Intacct and get to a compelling cloud suite roadmap."

Time will also tell if this move will position Sage better in the SMB ERP landscape.

"It's early to say, but it puts them in the SMB category with Oracle NetSuite, FinancialForce, Epicor and Acumatica at the lower end," Mueller said.






Mar 20
The risk analytics software your company really needs
Posted by Thang Le Toan on 20 March 2018 12:49 AM

Risk analytics tools are more and more critical for CFOs seeking to improve operational efficiency. Just one problem: It can be hard to figure out just what those tools are.

As the use of big data becomes a documented phenomenon across corporate America, risk analytics -- that is, collecting, analyzing and measuring real-time data to predict risk and make better business decisions -- is also becoming more popular.

That's according to Sanjaya Krishna, U.S. digital risk consulting leader at KPMG in Washington, D.C.

By using risk analytics software, CFOs can improve operational efficiency and keep their companies' risk exposure at acceptable levels. But where exactly does a CFO go to "get" risk analytics tools?

The search for risk analytics software

"Risk analytics is a fairly broad term, so there are a number of things that come to mind when we talk about risk analytics," Krishna said. "There are a number of specialized risk analytics products. There are also broader analytic packages that can … 'check the risk analytics box' to a certain extent, though the package isn't built to be a risk analytics solution."

There are products, such as KPMG Risk Front, that focus on providing customized risk analytics based on public internet commentary, Krishna said. And KPMG's Continuous Monitoring product provides for customized risk analytics based on internal transactional data.

Enterprises should consider a solution that takes these differences into account, making sure that a dashboard can become detailed and granular, while also offering a 50,000-foot view.
Rajiv Shah, senior solutions architect, GigaSpaces Technologies Inc.

There are also a number of established enterprise governance, risk and compliance packages that provide companies a way of housing and analyzing all sorts of identified risks at the enterprise level or within certain business areas, he said.

Finally, there are highly specialized, industry-specific risk analytic tools, especially in the financial services industry, according to Krishna.

Risk analytics tools, regardless of the industry, have been around for a while, said Danny Baker, vice president of market strategy, financial and risk management solutions at Fiserv Inc., a provider of financial services technology based in Brookfield, Wis.

"They have historically been purposed for less strategic items -- they were seen as just a checkbox to please the regulators," he said.

Now, though, risk analytics software has transitioned and evolved from tactical, point solutions to helping organizations optimize their strategic futures.

"Especially for banks and credit unions, risk analytics tools are focused more on strategy and the need to integrate with other departments, like finance," Baker said. "The integration across departments is key."

But it's not just the tools that are important.

Sometimes a company may even use a database as a risk analytics tool, said Ken Krupa, enterprise CTO at MarkLogic Corp., an Enterprise NoSQL database provider in San Carlos, Calif.

Taking the broad approach to the data quality issue

"There are, indeed, specialized products, as well as packages that play a role in risk analytics," Krupa said. "These third-party suites of tools do a lot of the math on where there are risks, but if the math is based on bad or incomplete data, risk cannot be adequately addressed."

What's more, oftentimes, a company doesn't have a clear picture of the quality of the data that it's working with because making that data available from upstream systems depends on complex extract, transform and load (ETL) processes supported by a large team of developers of varying skill sets, he said.

Therefore, there's actually an inherent risk in not having transparent access to a 360-degree view of the data -- mainly caused by data in silos. However, leveraging a database that can successfully integrate the many silos of data can go a long way toward minimizing data quality risks, according to Krupa.
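
As a minimal illustration of that point, the sketch below (using pandas, with invented table and column names) joins records from two hypothetical silos and flags the rows that are incomplete across them -- the kind of data-quality gap that undermines downstream risk math.

```python
# Hypothetical example: reconcile two silos before running risk analytics,
# and treat records that are missing from either side as a risk signal.
import pandas as pd

billing = pd.DataFrame({"customer_id": [1, 2, 3, 4],
                        "exposure":    [120_000, 45_000, 310_000, 87_000]})
crm     = pd.DataFrame({"customer_id": [1, 2, 4, 5],
                        "credit_rating": ["A", "B", "BB", "C"]})

# Outer-join the silos and record where each row came from.
merged = billing.merge(crm, on="customer_id", how="outer", indicator=True)

# Rows present in only one silo are a data-quality risk in their own right.
gaps = merged[merged["_merge"] != "both"]
print(f"{len(gaps)} of {len(merged)} records are incomplete across silos")
print(gaps)
```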

"You may not initially think of a database as a risk analytics tool, but the right kind of database serves a critical role in organizing all of the inputs that risk analytics tools use," he said. "The right type of database -- one that minimizes ETL dependency and provides a clear view of all kinds of data, like that offered by MarkLogic -- can make risk analytics better, faster and with less cost."

Anand Venugopal, head of StreamAnalytix product management and go-to-market at Impetus Technologies Inc. in Los Gatos, Calif., concurred with Krupa that bringing all a company's data into one place is critical to enabling better risk-based business decisions.

Since many organizations are in the process of modernizing their infrastructures -- particularly around analytics platforms -- they are moving away from point solutions if they can, he said.

The new paradigm is bringing together all the relevant information -- if not in one place, then at least with the mechanisms to bring it together on demand -- and then doing the analytics in one place, Venugopal said.

"So, what is beyond proven is that analytics and decision-making [are] more accurate not with more advanced algorithms, but with more data, i.e., diverse data, and more data sources, i.e., 25 different data sources as opposed to five different data sources," he said.

It all points to the fact that even with moderate algorithms, more data gives organizations better results than trying to use "rocket science algorithms" with limited data, Venugopal said.

"What that means to enterprise technology is that they are building risk platforms on top of the modern data warehouses, which combines a variety of internal and external data sets, and trying to combine real-time data feeds -- real-time triggers, real-time market factors, currency risk, etc. -- which was not part of the previous generation's capabilities," he said.

Single-point products can only address limited portions of this because that's how they're designed; enterprise risk can only be covered with a broader approach, according to Venugopal.

"I think the trend [in enterprises] is more toward building sophisticated risk strategies and applications, and they're building out those and they're using core big data technology components like the Hadoop stack, like the Spark stack and tools like Impetus' Extreme Analytics," he said.

Custom risk analytics software and other considerations

Organizations looking to implement technology to mitigate risk have to consider a few additional things, including the usability and feature set, according to Rajiv Shah, senior solutions architect for GigaSpaces Technologies Inc. in New York City.


"For instance, high-volume traders need a solution that won't interfere with the data sync that is critical to being up to the microsecond," he said.

A product that offers multilevel dashboarding is also key, according to Shah.

For example, the data a CFO needs to know is far different than, say, what a risk or compliance officer needs to know, he said.

"Enterprises should consider a solution that takes these differences into account, making sure that a dashboard can become detailed and granular, while also offering a 50,000-foot view," Shah said. "And a strong risk mitigation strategy and tool set should be able to identify and simulate a wide range of scenarios."

According to Fiserv's Baker, it's important that a risk mitigation technology doesn't hinder a company's regular operations.

"For larger organizations, it often becomes critical to build your own solution to meet the needs," he said.

Mike Juchno, partner at Ernst & Young Advisory Services, agreed that there is a custom tool component to risk analytics.

"Many of our clients already have these tools -- they're some sort of predictive analytics tool like SPSS, like SAS, like R, and some visualization on top of them, like Tableau or Power BI," he said. "So, we are able to build something custom to deal with a risk that may be unique to them or their industry or their particular situation. So, we typically find that it's a custom approach."

When it comes to looking for an off-the-shelf product, CFOs often hear about risk analytics tools from their peer-to-peer organizations. These groups come together to share information about tools.

"Of course, you're going to also look toward other companies or competitors that are doing risk management and performance management well and see what tools they have in place," Baker said. "The most high-performing clients I see embed their tools into not only solving current risk, but also expecting and forecasting future risk."

Although an organization can go to Fiserv and ask for a menu of risk analytics tools, it's more successful if both the company and Fiserv drill down into what the organization is trying to accomplish and customize the tools from there, according to Baker.

Most organizations want to make better strategic decisions, as the challenges of growth are greater now, and to improve their forward-looking strategic discipline and processes, he said.

The focus has shifted to agility and efficiency when implementing risk analytics tools, Baker said.

"The high-performing Fiserv clients I work with have integrated risk analytics tools into finance operations," he said. "These advanced solutions offer an integrative solution that also forecasts and plans for the strategic future."

Organizations are increasingly being thoughtful with their risk processes, he said. And in recent years questions to vendors have evolved from "what are your risk tools?" to "how do I get better information to make decisions for the future?"





Mar 20

Regulatory compliance, loan covenants and currency risk are common targets, as organizations sift through ERP and other data looking for patterns that might give early warning.

As CFO of TIBCO Software Inc., Tom Berquist spends a lot of time working on risks, such as the failure to live up to loan covenants. Berquist uses risk analytics software to stay on top of things.

"As a private equity-backed company -- we're owned by Vista Equity Partners -- we carry a large amount of debt," he said. "We have covenants associated with that and they're tied to a number of our financial metrics." Consequently, a major part of Berquist's risk-management process is to stay in front of what's going on with the business. If there's going to be softness in TIBCO's top-line revenue, he has to make sure to manage the company's cost structure so it doesn't violate any of the covenants. Berquist said he has a lot of risk analytics tied to that business problem.

The intent of risk analytics is to give CFOs and others in the C-suite a complete, up-to-date risk profile "as of now," said Thomas Frénéhard, director of solution management, governance, risk and compliance at software vendor SAP.

"There's no need to wait for people to compile information at the end of the quarter and send you [information] that's outdated," Frénéhard said. "What CFOs want now is their financial exposure today."

Looking for patterns in corporate data

Risk analytics involves the use of data analysis to obtain insights into various risks in financial, operational and business processes, as well as to monitor risks in ways that can't be achieved through more traditional approaches to risk management, financial controls and compliance management, said John Verver, a strategic advisor to ACL Services, a maker of governance, risk and compliance software based in Vancouver, B.C.

Some of the most common uses of risk analytics are in core financial processes and core ERP application areas, including the purchase-to-pay and order-to-cash cycles, revenue and payroll -- "analyzing and testing the detailed transactions, for example, to look for indications of fraud [and] indications of noncompliance with regulatory requirements and controls," Verver said.

Once the data is in one place, CFOs should be able to easily visualize the data in a risk dashboard.
Dan Zitting, chief product officer, ACL Services

Using advanced risk management -- i.e., risk analytics software -- will allow CFOs to access data from complex systems, including ERP environments, and easily identify key areas of risk, said Dan Zitting, chief product officer of ACL Services.

"The technology can be set up to pull data from the HR, sales and billing departments, for example, and cross-reference the information within the program's interface," Zitting said in an email. "Once the data is in one place, CFOs should be able to easily visualize the data in a risk dashboard that summarizes activity and flags changes in risk."

Berquist also uses risk analytics to manage foreign currency risk for TIBCO, which is an international company, as well as risks connected to managing cash.

"Every month I close the books, I get all my actuals and I export them all into my data warehouse and I load up my dashboards. I happen to use TIBCO Spotfire [business intelligence software], but you can load them up in any risk analytics tool," he said. "Then I review where we stand on everything that has happened so far. Are expenses in line? Where does our revenue stand? What happened with currency? What happened with cash? How does the balance sheet look? That's the first part of the problem."

The second part is forecasting what will happen with TIBCO's expenses, which helps Berquist ensure that the company is going to generate sufficient cash to avoid violating covenants and mitigate the effects of offshore currency fluctuations.
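
A toy version of that workflow might look like the following sketch, which computes a hypothetical leverage covenant (net debt to trailing EBITDA) from monthly actuals and checks how much headroom remains. The covenant terms and all figures are invented for illustration and are not TIBCO's actual terms.

```python
# Hypothetical covenant check: pull monthly actuals, compute the covenant
# metric, and see how much softness the business can absorb before a breach.
monthly_ebitda = [21.0, 19.5, 18.8, 20.2, 17.9, 18.1]  # $M, most recent last
net_debt = 900.0                                        # $M
covenant_max_leverage = 4.5                             # assumed limit: net debt / trailing EBITDA

trailing_ebitda = sum(monthly_ebitda) / len(monthly_ebitda) * 12  # annualized run rate
leverage = net_debt / trailing_ebitda
print(f"Current leverage: {leverage:.2f}x (limit {covenant_max_leverage}x)")

# Simple what-if: how far can EBITDA fall before the covenant is breached?
min_ebitda = net_debt / covenant_max_leverage
print(f"Covenant breached if trailing EBITDA falls below ${min_ebitda:.1f}M "
      f"(currently ${trailing_ebitda:.1f}M)")
```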

Berquist said there are general-purpose risk management technologies, some of which are tied to such things as identifying corporate fraud, but there is also company- or industry-specific risk analytics software.

"My big concern is financial risk, so most of my [use of risk analytics] is around those types of measures," he said.

Risk analytics software helps CFOs make better decisions for the future because without an approach that allows them to run different scenarios and determine potential outcomes, they end up making gut instinct-oriented or seat-of-the-pants decisions, according to Berquist.

Sharing a similar view is Tom Kimner, head of global product marketing and operations for risk management at SAS Institute Inc., a provider of analytics software, based in Cary, N.C.

"What makes risk analytics a little bit different, in some cases, is that risk generally deals with the future and uncertainty," Kimner said.

Cristina Silingardi, a former CFO and treasurer at HamaTech USA Inc., a manufacturer of equipment for the semiconductor industry, concurred with Berquist that risk assessment can no longer be done the way it used to be -- based on individuals' knowledge of their businesses, their instincts and a few key data points.

"There is so much data right now, and the biggest change I see is that now this data encompasses structured internal company data as well as unstructured external data," said Silingardi, now managing director of vcfo Holdings, a consulting firm based in Austin, Texas, that specializes in finance, recruiting and human resources.

CFOs started getting more involved with risk analytics when they needed better revenue metrics to understand predictability and trends, she said. Risk analytics software went beyond traditional risk-management tools by adding real-time reporting that puts key metrics right in front of CFOs and updates them all day long. Such data can help CFOs keep an eye on regulatory and contractual noncompliance from vendors, according to Silingardi.

"It helps them with pattern recognition, but only if [they] can translate that to really good visual dashboards that are looking at this data. [CFOs] used to focus only on a few things. Now, [they're] using all this data to get a much better picture," she said.

Forward-thinking mindset is key

Historically, risk analysis and assessment has tended to be a reactive and subjective process, according to Daniel Smith, director of data science and innovation at Syntelli Solutions Inc., a data analytics company based in Charlotte, N.C. After something bad happens, the tendency is for people to say, "'Let's investigate it, or, 'Let's all huddle up and think about what could happen and create a bunch of speculative scenarios,'" he said.

That's exactly the way many of SAP's customers still look at risk: through the rear-view mirror, said Bruce McCuaig, director of governance, risk and compliance solution marketing at SAP.

"Once or twice a year they report to the board and they look backwards, but what I think we're seeing now is the ability to look forward and report frequently online and in real time," McCuaig said.

In modern analytics and modern business, companies want to focus more on proactive, predictive and objective risk, Smith said. While focusing on risk in this manner gives CFOs visibility into the future, many don't have the pipeline of data and a single source of consolidated data to enable them to do that.

"They need a system, a way to collect that data and be able to analyze it," he said. "From a strategic point of view, it's more of a data initiative."

The goal is to give people the skills and applications to view highly interactive and multidimensional data as opposed to a traditional, two-dimensional tabular view in a spreadsheet, Smith said.

When it comes to risk analytics, CFOs should be thinking about techniques, not specific tools. Risk analysis is more about understanding ways to mine data better than about which platform can do it, according to Smith.

"Risk analytics is part of something larger. At SAP, we don't have a category of solutions called 'risk analytics,'" McCuaig said. "There are a variety of analytics tools that will serve the purpose."






Mar 20
Risk mapping key to security, business integration
Posted by Thang Le Toan on 20 March 2018 12:43 AM

It’s no secret that data protection has become integral to bottom line success for digital businesses. As a result, it’s time for InfoSec professionals to crawl out of their caves and start communicating with the rest of the business, Tom Kartanowicz, head of information security at Natixis, North America, told the audience at the recent CDM Media CISO Summit.

To facilitate this communication, the language these pros will use is the language of security risk, Kartanowicz said.

“As security professionals, if we want to be taken seriously we need to put what we do into the risk lens to talk to the business so they understand the impact and how we’re trying to reduce the impact of the types of threats we’re seeing,” Kartanowicz said.

For example, even though the chief information security officer and chief risk officer may appear to be two different islands in an organization, they are part of the same team, he reminded the audience.

 

Business is the bridge that links them together, so instead of working in silos, security professionals should carve out what Kartanowicz calls a “friends and family plan” that builds alliances with other departments in their organization. The human resources department can help discipline somebody who might be an internal threat to the organization, corporate communications can help talk to the media and customers when there are incidents like DDoS and malware attacks, and the legal department can be a valuable ally when it is time to take action against bad actors, he explained.

“As the CISO or as the head of InfoSec, you are missing out on a lot of valuable intelligence if you are not talking to all these different teams,” he stressed.

Risk mapping — a data visualization tool that outlines an organization’s specific risks — is an effective way to identify threats and vulnerabilities, then communicate them to the business, he said. Risk mapping helps an organization identify the areas where it’s going to spend its security budget, how to implement solutions and, most importantly, helps identify specific instances of risk reduction, he said.

Kartanowicz said there are two things to consider when evaluating and determining the likelihood of a risk: how easy it is to exploit and how often it occurs.

“If the vulnerabilities require technical skills held by 1% of the population, it’s going to be pretty difficult to exploit,” he said. “If on the other hand, anybody on the street can exploit it, it’s going to be pretty easy.”
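
A minimal sketch of that two-factor likelihood scoring, with invented 1-to-5 scales and example vulnerabilities, might look like the following; a real program could weight or bucket the factors differently.

```python
# Hypothetical two-factor likelihood scoring: ease of exploitation and
# observed frequency, combined into a single 1-5 likelihood value.
vulns = {
    # name: (ease_of_exploit 1-5, observed_frequency 1-5)
    "unpatched web server":            (5, 4),
    "kernel bug needing local access": (2, 2),
    "phishing-exposed credentials":    (4, 5),
}

def likelihood(ease: int, frequency: int) -> float:
    # Simple average of the two factors; illustrative only.
    return (ease + frequency) / 2

for name, (ease, freq) in sorted(vulns.items(),
                                 key=lambda kv: -likelihood(*kv[1])):
    print(f"{name:35s} likelihood {likelihood(ease, freq):.1f}/5")
```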

It is then time to address the specific risks, he said.

“In the enterprise risk management world, the business can accept the risks, avoid the risks or [work to] mitigate the risks — this is where InfoSec comes in — or transfer the risks,” he said.

Using tools such as the NIST cybersecurity framework can help InfoSec reduce the risks, he said. It’s important that organizations tie their disaster recovery, backup strategy, business continuity and crisis management into whatever framework they choose, he added. Organizations should also ensure they have baseline controls in place to help minimize the risk of a data breach, he added.

But as threats evolve and vulnerabilities change, he suggested that the risk map be re-evaluated annually. Business requirements are constantly evolving and organizations are always entering different markets, but companies need to be constantly aware of the threat landscape, he added.

“Incidents will always occur; risk is not going away,” he said.





Mar 20
risk map (risk heat map)
Posted by Thang Le Toan on 20 March 2018 12:42 AM

A risk map, also known as a risk heat map, is a data visualization tool for communicating specific risks an organization faces. A risk map helps companies identify and prioritize the risks associated with their business.

The goal of a risk map is to improve an organization's understanding of its risk profile and appetite, clarify thinking on the nature and impact of risks, and improve the organization's risk assessment model. In the enterprise, a risk map is often presented as a two-dimensional matrix. For example, the likelihood a risk will occur may be plotted on the x-axis, while the impact of the same risk is plotted on the y-axis.

A risk map is considered a critical component of enterprise risk management because it helps identify risks that need more attention. Identified risks that fall in the high-frequency and high-severity section can then be made a priority by organizations. If the organization is dispersed geographically and certain risks are associated with certain geographical areas, risks might be illustrated with a heat map, using color to illustrate the levels of risk to which individual branch offices are exposed.

Figure: A risk matrix that includes natural disasters and human risk factors.

How to create a risk map

Identification of inherent risks is the first step in creating a risk map. Risks can be broadly categorized into strategic risk, compliance risk, operational risk, financial risk and reputational risk, but organizations should aim to chart their own lists by taking into consideration specific factors that might affect them financially. Once the risks have been identified, it is necessary to understand what kind of internal or external events are driving the risks.

The next step in risk mapping is evaluating the risks: estimating the frequency, the potential impact and possible control processes to offset the risks. The risks should then be prioritized. The most impactful risks can be managed by applying control processes to help lessen their potential occurrence.

As threats evolve and vulnerabilities change, a risk map must be re-evaluated periodically. Organizations also must review their risk maps regularly to ensure key risks are being managed effectively.
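
As a minimal sketch of those steps, the following example scores a few invented risks for likelihood and impact, buckets them into heat map cells and surfaces the most severe cells first; the scales and the example risks are assumptions made purely for illustration.

```python
# Hypothetical risk heat map: likelihood on one axis, impact on the other,
# with each identified risk placed into a cell of the grid.
risks = [
    # (risk, likelihood 1-5, impact 1-5)
    ("data center flood",        2, 5),
    ("regulatory noncompliance", 3, 4),
    ("key supplier failure",     4, 3),
    ("laptop theft",             5, 2),
]

def bucket(score: int) -> str:
    # Collapse the 1-5 score into the low/medium/high bands of the map.
    return "low" if score <= 2 else "medium" if score == 3 else "high"

heat_map = {}
for name, likelihood, impact in risks:
    cell = (bucket(likelihood), bucket(impact))
    heat_map.setdefault(cell, []).append(name)

# Prioritize the most severe cells first.
for cell in [("high", "high"), ("high", "medium"), ("medium", "high")]:
    print(cell, "->", heat_map.get(cell, []))
```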

Why it's important to create a risk map

A risk map offers a visualized, comprehensive view of the likelihood and impact of an organization's risks. This helps the organization improve risk management and risk governance by prioritizing risk management efforts. This risk prioritization enables them to focus time and money on the most potentially damaging risks identified in a heat map chart.

A risk map also facilitates interdepartmental dialogues about an organization's inherent risks and promotes communication about risks throughout the organization. It helps organizations visualize risks in relation to each other, and it guides the development of a control assessment of how to deal with the risks and the consequence of those risks.

The map can help the company visualize how risks in one part of the organization can affect operations of another business unit within the organization.


A risk map also adds precision to an organization's risk assessment strategy and identifies gaps in an organization's risk management processes.

 








Mar 8
Deep learning projects: Cloud-based AI or dedicated hardware?
Posted by Thang Le Toan on 08 March 2018 12:28 AM

Are deep learning projects part of your AI agenda this year? Here's how to evaluate the tradeoffs between using cloud-based AI infrastructure versus dedicated hardware.

Chip and system vendors are developing -- and rapidly innovating -- new AI processors designed for deep learning projects that use neural networks, the computing systems designed to approximate how human brains work.

At the same time, many cloud vendors have also been introducing these processing capabilities via dedicated GPUs and field programmable gate arrays (FPGAs), the integrated circuits designed to be customized after manufacturing. Google, which has stated that AI is strategic across all its businesses, is offering dedicated AI services built on its custom tensor processing unit (TPU), the company's application-specific integrated circuit developed specifically for neural network deep learning projects.

"Cloud providers are betting that, over time, all companies will use deep learning and want to get a head start," said Sid J. Reddy, chief scientist at Conversica, which develops AI software for marketing and sales.

As CIOs begin mapping out their AI strategies -- in particular, their need and ability to do deep learning projects -- they must consider a variety of tradeoffs between using faster, more efficient private AI infrastructure, the operational efficiencies of the cloud, and their anticipated AI development lifecycle.

In general, private AI infrastructure is cost-effective for companies doing multiple, highly customized AI projects. If those companies are using data from applications running in the cloud, however, the cost of moving data into an on-premises AI system could offset the value of having dedicated hardware, making cloud-based AI cheaper. But, for many deep learning projects in this incredibly fast-moving field, the economics could quickly change. Here's a breakdown.
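
As a rough illustration of how that calculus can work, the sketch below compares an amortized on-premises setup with pay-per-use cloud training plus data egress. Every figure is an assumption chosen for the example, since hardware prices, GPU-hour rates and egress fees vary widely in practice.

```python
# Hypothetical break-even comparison: amortized on-premises GPU hardware
# versus cloud GPU hours plus the cost of moving data out of the cloud.
onprem_capex = 150_000        # dedicated GPU server(s), amortized over its lifetime
onprem_years = 3
onprem_annual_ops = 20_000    # power, cooling, administration

cloud_gpu_hour = 3.00         # $/GPU-hour (assumed)
gpu_hours_per_year = 6_000    # annual training workload (assumed)
egress_per_gb = 0.09          # $/GB to move data out of the cloud (0 if data already lives there)
data_moved_gb_per_year = 50_000

onprem_yearly = onprem_capex / onprem_years + onprem_annual_ops
cloud_yearly = cloud_gpu_hour * gpu_hours_per_year + egress_per_gb * data_moved_gb_per_year

print(f"On-premises: ~${onprem_yearly:,.0f}/year")
print(f"Cloud:       ~${cloud_yearly:,.0f}/year")
```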

Take small steps

Private AI infrastructure requires a large investment in fixed costs and ongoing maintenance costs. Because of the capital expense related to building and maintaining private AI infrastructure, cloud-based AI services -- even when they cost in aggregate more than private infrastructure -- can be the smart economic choice as enterprises flesh out their AI strategy before making a bigger commitment.


For small companies, fears about the high price of using this new AI infrastructure shouldn't be the reason to not try deep learning projects, Reddy said. As deep learning becomes more accepted as state-of-the-art for a wide range of tasks, he believes that more AI algorithms will transition to it. This is because deep learning promises to reduce some of the overhead in preparing data and optimizing new AI models.

Enterprises and small companies, alike, also need to determine if they have enough data to train the models for their deep learning projects without "overfitting," or creating a model that does not make accurate predictions for new data. Reddy said this is easier for a startup like Conversica that has data from hundreds of millions of conversations to work with. "It might not be the case with other startups that have limited aggregated data to begin with," he said.
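
One common way to check for that problem is to compare accuracy on the training set with accuracy on held-out data; a large gap suggests the model has memorized a dataset that is too small. The sketch below demonstrates the idea with scikit-learn on synthetic data and is illustrative only.

```python
# Overfitting check: train on one split, score on a held-out split,
# and watch how the train-vs-validation gap shrinks as data grows.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

for n_samples in (200, 20_000):                       # small vs. larger dataset
    X, y = make_classification(n_samples=n_samples, n_features=40, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    model = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
    gap = model.score(X_tr, y_tr) - model.score(X_val, y_val)
    print(f"n={n_samples:6d}  train-vs-validation gap: {gap:.2f}")
```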

Going beyond the basics

Some cloud providers like Microsoft with its Cognitive Services in Azure use FPGA chips under the hood for improving specific AI services. This approach hides the complexity of the FPGA from the customer, while providing some of the cost savings that FPGA chips provide on the back end. AWS has taken a different approach, becoming the first provider to allow enterprises to directly access FPGAs for some applications. And enterprises are starting to experiment with these.

For example, Understory, a weather forecasting service, has started moving some of its heavier machine learning algorithms into the cloud using AWS' new FPGA service to help with the analysis.


"Given our expansion of stations and our plan for growth, we will need to become smarter about the types of processors and metal we run our analyses and algorithms on," said Eric Hewitt, vice president of technology at Understory. "We would not push this type of power to our edge computing layer, but for real-time algorithms running on a network of data, it's feasible that we would use them."

Private AI, good for specialized needs

Some IT executives believe significant cost savings and performance improvements can be reaped by customizing AI-related hardware.


"I use a private infrastructure because my very specific needs are sold at a premium in the cloud," said Rix Ryskamp, CEO of UseAIble, an AI algorithm vendor. "If I had more general needs (typically, not machine learning), I would use cloud-only solutions for simplicity."

CIOs also need to think about the different components in the AI development lifecycle when deciding how to architect their deep learning projects. In the early research and development stages of an AI lifecycle, enterprises analyze large data sets to optimize a production-ready set of AI models. These models require less processing power when done in an on-premises production system than in cloud-based AI infrastructure. Therefore, Ryskamp recommended companies use private infrastructure for R&D.

The cloud, on the other hand, is often a better fit for production apps as long as requirements -- like intensive processing power -- do not make cost a problem.

"CIOs who already prefer the cloud should use it so long as their AI/[machine learning] workloads do not require so much custom hardware that cloud vendors cannot be competitive," Ryskamp said.

Energy efficiency, a red herring in deep learning projects?


"In general, the economics of doing large-scale deep learning projects in the public cloud are not favorable," said Robert Lee, chief architect with FlashBlade at Pure Storage, a data storage provider.

On the flip side, Lee agreed that training is most cost-effective where data is collected or situated. So, if an enterprise is drawing on a large pool of SaaS data, or using a cloud-based data lake, then he said it does make more sense to implement the deep learning project in the cloud.

Indeed, the economic calculus of on-premises versus using cloud-based AI infrastructure will also vary according to a company's resources and timetable. The attraction of deploying private infrastructure, so that it can take advantage of the greater power efficiency of FPGAs and new AI-chips, is only one benefit, Lee argued.

"The bigger Opex lever is in making data science teams more productive by optimizing and streamlining the process of data collection, curation, transformation and training," he argued.

Tremendous time and effort is often spent in the extract, transform and load-like phases of deep learning projects -- which creates delays for data science teams -- rather than in running the AI algorithms themselves.

Continuous learning blurs choice between cloud-based AI and private

The other consideration is that as AI systems mature and evolve, continuous or active learning will become more important. Initial approaches to AI have centered around training models to do prediction/classification, then deploying them into production to analyze data as it's generated.


"We are starting to realize that in most use-cases, we are never actually done training and that there's no clear break between learning and practicing," Lee said.

In the long run, CIOs will need to see that AI models in deep learning projects are very much like humans who continuously learn. A good model is like the undergraduate with an engineering degree who was trained in basic concepts and has a good basic understanding about how to think about engineering. But expertise is developed over time and with experience, while learning on the job. Implementing these kinds of learning loops will blur the lines around distinctions such as doing the R&D component on private infrastructure versus in cloud-based AI infrastructure.

"Just like their human counterparts, AI systems need to continuously learn -- they need to be fed a constant pipeline of data collection/inference/evaluation/retraining wherever possible," Lee said.





Feb 20
sensor analytics
Posted by Thang Le Toan on 20 February 2018 01:30 PM

Sensor analytics is the statistical analysis of data that is created by wired or wireless sensors.

A primary goal of sensor analytics is to detect anomalies. The insight that is gained by examining deviations from an established point of reference can have many uses, including predicting and proactively preventing equipment failure in a manufacturing plant, alerting a nurse in an electronic intensive care unit (eICU) when a patient’s blood pressure drops, or allowing a data center administrator to make data-driven decisions about heating, ventilating and air conditioning (HVAC).

Because sensors are often always on, it can be challenging to collect, store and interpret the tremendous amount of data they create. A sensor analytics system can help by integrating event-monitoring, storage and analytics software in a cohesive package that will provide a holistic view of sensor data. Such a system has three parts: the sensors that monitor events in real-time, a scalable data store and an analytics engine. Instead of analyzing all data as it is being created, many engines perform time-series or event-driven analytics, using algorithms to sample data and sophisticated data modeling techniques to predict outcomes. These approaches may change, however, as advancements in big data analytics, object storage and event stream processing technologies will make real-time analysis easier and less expensive to carry out.
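
A minimal sketch of the anomaly-detection idea at the heart of such a system might look like the following, which keeps a rolling baseline for a single sensor and flags readings that deviate sharply from that reference point; the window size, threshold and synthetic readings are illustrative choices.

```python
# Rolling-baseline anomaly detection for one sensor stream.
from collections import deque
from statistics import mean, stdev

WINDOW = 30          # samples used as the reference baseline
THRESHOLD = 3.0      # flag readings more than 3 standard deviations out

baseline = deque(maxlen=WINDOW)

def check(reading: float) -> bool:
    """Return True if the reading deviates abnormally from the rolling baseline."""
    anomalous = False
    if len(baseline) == WINDOW:
        mu, sigma = mean(baseline), stdev(baseline)
        anomalous = sigma > 0 and abs(reading - mu) > THRESHOLD * sigma
    baseline.append(reading)
    return anomalous

# Example: steady pressure readings with one injected spike.
readings = [101.3 + 0.1 * (i % 5) for i in range(60)]
readings[45] = 110.0
for i, r in enumerate(readings):
    if check(r):
        print(f"sample {i}: anomalous reading {r}")
```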

Most sensor analytics systems analyze data at the source as well as in the cloud. Intermediate data analysis may also be carried out at a sensor hub that accepts inputs from multiple sensors, including accelerometers, gyroscopes, magnetometers and pressure sensors. The purpose of intermediate data analysis is to filter data locally and reduce the amount of data that needs to be transported to the cloud. This is often done for efficiency reasons, but it may also be carried out for security and compliance reasons.

The power of sensor analytics comes from not only quantifying data at a particular point in time, but by putting the data in context over time and examining how it correlates with other, related data. It is expected that as the Internet of Things (IoT) becomes a mainstream concern for many industries and wireless sensor networks become ubiquitous, the need for data scientists and other professionals who can work with the data that sensors create will grow -- as will the demand for data artists and software that helps analysts present data in a way that’s useful and easily understood.

 






Feb 20
Recover your deleted files quickly and easily
Posted by Thang Le Toan on 20 February 2018 10:09 AM

Recuva®

Recover your deleted files quickly and easily.

Accidentally deleted an important file? Lost files after a computer crash? No problem - Recuva recovers files from your Windows computer, recycle bin, digital camera card, or MP3 player!

  • Superior file recovery

    Recuva can recover pictures, music, documents, videos, emails or any other file type you’ve lost. And it can recover from any rewriteable media you have: memory cards, external hard drives, USB sticks and more!

  • Recovery from damaged disks

    Unlike most file recovery tools, Recuva can recover files from damaged or newly formatted drives. Greater flexibility means greater chance of recovery.

  • Deep scan for buried files

    For those hard to find files, Recuva has an advanced deep scan mode that scours your drives to find any traces of files you have deleted.

  • Securely delete files

    Sometimes you want a file gone for good. Recuva’s secure overwrite feature uses industry- and military-standard deletion techniques to make sure your files stay erased.





