Business Analytics refers to the skills, technologies, applications and practices for the continuous exploration of data to gain insights that drive business decisions. Business Analytics is multi-faceted: it combines multiple forms of analytics and applies the right method to deliver the expected results. It focuses on developing new insights using techniques including data mining, predictive analytics, natural language processing, artificial intelligence, statistical analysis and quantitative analysis. In addition, domain knowledge is a key component of the business analytics portfolio. Business Analytics can then be viewed as the combination of domain knowledge and all forms of analytics in a way that creates analytic applications focused on enabling specific business outcomes.
Analytic applications have a set of business outcomes that they must enable. For fraud, it is reducing loss; for quality and safety, it might be avoiding expensive recalls. Understanding how to enable these outcomes is the first step in determining the make-up of each specific application. For example, in the case of insurance fraud, it’s not enough to use statistical analysis to predict fraud. You need a strong focus on text, domain expertise, and the ability to visually portray organized crime rings. Insight gained through this analysis may be used as input for human decisions or may drive fully automated decisions. Database capacity, processor speeds and software enhancements will continue to drive even more sophisticated analytic applications. The key components of business analytics are:
- Domain Experts – true business analytics requires domain expertise. All too often, companies buy analytic software thinking it’s just another piece of middleware, and are frustrated when results are difficult to attain. In our previous example, without a domain expert who knows how to look for fraud, analytic software is flying blind.
- Knowledge Management – acquiring knowledge and expertise from domain experts is only half the battle. Capturing this knowledge and automating its application in a consistent manner is the critical success factor.
- Data and Text Mining – new insights are found in structured data and text by using data and text mining techniques. Patterns, trends, clusters and anomalies can all be understood through data mining.
- Text Analytics – over 80% of available insight resides in text, yet almost all analytic efforts today exclude it. Business analytics must include the insight from text to be effective.
- Statistical Analysis and Predictive Models – the use of statistical methods to validate assumptions and/or predict future outcomes.
- Visual Analytics – insight from large volumes of data is best represented visually.
- Reporting and Analysis – traditional Business Intelligence.
- Financial Performance and Strategy Management – budgeting and planning, financial consolidation, scorecarding and strategy management, financial analytics and related reporting.
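As a concrete illustration of the data and text mining component above, the following sketch clusters hypothetical transaction records with a minimal k-means implementation in pure Python. The data, features and cluster count are illustrative assumptions, not from the text:

```python
import random

def kmeans(points, k, iters=20, seed=42):
    """Minimal k-means: partition 2-D points into k clusters."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for x, y in points:
            i = min(range(k),
                    key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2)
            clusters[i].append((x, y))
        # Recompute each centroid as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids, clusters

# Hypothetical transactions as (amount, frequency) pairs: two obvious groups.
data = [(1, 1), (1.5, 2), (2, 1.5), (8, 8), (9, 9), (8.5, 9.5)]
centroids, clusters = kmeans(data, k=2)
```

Clustering like this is one way the patterns, trends and anomalies mentioned above surface from raw data; production work would use a dedicated mining library rather than a hand-rolled loop.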
There is a massive explosion of data occurring on a number of levels. The notion of data overload was echoed in the 2010 IBM CEO study titled “Capitalizing on Complexity”. In that study, a large number of CEOs described their organizations as data rich but insight poor. Many voiced frustration over their inability to transform available data into feasible action plans. This notion of turning data into insight, and insight into action, is a common and growing theme.
According to PricewaterhouseCoopers, there are approximately 75 to 100 million blogs and 10 to 20 million Internet discussion boards and forums in the English language alone. As the Forrester diagram describes, more consumers are moving up the ladder and becoming creators of content. In addition, estimates show the volume of unstructured data (email, audio, video, Web pages, etc.) doubles every three months. Effectively managing and harnessing this vast amount of information presents both a great challenge and a great opportunity.
Data is flowing through medical devices, scientific devices, sensors, monitors, detectors, other supply chain devices, instrumented cars and roads, instrumented domestic appliances, etc. Everything will be instrumented and from this instrumentation comes data. This data will be analyzed to find insight that drives smarter decisions. The utility sector provides a great example of the growing need for analytics. The smart grid and the gradual installation of intelligent endpoints, smart meters and other devices will generate volumes of data. Smart grid utilities are evolving into brokers of information. The data tsunami that will wash over utilities in the coming years is a formidable IT challenge, but it is also a huge opportunity to move beyond simple meter-to-cash functions and into real-time optimization of their operations. This type of instrumentation is playing out in many industries. As this occurs, industry players will be challenged to leverage the data generated by these devices.
Inside the enterprise, consider the increasing volumes of emails, Word documents, PDFs, Excel worksheets and free-form text fields that contain everything from budgets and forecasts to customer proposals, contracts, call center notes and expense reports. Outside the enterprise, the growth of web-based content, which is primarily unstructured, continues to accelerate – everything from social media, comments in blogs, forums and social networks, to survey verbatims and wiki pages. Most industry analysts estimate that more than 80% of the intelligence required to make smarter decisions is contained in unstructured data or text.
The Analytic Footprint will Change in the next 24 Months
The survey results in a recent MIT Sloan report support both an aggressive adoption of analytics and a shift in the analytic footprint. According to the report, many traditional forms of analytics will be surpassed in the next 24 months. The authors produced a very effective visual that shows this shift from today’s analytic footprint to the future footprint. Although data visualization is listed as number one, the authors describe it as dashboards and scorecards – the traditional methods of visualization. New and emerging methods help accelerate time-to-insight, allowing us to absorb insight from large volumes of data in rapid fashion. The analytics identified as creating the most value in 24 months are:
- Data visualization
- Simulations and scenario development
- Analytics applied within business processes
- Advanced statistical techniques
Advanced Analytics will Continue to Evolve
Companies and organizations continue to invest millions of dollars capturing, storing and maintaining all types of business data to drive sales and revenue, optimize operations, manage risk and ensure compliance. Most of this investment has been in technologies and applications that manage structured data – coded information residing in relational database management systems in the form of rows and columns. Current methods such as traditional business intelligence (BI) focus on querying and reporting, answering questions such as what happened, how many, how often, and what actions are needed. New forms of advanced analytics are required to address the business imperatives described earlier.
- Text or unstructured data more closely resembles natural language – the way humans communicate in the spoken and written word. Computers have long required that information be coded for automatic data processing, and Business Intelligence technologies were designed to operate on structured data, with limited capabilities for analyzing text. But this is rapidly changing. Research in the late 1990s led to breakthrough software that allows computers to understand and process free-form text, offering government and commercial organizations the opportunity to leverage the vast amounts of information contained in text and other non-structured formats. The software has its roots in the intelligence community and was developed by universities and start-ups. The technology allows users to extract and analyze facts such as who, what, where, when and why, then drill down to understand people, places and events and how they are related. The software is now classified as “Text Analytics” software, and an entire market has formed around it.
- Predictive analytics, statistical analysis and data mining received mainstream attention for the first time in 2010. In a recent hype cycle diagram, Gartner shows predictive analytics approaching mainstream adoption in the next two years. Predictive models exploit patterns found in historical and transactional data to identify risks and opportunities. Models guide decision making by capturing relationships among many factors, allowing assessment of the risk or opportunity associated with a particular set of conditions. In the 2010 IBM Global CFO Study, the majority of respondents said they plan to use advanced analytics to uncover correlations among seemingly unrelated pieces of information and find patterns nearly impossible to detect manually. These analytics enable the discovery of previously unknown facts by supporting analytic methods against unknown or changing collections of lightly prepared data. Such capabilities will also pave the way for predictive analytics delivered at lightning speed to end users.
- Stream computing is presenting many possibilities for effectively analyzing data in motion. There is no better way to appreciate the power of emerging analytic technology than to view it in the context of life-saving applications. Stream computing, data mining and predictive analytics are being used to save stroke victims. One solution looks for patterns in data that could help identify patients who are experiencing a severe complication of a ruptured brain aneurysm. The examples extend beyond healthcare across all industries. This combination of technologies enables high-speed, scalable and complex analytics of data streams in motion.
- There will be a shift to strategic analytics driving operational processes – as opposed to aligning with or reporting on those same processes. In other words, let intelligence drive the process rather than looking for intelligence after the fact. To enable this, there must be minimal latency between event and decision point: you need the information while you have the customer or prospect on the phone, or when traffic is showing the first signs of congestion. There is renewed interest in event-driven architectures, with enabling technologies like complex event processing and predictive analytics. The two combine to enable the processing of data in real time and the application of predictive models that help determine future outcomes.
- Mash-ups will allow us to pull together all relevant data for a particular decision, while social technologies allow us to extract wisdom from the crowd. In so doing, this collaborative process enables us to confidently determine the appropriate next action in the context of a given process. This is one example of the impact that strategic analytics and social capabilities will have on corporate processes in the next two years.
- The term “context” will be used frequently in the coming years. From the search experience to data management, the semantic web and beyond, context is the key to managing an ever-expanding universe of data. Context-aware computing is the use of information about an end user’s environment, activities, connections and preferences to improve the quality of interaction with that end user. Gartner believes a contextually aware system anticipates the user’s needs and proactively serves up the most appropriate and customized content, product or service. Contextual awareness, however, can also be applied to the growing data management challenge. Identifying and collecting data can be automated if context is determined: data that has value can be retained, and everything else eliminated. In addition, data can be organized using known context. For example, email can sit in one inbox with no organization, or be reviewed, classified and placed into meaningful folders. Once classified, finding relevant email is simple. This review-and-classification process can be automated and applied to all data and content.
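The stream-computing and event-driven points above can be sketched as a minimal sliding-window monitor: each event is scored against the recent window as it arrives, and an alert fires with minimal latency the moment a reading deviates sharply. The window size, threshold and readings are illustrative assumptions, not from the text:

```python
from collections import deque
from statistics import mean, stdev

class StreamMonitor:
    """Flags a reading that deviates sharply from a sliding window of recent values."""
    def __init__(self, window=10, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        alert = False
        if len(self.window) >= 3:
            mu, sigma = mean(self.window), stdev(self.window)
            # Fire when the new reading sits far outside the recent distribution.
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                alert = True
        self.window.append(value)
        return alert

monitor = StreamMonitor(window=5, threshold=3.0)
readings = [70, 71, 69, 70, 72, 71, 120]  # hypothetical vital-sign stream
alerts = [t for t, v in enumerate(readings) if monitor.observe(v)]
```

The key property is that the decision happens per event, as the data flows past, rather than in a batch report after the fact.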
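The review-and-classification step described in the last bullet can likewise be sketched as a simple context-driven router that files messages into folders by keyword overlap. The folders, keywords and messages are illustrative assumptions:

```python
# Hypothetical context rules: folder -> indicative keywords.
RULES = {
    "Finance": {"budget", "invoice", "forecast"},
    "Sales": {"proposal", "customer", "contract"},
    "Support": {"ticket", "outage", "complaint"},
}

def classify(message):
    """File a message in the folder whose keywords it matches most."""
    words = set(message.lower().split())
    scores = {folder: len(words & kws) for folder, kws in RULES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "Unfiled"

inbox = [
    "Q3 budget forecast attached",
    "Customer signed the contract proposal",
    "Server outage ticket escalated",
    "Lunch on Friday?",
]
folders = [classify(m) for m in inbox]
```

Real context-aware systems would weigh far richer signals (sender, history, preferences), but the shape is the same: determine context first, then organize automatically.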
Business Analytics will Enable Outcomes
Business Analytics focuses on answering questions such as: Why is this happening? What if these trends continue? What will happen next (prediction)? What is the best that can happen (optimization)? There is a growing view that prescribing outcomes is the ultimate role of analytics – identifying those actions that deliver the right business outcomes. Organizations should first define the insights needed to meet business objectives, and then identify the data that provides those insights. Too often, companies start with the data.
The previously mentioned IBM study also revealed that analytics-driven organizations had 33 percent more revenue growth and 32 percent more return on capital invested. Organizations expect value from emerging analytic techniques to soar. The growth of innovative analytic applications will help individuals across the organization consume and act upon insights derived through complex analysis. Some examples of innovative use:
- GPS-enabled navigation devices already superimpose real time traffic patterns and alerts onto navigation maps and suggest the best routes to drivers.
- Analytic algorithms are used to forecast attrition probabilities, pinpoint at-risk customers and recommend precise retention strategies.
- Dashboards that now reflect last quarter’s sales will also show potential next quarter’s sales under a variety of conditions – a new media mix, a price change, or a larger sales team.
- Simulations evaluating alternative scenarios will automatically recommend optimal approaches – such as the best media mix to introduce a specific product to a specific segment, or the ideal number of sales professionals to assign to a particular new territory.
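A scenario evaluation like the one just described can be sketched by scoring each candidate scenario with a toy response model and recommending the highest projection. The scenarios and model coefficients are illustrative assumptions, not from the text:

```python
# Hypothetical scenarios: (name, {media spend, price cut, sales reps}).
scenarios = [
    ("baseline",   {"media": 100, "price_cut": 0.00, "reps": 10}),
    ("more_media", {"media": 150, "price_cut": 0.00, "reps": 10}),
    ("price_cut",  {"media": 100, "price_cut": 0.05, "reps": 10}),
    ("more_reps",  {"media": 100, "price_cut": 0.00, "reps": 14}),
]

def projected_sales(s):
    """Toy linear response model -- the coefficients are made up for illustration."""
    return 1000 + 2.0 * s["media"] + 3000 * s["price_cut"] + 25 * s["reps"]

# Recommend the scenario with the highest projected sales.
best = max(scenarios, key=lambda item: projected_sales(item[1]))
```

A real simulation would estimate the response model from data and explore many more scenarios, but the optimize-over-alternatives pattern is the same.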
Analytic Excellence is an Evolutionary Process
A recent MIT Sloan report effectively uses the maturity model concept to describe how organizations typically evolve to analytic excellence. The authors point out that organizations begin with efficiency goals and then address growth objectives after experience is gained. The authors believe this is a common practice, but not necessarily a best practice. They see the traditional analytic adoption path starting in data-intensive areas like financial management, operations, and sales and marketing. As companies move up the maturity curve, they branch out into new functions, such as strategy, product research, customer service, and customer experience. In the opinion of the authors, these patterns suggest that success in one area stimulates adoption in others. They suggest that this allows organizations to increase their level of sophistication.
Through their analysis of the survey results, the authors of the MIT Sloan special report identified three levels of analytic capability:
- Aspirational – use analytics to justify actions
- Experienced – use analytics to guide actions
- Transformed – use analytics to prescribe actions
The report provides a useful matrix that describes these levels in the context of a maturity model. Reviewing the business challenges outlined in the matrix reveals one very interesting dynamic: the transition from cost and efficiency to revenue growth, customer retention and customer acquisition.
The authors of the report found that as the value of analytics grows, organizations are likely to seek a wider range of capabilities – and more advanced use of existing ones. The survey indicated that this dynamic is leading some organizations to create a centralized analytics unit that makes it possible to share analytic resources efficiently and effectively. These centralized enterprise units are the primary source of analytics, providing a home for more advanced skills within the organization. This same dynamic will lead to the appointment of Chief Analytics Officers (CAOs) starting in 2011.
The availability of strong business-focused analytical talent will be the greatest constraint on a company’s analytical capabilities. Outsourcing analytics will become an attractive alternative, as the need for specialized skills leads organizations to look for outside help. Outsourcing allows a company to focus on taking action based on insights delivered by the outsourcer, who in turn can leverage specialized resources across multiple clients. Expect to see more of this in 2011.
We will see more organizations establish enterprise data management functions to coordinate data across business units. We will also see smarter approaches such as information lifecycle management as opposed to the common approach of throwing more hardware at the growing data problem. The information management challenge will grow as millions of next-generation tech-savvy users use feeds and mash-ups to bring data together into usable parts so they can answer their own questions. This gives rise to new challenges, including data security and governance.