Machine data, sometimes called "data exhaust," is one of the fastest-growing segments of big data. It is generated by websites, applications, servers, networks, mobile devices, and other sources, increasingly including the Internet of Things. The goal is to aggregate, parse, and visualize this data (log files, scripts, messages, alerts, changes, IT configurations, tickets, user profiles, and so on) to spot trends and act on them.
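As a sketch of what "aggregate and parse" means in practice, here is a minimal Python example that turns hypothetical log lines into structured fields and counts events by severity. The log format, field names, and sample lines are illustrative assumptions, not any particular product's format:

```python
import re
from collections import Counter

# Hypothetical log lines; real machine data would stream in from files or agents.
LOG_LINES = [
    "2024-05-01T10:00:01 WARN  auth  login failed for user=alice",
    "2024-05-01T10:00:02 INFO  web   GET /index.html 200",
    "2024-05-01T10:00:03 ERROR db    connection timeout",
    "2024-05-01T10:00:04 WARN  auth  login failed for user=bob",
]

# One pattern: timestamp, severity level, source component, free-text message.
LOG_PATTERN = re.compile(r"^(?P<ts>\S+)\s+(?P<level>\w+)\s+(?P<source>\w+)\s+(?P<message>.*)$")

def parse(line):
    """Parse one log line into a dict of named fields, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

events = [e for e in (parse(line) for line in LOG_LINES) if e]
by_level = Counter(e["level"] for e in events)
print(by_level)
```

In a real deployment the counting step would feed a dashboard or alerting rule rather than a print statement.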
Machine data comes in many forms. Consider, for instance, what the Bosch Group is doing in Germany and Schneider Electric in France.
The Bosch Group has embarked on a series of initiatives across business units that make use of data and analytics to provide so-called intelligent customer offerings. These include intelligent fleet management, intelligent vehicle-charging infrastructures, intelligent energy management, intelligent security video analysis, and many more. To identify and develop these innovative services, Bosch created a Software Innovations group that focuses heavily on big data, analytics, and the “Internet of Things.”
Similarly, Schneider Electric focuses primarily on energy management, including energy optimization, smart-grid management, and building automation. Its Advanced Distribution Management System (ADMS), for example, handles energy distribution for utility companies: it monitors and controls network devices, manages service outages, and dispatches crews. ADMS gives utilities the ability to integrate millions of data points on network performance and lets engineers use analytics to monitor the network.
A new breed of startups is racing to convert this "invisible" machine data into useful performance insights by monitoring and analyzing everything from customer clickstreams, transactions, and log files to network activity and call records. The label for this type of analytics: operational or application performance intelligence.
In this post we cover a relatively low-profile big data company, Splunk, which already has more than 3,500 customers. Splunk's potential comes from its presence in the growing cloud-analytics space. Companies gathering enormous amounts of data need help making sense of it and using it to improve business efficiency, and Splunk's services give users the opportunity to get more from the information they gather.
Next best offer, next best action, interaction optimization, and experience optimization typically share a similar architecture. Machine learning and multivariate statistical analysis are at the heart of these cutting-edge behavioral analytics strategies. Firms typically use statistical tools for segmentation models, behavioral propensity modeling, and market basket analysis.
The bleeding edge in next best offer is increasingly around:
- Applying machine learning to find connections between product tastes and different affinity statements
- Developing low-latency algorithms that help show the right product at the right time to a customer
- Developing rich customer affinity profiles through a variety of feedback loops as well as third-party data sources (e.g., Facebook user demographics and its taste graph)
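Market basket analysis, mentioned above, can be sketched as a co-occurrence count over transactions, scoring item pairs by support and confidence. The baskets and thresholds below are toy assumptions for illustration; a real retailer would mine millions of baskets:

```python
from itertools import combinations
from collections import Counter

# Toy transaction data: each basket is the set of items bought together.
baskets = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
    {"bread", "milk", "diapers", "beer"},
]

n = len(baskets)
item_counts = Counter()   # how many baskets contain each item
pair_counts = Counter()   # how many baskets contain each item pair
for b in baskets:
    item_counts.update(b)
    pair_counts.update(combinations(sorted(b), 2))

# Score each rule A -> B: support = P(A and B), confidence = P(B | A).
rules = []
for (a, b), count in pair_counts.items():
    support = count / n
    confidence = count / item_counts[a]
    if support >= 0.4 and confidence >= 0.7:
        rules.append((a, b, support, confidence))
        print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
```

This is the same support/confidence logic that association-rule algorithms such as Apriori compute at scale, here reduced to pair counting for clarity.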
Targeted Offer Solutions
Here are just a few examples of analytics at work:
- Target predicts customer pregnancy from shopping behavior, thus identifying prospects to contact with offers related to the needs of a newborn’s parents.
- Tesco (UK) annually issues 100 million personalized coupons at grocery cash registers across 13 countries. Predictive analytics increased redemption rates by a factor of 3.6.
- Netflix predicts which movies you will like based on what you watched.
- Life insurance companies can predict the likelihood that an elderly policyholder will die within 18 months, in order to trigger end-of-life counseling.
- Con Edison predicts energy distribution cable failure, updating risk levels that are displayed on operators’ screens three times an hour in New York City.
Now you are interested. So what about your organization? Do you have the right toolset, dataset, skillset, and mindset for analytics? Do you want to enable end users to access their data without going through intermediaries?
The challenge facing managers in every industry is not trivial: how do you effectively derive insights from the deluge of data? How do you structure and execute analytics programs (Infrastructure + Applications + Business Insights) with limited budgets?
Procurement organizations tend to swim in data. One of the most important strategies for any best-in-class procurement organization is spend analytics. In conjunction with sourcing, category management, contract management, and purchasing, spend analytics provides a window into spend behavior to drive cost-reduction and cost-avoidance efforts.
As a result, we are seeing a lot of interest in spend BI and analytics projects. Chief Procurement Officers and other sourcing/procurement leaders at global, large, and even mid-market firms are increasingly focusing on spend data analytics as part of a new wave of spend rationalization projects.
A recent study by Nucleus Research finds that analytics pays back $10.66 for every dollar spent. The study is based on data from 60 case studies and covers investments in business intelligence, performance management, and predictive analytics. The areas where ROI showed up are not surprising: revenue, gross margin, and expenses.
Enterprises have used various metrics to track the effectiveness of business analytics. Cycle Time to Information (CTI) measures the elapsed time between the occurrence of a significant event and the moment that information becomes available to the decision maker who has to act on it. Cycle Time to Action (CTA) is a variation of this metric that measures the elapsed time between the event and the action taken on it. These metrics are useful for tracking the efficiency of a business analytics infrastructure and for eliminating manual processes to increase productivity. As the volume of data in an enterprise grows, automating data management will only become more complex.
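The CTI and CTA definitions above can be computed directly from timestamped event records. A minimal sketch, assuming hypothetical records with "occurred", "available", and "acted" timestamps:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Hypothetical event records: when the event occurred, when the information
# reached a decision maker, and when action was finally taken.
events = [
    {"occurred": "2024-05-01 09:00", "available": "2024-05-01 09:45", "acted": "2024-05-01 10:30"},
    {"occurred": "2024-05-01 11:00", "available": "2024-05-01 11:10", "acted": "2024-05-01 12:00"},
]

def minutes_between(start, end):
    """Elapsed minutes between two timestamp strings."""
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

cti = [minutes_between(e["occurred"], e["available"]) for e in events]  # Cycle Time to Information
cta = [minutes_between(e["occurred"], e["acted"]) for e in events]      # Cycle Time to Action

print(f"avg CTI: {sum(cti) / len(cti):.1f} min")
print(f"avg CTA: {sum(cta) / len(cta):.1f} min")
```

Tracking the averages over time shows whether automation is actually shortening the path from event to decision to action.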