
Posts tagged ‘Databases’

25 May

Big Data Analytics Use Cases


Are you data-flooded, data-driven, data-informed? Are you outcome-oriented, insight-driven or hindsight-driven?

Are you a firm where executives claim, “Data is our competitive advantage,” or spout analogies like “data is the new oil”?

The challenge I found in most companies is not a dearth of vision… everyone has a strategy and a 100,000-ft general view of the importance and value of data. Every executive can parrot the importance of data and being data-driven.

The challenge is the next step. How are you going to create new data products? How are you going to execute a data-driven strategy? How are you going to monetize data assets? What are the right business use cases to focus on? How do you map the use case to underlying models and data requirements? What platform is a good long-term bet? The devil is in these details.

Everyone is searching for new ways to turn data into $$$ (monetize data assets). Everyone is looking for new levers to extract value from data. But data ingestion and modeling are simply a means to an end. The end is not just more reports, dashboards, heatmaps, knowledge, or wisdom. The target is fact-based decisions, guided machine learning and actions. Another target is arming users to do data discovery and insight generation without involving IT teams: so-called user-driven business intelligence.

In other words, what is the use case that shapes the context for “Raw Data -> Aggregated Data -> Intelligence -> Insights -> Decisions -> Operational Impact -> Financial Outcomes -> Value Creation”? What are the right use cases for the emerging hybrid data ecosystem (with structured and unstructured data)?
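As a toy illustration of that chain, the steps from raw data to operational impact can be sketched in a few lines of Python. Every store, SKU, and threshold below is hypothetical, chosen only to make each stage of the chain concrete:

```python
# Toy sketch of the "Raw Data -> Aggregated Data -> Insights -> Decisions"
# chain. All stores, SKUs and thresholds are invented for illustration.

raw_events = [
    {"store": "A", "sku": "widget", "units": 3},
    {"store": "A", "sku": "widget", "units": 5},
    {"store": "B", "sku": "widget", "units": 1},
]

# Raw -> Aggregated: roll the event stream up by store.
aggregated = {}
for e in raw_events:
    aggregated[e["store"]] = aggregated.get(e["store"], 0) + e["units"]

# Aggregated -> Insight: flag underperforming stores (toy threshold).
insights = {store: units < 2 for store, units in aggregated.items()}

# Insight -> Decision: trigger an operational action for flagged stores.
decisions = [f"Run promotion at store {s}" for s, flagged in insights.items() if flagged]
print(decisions)
```

The point of the sketch is that each arrow in the chain is a transformation with a business meaning, not just a data movement step.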

Read more »

23 May

Data Monetization: Turning Data into $$$


The billion-dollar question facing executives everywhere:

  • How do I monetize my data? How do we turn data into dollars?
  • What small data or big data monetization strategies should I adopt?
  • Which analytical investments and strategies really increase revenue?
  • What pilots should I run to test data monetization ideas out?

Data monetization is the process of converting data (raw or aggregated) into something useful and valuable that helps make decisions (such as predictive maintenance) based on multiple sources of insight. Data monetization creates opportunities for organizations with significant data volume to leverage untapped or under-tapped information and create new sources of revenue (e.g., cross-sell and upsell lift, or prevention of equipment breakdowns).

But, data monetization requires a new IT clock-speed that most firms are struggling with. Aberdeen Research found that the average time it takes for IT to complete BI support requests, with traditional BI software, is 8 days to add a column to a report and 30 days to build a new dashboard.  For an individual information worker trying to find an answer, make a decision, or solve a problem, this is simply untenable. For an organization that is trying to differentiate itself on information innovation or data-driven decision making, it is a major barrier to strategy execution.

To speed up insight generation and decision making (all elements of data monetization), business users are bypassing IT and investing in data visualization (Tableau) or data discovery platforms (QlikView). These platforms help users ask and answer their own stream of questions and follow their own path to insight. Unlike traditional BI, which provides dashboards, heatmaps and canned reports, these tools provide a discovery platform rather than a pre-determined path.

Also, companies like Marketo, which creates marketing automation software, are getting into the customer engagement and data monetization game. Their focus is to enable marketing professionals to find more future customers; to build, sustain, and grow relationships with those buyers over time; and to cope with the sheer pace and complexity of engaging with customers in real time across the web, email, social media, online and offline events, video, e-commerce storefronts, mobile devices and a variety of other channels. In many companies, marketing knits these digital interactions together across multiple disconnected systems. The ability to interact seamlessly with customers across multiple fast-moving digital channels requires an engagement strategy enabled by data and analytic insights.

Read more »

18 Mar

Data-as-a-Service (DaaS)



If the analytics team wrestles with getting access to data, how timely are the insights?

To address the question, global CIOs are shifting their strategy – “I need to build a data-as-a-service offering for my data” – to enable the analytics users in the organization. The more advanced CIOs are asking, “How should I build data science capabilities as a shared foundation service?”

The CIO challenge is not trivial. Successful organizations today operate within application and data ecosystems which extend across front-to-back functions (sales & marketing all the way to fulfillment and service) and well beyond their own boundaries. They must connect digitally to their suppliers, partners, distributors, resellers, regulators and customers. Each of these has its own “data fabric” and applications which were never designed to connect, so with all the data-as-a-service and big data rhetoric, the application development community is being asked to “work magic” in bringing them together.

Underutilization and the complexity of managing growing data sprawl are not new. But the urgency to address them has increased dramatically over the last several years. Data-as-a-Service (DaaS) is seen as a big opportunity for improving IT efficiency and performance through centralization of resources. DaaS strategies have multiplied in the last few years with the maturation of technologies such as data virtualization, data integration, MDM, SOA, BPM and Platform-as-a-Service.

The questions accelerating the Data-as-a-Service (DaaS) trend: How to deliver the right data to the right place at the right time? How to “virtualize” the data often trapped inside applications? How to support changing business requirements (analytics, reporting, and performance management) in spite of ever-changing data volumes and complexity?
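One way to picture “virtualizing the data trapped inside applications” is a thin service facade that assembles one logical record from several application silos on demand. Everything here – the silo contents, the field names, and the `get_customer_view` function – is a hypothetical sketch, not any vendor’s actual DaaS API:

```python
# Hypothetical data-as-a-service facade: two application "silos" exposed
# through one logical query interface, so consumers never touch the silos
# directly. All data and field names are invented for illustration.

CRM_SILO = {101: {"name": "Acme Corp", "segment": "enterprise"}}
BILLING_SILO = {101: {"balance": 2500.0, "currency": "USD"}}

def get_customer_view(customer_id):
    """Return a unified customer record assembled on demand from both silos."""
    crm = CRM_SILO.get(customer_id, {})
    billing = BILLING_SILO.get(customer_id, {})
    return {"customer_id": customer_id, **crm, **billing}

view = get_customer_view(101)
print(view)
```

The design point: consumers depend on the unified view, so the silos behind the facade can be moved, consolidated or replaced without breaking them.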

Read more »

20 Aug

Innovation and Big Data: A Roadmap


The bleeding edge of data and insight innovation is around the next-generation digital consumer experience. Consumer behaviors are rapidly evolving: always connected, always sharing, always aware. Obviously, new technology like big data drives and transforms consumer behavior and empowerment.

With the influx of money, attention and entrepreneurial energy, there is a massive amount of innovation taking place to solve data centric problems (such as the high cost of collecting,  cleaning, curating, analyzing, maintaining, predicting) in new ways.

There are two distinct patterns in data-centric  innovation:

  • Disruptive innovation, like predictive search, which brings a very different value proposition to tasks like discover, engage, explore and buy, and/or creates new markets
  • Sustaining innovation, like mobile dashboards, visualization or data supply chain management, which improves self-service and the performance of existing products and services

With either pattern the managerial challenge is moving from big picture strategy to day-to-day execution.  Execution of big data or data-driven decision making requires a multi-year evolving roadmap around toolset, skillset, dataset, and mindset.

Airline loyalty programs are a great example of multi-year evolving competitive roadmaps. Let’s look at BA’s Know Me project.

British Airways “Know Me” Project

British Airways (BA) has focused on competitiveness via customer insight. It has petabytes of customer information from its Executive Club loyalty program and its website. BA decided to put customer big data to work in its Know Me program, whose goal is to understand customers better than any other airline and to put the customer insight accumulated across billions of touch points to work.

BA’s Know Me program is using data and applying it to customer decision points in the following ways:

  • Personal recognition—Recognizing customers for being loyal to BA, and expressing appreciation with targeted benefits and recognition activities
  • Personalization—Based on irregular disruptions, such as being stuck on a freeway due to an accident, a pre-emptive text message: “We are sorry that you are missing your flight departure to Chicago. Would you like a seat on the next one at 5:15PM? Please reply Yes or No.”
  • Service excellence and recovery—BA tracks the service it provides to its customers and aims to keep it at a high level. Given air travel’s constant problems and disruptions, BA wants to understand what problems its customers experience, and do its best to recover a positive overall result
  • Offers that inspire and motivate—BA’s best customers are business travelers who don’t have time for irrelevant offers, so the Know Me program analyzes customer data to construct relevant and targeted “next best offers” for their consideration
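In caricature, the pre-emptive rebooking text above reduces to a single rule: if the customer’s ETA means a missed departure, draft the offer. The flight times, the `rebooking_offer` function, and the message wording below are invented for illustration; BA’s actual system is obviously far richer:

```python
from datetime import datetime, timedelta

def rebooking_offer(now, departure, eta_to_airport, next_departure, city):
    """Draft a pre-emptive offer if the customer's ETA means a missed flight."""
    if now + eta_to_airport > departure:
        return (f"We are sorry that you are missing your flight departure to {city}. "
                f"Would you like a seat on the next one at {next_departure:%I:%M%p}? "
                "Please reply Yes or No.")
    return None  # customer will make it; no message needed

# Hypothetical scenario: customer stuck on the freeway, one hour out.
now = datetime(2013, 5, 1, 15, 0)
msg = rebooking_offer(
    now,
    departure=datetime(2013, 5, 1, 15, 30),
    eta_to_airport=timedelta(hours=1),
    next_departure=datetime(2013, 5, 1, 17, 15),
    city="Chicago",
)
print(msg)
```

The hard part in practice is not this rule but feeding it: fusing location, traffic, booking and inventory data in real time.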

The information to support these objectives is integrated across a variety of systems, and applied in real-time customer interactions at check-in locations and lounges. Even on BA planes, service personnel have iPads that display customer situations and authorized offers. Some aspects of the Know Me program have already been rolled out, while others are still under development.

The Need for New Data Roadmaps

New IT paradigms (cloud resident apps, mobile apps, multi-channel, always-on etc.) are creating more and more complex integration landscapes with live, “right-now” and real-time data. With data increasingly critical to business strategy, the problems of poor quality data,  fragmentation, and lack of lineage are also taking center stage.

The big change taking place in the application landscape: application owners of the past expected to own their data. However, applications of the future will leverage data – a profound change that is driving the data-centric enterprise.  The applications of the future need one “logical” place to go that provides the business view of the data to enable agile assembly.

Established and startup vendors are racing to fill this new information management void. The established vendors are expanding their current enterprise footprint by adding more features and capabilities. For example, the Oracle BI stack (hardware – databases – platform – prebuilt content) illustrates the data landscape changes taking place from hardware to mobile BI apps. Similar stack evolution is being followed by SAP AG, IBM, Teradata and others. The startup vendors are typically building around disruptive technology or niche point solutions.

To enable this future of information management,  there are three clusters of “parallel” innovation waves: (1) technology/infrastructure centric; (2) business/problem centric; and (3) organizational innovation.

IBM summarized this wave of innovation in an Investor Day slide.

Data Infrastructure Innovation

  • Data sources and integration — Where does the raw data come from?
  • Data aggregation and virtualization — Where is it stored and how is it retrieved?
  • Clean high quality data — How does the raw data get processed in order to be useful?
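The “clean high quality data” bullet above can be made concrete with a minimal sketch: raw records typically need de-duplication, normalization, and filtering before they are useful downstream. All field names and records here are hypothetical:

```python
# Hypothetical raw feed: duplicates, inconsistent casing, and a bad record.
raw = [
    {"id": 1, "email": "Alice@Example.com"},
    {"id": 1, "email": "alice@example.com"},  # duplicate of id 1
    {"id": 2, "email": ""},                   # missing email -> drop
    {"id": 3, "email": "BOB@example.com"},
]

seen, clean = set(), []
for rec in raw:
    email = rec["email"].strip().lower()      # normalize casing and whitespace
    if not email or rec["id"] in seen:
        continue                              # filter invalid rows, de-duplicate by id
    seen.add(rec["id"])
    clean.append({"id": rec["id"], "email": email})

print(clean)
```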

Even on the technology/infrastructure-centric side there are multiple paths of disruptive innovation taking place along different technology stacks, shown below.

Read more »

28 Feb

Procter & Gamble – Business Sphere and Decision Cockpits



Data-driven DNA is about having the right toolset, mindset, skillset and dataset to evolve a major brand and seize today’s omni-channel opportunities. Whether it’s retooling and retraining for the multiscreen attention economy, or introducing digital innovations that transform both retail and healthcare, P&G is bringing data into every part of its core strategies to fight for the customer.

—————————

Striving for market leadership in consumer products is a non-stop managerial quest.  In the struggle for survival, the fittest win out at the expense of their rivals because they succeed in adapting themselves best to their environment. 

CMOs and CIOs everywhere agree that analytics is essential to sales & marketing and that its primary purpose is to gain access to customer insight and intelligence along the market funnel – awareness, consideration, preference, purchase and loyalty.

In this posting we illustrate a best-in-class “run-the-business” data/analytics case study at P&G. The case study demonstrates four key characteristics of data market leaders:

  1. A shared belief that data is a core asset that can be used to enhance operations, customer service, marketing and strategy
  2. More effective leverage of more data – corporate, product, channel, and customer – for faster results
  3. Technology is only a tool; it is not the answer
  4. Support for analytics by senior managers who embrace new ideas and are willing to shift power and resources to those who make data-driven decisions

This case study of a novel construct called Business Cockpit (also called LaunchTower in the Biotech and Pharmaceutical Industry) illustrates the way Business Analytics is becoming more central in retail and CPG decision making.

Here is a quick summary of P&G’s analytics program:

  • Primary focus on improving management decisions at scale – P&G analyzed the time gap between information and its application to decision making
  • “Information and Decision Solutions” (IT) embeds over 300 analysts in leadership teams
  • Over 50 “Business Sphere” rooms for executive information viewing and decision-making
  • “Decision cockpits” on 50,000 desktops
  • 35% of the marketing budget spent on digital
  • Real-time social media sentiment analysis for “Consumer Pulse”
  • Focus on how to best apply and visualize information instead of debating the validity of data
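The “Consumer Pulse” sentiment bullet above can be caricatured with a toy keyword scorer. Production systems use trained models over huge social streams; the word lists, posts, and `sentiment` function here are all invented:

```python
# Toy keyword-based sentiment scorer in the spirit of a "Consumer Pulse"
# dashboard. Real systems use trained models; all words and posts are invented.

POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "broken", "awful"}

def sentiment(post):
    """Score a post: +1 per positive word, -1 per negative word."""
    words = [w.strip(",.!?") for w in post.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["Love the new Tide pods, great scent", "My razor arrived broken, awful"]
scores = [sentiment(p) for p in posts]
print(scores)
```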

P&G Overview

Read more »

22 Jan

Data Scientist Infographic & Managed Analytics


The exploding demand for analytics professionals has exceeded all expectations, and is driven by the Big Data tidal wave.  Big data is a term commonly applied to large data sets where volume, variety, velocity, or multi-structured data complexity are beyond the ability of commonly used software tools to efficiently capture, manage, and process.

To get value from big data, ‘quants’ or data scientists are becoming analytic innovators who create tremendous business value within an organization, quickly exploring and uncovering game-changing insights from vast volumes of data, as opposed to merely accessing transactional data for operational reporting.

This EMC infographic summarizing their data scientist study supports my hypothesis: data is becoming the new oil, and we need a new category of professionals to handle the downstream and upstream aspects of drilling, refining and distribution. Data is one of the most valuable assets within an organization. With business process automation, the amount of data being generated, stored and analyzed by organizations is exploding.

Following up on our previous blog post – Are you one of these: Data Scientist, Analytics Guru, Math Geek or Quant Jock? – I am convinced that future jobs are going to be centered around the “Raw Data -> Aggregate Data -> Intelligence -> Insight -> Decisions” data chain. We are simply industrializing the chain as machines and automation take over the lower end of the spectrum. Also, Web 2.0 and social media are creating an interesting data feedback loop: users contribute to the products they use via likes, comments, etc.

CIOs are faced with the daunting task of unlocking the value of their data efficiently in the time-frame required to make accurate decisions. To support the CIOs, companies like IBM are attempting to become a one-stop shop via a rapid-fire $14-billion-plus acquisition strategy: Cognos, Netezza, SPSS, ILOG, Solid, Coremetrics, Algorithmics, Unica, Datacap, OpenPages, Clarity Systems, Emptoris and DemandTec (for retail). IBM also has other information management assets like Ascential, FileNet, Watson and DB2. They are building a formidable ecosystem around data, and they see a $20-billion-per-year opportunity in managing the data, understanding the data and then acting on the data.

Read more »

19 Dec

Big Data Investment Theme – Fidelity Investments


Fidelity Investments put out an interesting analysis of big data as a macro investment theme for clients. Since everyone has an underperforming investment portfolio in the current market, I have reproduced the article here to generate some ideas.

Fidelity Investments

Key Takeaways

  • New types of large data sets have emerged because of advances in technology, including mobile computing, and these data are being examined to generate new revenue streams.
  • More traditional types of business data have also expanded exponentially, and companies increasingly want and need to analyze this information visually and in real time.
  • Big data will be driven by providers of Internet media platforms, data amalgamation applications, and integrated business software and hardware systems.

Investment Theme – Big Data

The concept of “big data” generally refers to two concurrent developments. First, the pace of data accumulation has accelerated as a wider array of devices collect a variety of information about more activities: website clicks, online transactions, social media posts, and even high-definition surveillance videos.

A key driver of this flood of information has been the proliferation of mobile computing devices, such as smartphones and tablets. Mobile data alone are expected to grow at a cumulative annualized rate of 92% between 2010 and 2015 (see Exhibit 1, below). Read more »
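The 92% cumulative annualized growth rate quoted above compounds dramatically; a quick back-of-the-envelope check shows what that implies over the five growth years between 2010 and 2015:

```python
# Compound a 92% annual growth rate over the 2010-2015 window (5 growth years).
rate = 0.92
years = 5
multiple = (1 + rate) ** years
print(f"Traffic multiple over {years} years: {multiple:.1f}x")
```

In other words, the forecast implies mobile data volumes roughly 26 times larger at the end of the window than at the start.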

4 Sep

Is Your BI Project in Trouble?


Enterprise business intelligence (BI) project failure can happen for a variety of reasons: frequent scope changes against a fixed schedule constraint, unexpected and unplanned-for “must-have” requirements changes, loss of key team members onshore or offshore, chronic effort under-estimation, lack of a proper work breakdown structure, lack of QA, and so on.

Regardless of the causes, failed BI, analytics and performance management projects waste billions of dollars (and hours) each year.

Over the years, I have seen a lot of well-intentioned custom development, commercial off-the-shelf package customization – SAP, Oracle, PeopleSoft ERP, CRM, SCM – and other enterprise data warehouse projects get into trouble for a variety of reasons. Troubled projects usually need triage, recovery, and turn-around skills to straighten things out quickly.

I am afraid that BI and corporate performance management are reaching a phase in the hype cycle where we are beginning to see growing demand for troubled project recovery. It doesn’t take a genius to realize that BI/analytics project demand is growing, as it is one of the few remaining IT initiatives that can make companies more competitive. However, demand doesn’t imply project success.

Read more »

28 Aug

The Curious Case of Salesforce and Workday: Data Integration in the Cloud


The growing enterprise adoption of Salesforce SFA/CRM, Workday HR, NetSuite ERP, Oracle On Demand, Force.com for apps and Amazon Web Services for e-commerce will result in more fragmented enterprise data scattered across the cloud.

Automating the moving, monitoring, securing and synchronization of data is no longer a “nice-to-have” but a “must-have” capability.

Data quality and integration issues — aggregating data from the myriad sources and services within an organization — are CIOs’ and IT architects’ top concern about SaaS and the main reason they hesitate to adopt it (data security is another concern). They have seen this hosted data silo and data jungle problem too many times in the past. They know how this movie is likely to unfold.

Developing strategic (data governance), tactical (consistent data integration requirements) or operational (vendor selection) strategies to deal with this emerging “internal-to-cloud” data quality problem is a growing priority, in my humble opinion. Otherwise most enterprises are going to get less than optimal value from various SaaS solutions. Things are likely to get out of control pretty quickly.

Read more »

13 Aug

Analytics-as-a-Service: Understanding how Amazon.com is changing the rules


“By 2014, 30% of analytic applications will use proactive, predictive and forecasting capabilities” – Gartner forecast

“More firms will adopt Amazon EC2, EMR or Google App Engine platforms for data analytics. Put in a credit card, buy an hour’s or a month’s worth of compute and storage. Charge for what you use. No sign-up period or fee. Ability to fire up complex analytic systems. Can be a small or large player.” – Ravi Kalakota’s forecast

—————————-

Big data Analytics = Technologies and techniques for working productively with data, at any scale.

Analytics-as-a-Service is cloud-based: elastic and highly scalable, with no upfront capital expense; you pay only for what you use, and it is available on demand.

The combination of the two is the emerging new trend. Why? Many organizations are starting to think about “analytics-as-a-service” as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we are increasingly talking about leveraging private or public cloud computing to build an analytics-as-a-service model.

Analytics-as-a-Service is an umbrella term I am using to encapsulate “Data-as-a-Service” and “Hadoop-as-a-Service” strategies. It is also sexier 🙂

The strategic goal is to harness data to drive insights and better decisions faster than the competition, as a core competency. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.

Analytics is moving out of the IT function and into the business: marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from 6 weeks to 6 months to procure and configure servers, then another several months to load, configure and test software. Not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud analytics alternative is gaining traction with business users.
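The pay-for-what-you-use economics behind that shift can be made concrete with toy numbers. The server cost, hourly rate, and usage hours below are invented for illustration and are not actual vendor pricing:

```python
# Hypothetical comparison: owning servers vs renting by the hour for a
# bursty analytics workload. All figures are illustrative, not vendor pricing.

owned_server_cost = 12_000           # assumed purchase + 1 yr of ops, per server
servers_needed = 10

cloud_rate_per_server_hour = 0.50    # assumed on-demand hourly rate
analysis_hours_per_year = 400        # bursty: a few big runs, idle otherwise

owned_total = owned_server_cost * servers_needed
cloud_total = cloud_rate_per_server_hour * servers_needed * analysis_hours_per_year

print(f"Owned: ${owned_total:,}  Cloud: ${cloud_total:,.0f}")
```

The sketch only illustrates why bursty analytics favors rental economics; for an always-on workload the comparison can easily reverse.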

Read more »
