Why CIOs Should Turn To Cloud Based Data Analysis in 2015

Originally Published on DataFloq


CIOs are under tremendous pressure to quickly deliver big data platforms that can enable enterprises to unlock the potential of big data and better serve their customers, partners and internal stakeholders. CIOs who are early adopters of big data report clear advantages to seriously considering and choosing the cloud for data analysis. These CIOs make a clear distinction between business critical and business enabling systems and processes. They understand the value that the cloud brings to data analysis and exploration and how it enables the business arm to innovate, react and grow the business.

Here are the five biggest reported advantages of choosing the cloud for data analysis.

Speed – Faster Time to Market

Be it the speed of getting started with data analysis, the time it takes to stand up a software stack that can enable analysis, or the time it takes to provision access to data, a cloud-based system offers a faster boot time for the data initiative. This is music to the business's ears, as they are able to extract value from data sooner rather than later.

The cloud also offers faster exploration, experimentation, action and reaction based on data analysis. For example, a cloud-based system can be made to auto-scale based on the number of users querying the system, the number of concurrent analyses, the data entering the system and the data being stored or processed in it. Without long hardware procurement times, the cloud can often be the difference between critical data analysis that drives business growth and missed opportunities.
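
As a rough illustration of such auto-scaling logic, here is a minimal sketch in Python; the thresholds, capacities and the target_workers function are hypothetical and not tied to any particular cloud provider.

    # Hypothetical sizing rule for an auto-scaling analysis cluster.
    # All thresholds and capacities below are illustrative assumptions.

    def target_workers(concurrent_queries: int,
                       ingest_gb_per_hour: float,
                       queries_per_worker: int = 10,
                       ingest_gb_per_worker: float = 50.0,
                       min_workers: int = 2,
                       max_workers: int = 100) -> int:
        """Scale to whichever dimension (query load or ingest volume) needs more capacity."""
        by_queries = -(-concurrent_queries // queries_per_worker)            # ceiling division
        by_ingest = int(-(-ingest_gb_per_hour // ingest_gb_per_worker))
        return max(min_workers, min(max_workers, max(by_queries, by_ingest)))

    print(target_workers(concurrent_queries=37, ingest_gb_per_hour=400.0))   # -> 8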

Another consideration mentioned by CIOs is the opportunity cost of building out full scale analytics systems. With limited budgets and time, focusing on generating core business value turns out to be more beneficial than spending those resources on reinventing a software stack that has already been built by a vendor.

Extensibility – Adjusting to Change

A unique advantage of operating in the cloud is the ability to adjust to changes in the business, the industry or the competition. Dynamic enterprises introduce new products, kill underperforming ones, and invest in mergers and acquisitions. Each such activity creates new systems, processes and data sets. Having a cloud-based stack that not only scales but also offers a consistent interface reduces the problem of combining (and securing and maintaining) this data from an O(n²) web of point-to-point integrations to an O(n) problem, making it a much cheaper proposition.
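
A quick back-of-the-envelope comparison makes the difference concrete, assuming one integration per pair of systems in the point-to-point case versus one integration per system against a shared, consistent interface:

    # Point-to-point integrations between n systems grow as n*(n-1)/2,
    # while integrating every system against one consistent interface grows as n.
    def point_to_point(n: int) -> int:
        return n * (n - 1) // 2

    def via_shared_interface(n: int) -> int:
        return n

    for n in (5, 10, 20):
        print(n, point_to_point(n), via_shared_interface(n))
    # 5 -> 10 vs 5, 10 -> 45 vs 10, 20 -> 190 vs 20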

Cost – Lower, Cheaper

CIOs love the fact that cloud-based data analysis stacks are cheaper to build and operate. With no initial investment required, CIOs pay only for what they use, and if the cloud auto-scales, capacity growth plans become simpler and long term planning becomes easier to perform without the danger of over-provisioning. Required data analysis capacity is often spiky (it varies sharply over time depending on planning and competitive activities), is impacted by how prevalent the data driven culture is in an enterprise (and how that culture changes over time), and depends on the volume and variety of data sources (which can change at the rate at which the enterprise grows and maneuvers), so it is very hard for the CIO to predict required capacity. Imperfect estimates can lead to wasted resources and unused capacity.

Risk Mitigation – Changing Technological Landscape

Data analysis technologies and options are in flux. Especially in the area of big data, technologies are growing and maturing at different rates, with new technologies being introduced regularly. In addition, given the growth of these modern data processing and analysis tools and the recent activity of analytics and BI vendors, it is very clear that the capabilities currently available to the business are not addressing the pain points. There is a danger in moving in too early: adopting and depending on a certain stack might end up being the wrong decision, or leave the CIO with a high cost to upgrade and maintain the stack at the rate it is changing. Investing in a cloud based data analysis system hedges this risk for the CIO. Among the options available to the CIO in the cloud are Infrastructure as a Service, Platform as a Service and Analytics as a Service, and the CIO can choose the optimal solution depending on bigger tradeoffs and decisions beyond the data analysis use cases.

IT as the Enabler

Tasked with the security and health of data and processes, CIOs see their role changing to that of an enabler, where they are able to ensure that data and processes are protected while still maintaining control in the cloud. For example, identifying and tasking employees as data stewards ensures that a single person or team understands the structure and relevancy of various data sets and can act as the guide and central point of authority, enabling other employees to analyze and collaborate. The IT team can now focus on acting as the data management team, ensuring that feedback and business pain points are quickly addressed and that the learnings are incorporated into the data analysis pipeline.

A cloud based data analysis system also offers the flexibility to let the analysis inform the business process and workflow design. A well designed cloud based data analysis solution and its insights should be pluggable into the enterprise’s business workflow through well defined clean interfaces such as an insight export API. This ensures that any lessons learnt by IT can be easily fed back as enhancements to the business.

Similarly, a cloud based data analysis solution is better designed for harmonization with external data sources, both public and premium. The effort required to integrate external data sources and build a refresh pipeline for them is sometimes not worth the initial cost, given that the business needs to iterate with multiple such sources in its quest for critical insights. A cloud based analytics solution offers a central point for such external data to be collected. This frees up IT to focus on providing services to procure such external data sources and make them available for analysis, as opposed to providing procurement and infrastructure services to provision the data sources.

A cloud based solution also enables IT to serve as a deal maker of sorts by enabling data sharing through data evangelism. IT does not have to focus on many-to-many data sharing between the multiple sub-organizations and arms of the enterprise, but can instead serve as a data and insight publisher, focusing on the proliferation of data set knowledge and insights across the enterprise and filling a critical gap in enterprises: missed data connections and insights that go undiscovered.

The ‘Adjacent Possible’ of Big Data: What Evolution Teaches About Insights Generation

Originally published on WIRED


Stuart Kauffman introduced the “adjacent possible” theory in 2002. The theory proposes that biological systems are able to morph into more complex systems by making incremental, relatively low-energy changes in their makeup. Steven Johnson uses this concept in his book “Where Good Ideas Come From” to describe how new insights can be generated in previously unexplored areas.

The theory of the “adjacent possible” extends to the insights generation process. In fact, it offers a highly predictable and deterministic path to generating business value through insights from data analysis. For enterprises struggling to get started with analysis of their big data, the theory of the “adjacent possible” offers an easy to adopt and implement framework of incremental data analysis.

Why Is the Theory of Adjacent Possible Relevant to Insights Generation

Enterprises often embark on their big data journeys with the hope and expectation that business critical insights will be revealed almost immediately, just by virtue of being on a big data journey and building out their data infrastructure. The expectation is that insights can be generated, often within the same quarter in which the infrastructure and data pipelines have been set up. In addition, the insights generation process is typically driven by analysts who report up through the usual management chain. This puts undue pressure on the analysts and managers to show predictable, regular delivery of value, and it forces the process of insights generation to fit into project scope and delivery. However, the insights generation process is too ambiguous and too experimental to fit neatly into the bounds of a committed project.

Deterministic delivery of insights is not what enterprises find on the other side of their initial big data investment. What enterprises almost always find is that data sources are in disarray, multiple data sets need to be combined even though they are not primed for blending, data quality is low, analytics generation is slow, derived insights are not trustworthy, the enterprise lacks the agility to implement the insights, or the enterprise lacks the feedback loop to verify the value of the insights. Even when everything goes right, the value of the insights is often simply minuscule and insignificant to the bottom line.

This is the time when the enterprise has to adjust its expectations and its analytics modus operandi. If pipeline problems exist, they need to be fixed. If quality problems exist, they need to be diagnosed (data source quality vs. data analysis quality). In addition, an adjacent possible approach to insights needs to be considered and adopted.

The Adjacent Possible for Discovering Interesting Data

Looking adjacently from the data set that is the main target of analysis can uncover other related data sets that offer more context, signals and potential insights through their blending with the main data set. Enterprises can introspect the attributes of the records in their main data sets and look for other data sets whose attributes are adjacent to them. These datasets can be found within the walls of the enterprise or outside. Enterprises that are looking for adjacent data sets can look at both public and premium data set sources. These data sets should be imported and harmonized with existing data sets to create new data sets that contain a broader and crisper set of observations with a higher probability of generating higher quality insights.
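
A minimal sketch of such blending, assuming a hypothetical internal transaction data set and an adjacent external data set that share a postal_code attribute (the data and column names are invented for illustration):

    import pandas as pd

    # Illustrative only: the data sets, column names, and join key are hypothetical.
    transactions = pd.DataFrame({
        "order_id": [1, 2, 3],
        "postal_code": ["98101", "10001", "98101"],
        "amount": [120.0, 75.5, 43.2],
    })

    # An "adjacent" external data set that shares the postal_code attribute.
    demographics = pd.DataFrame({
        "postal_code": ["98101", "10001"],
        "median_income": [91000, 72000],
    })

    # Harmonize the two sets into a broader observation set for analysis.
    blended = transactions.merge(demographics, on="postal_code", how="left")
    print(blended)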

The Adjacent Possible for Exploratory Data Analysis

In the process of data analysis, one can apply the principle of the adjacent possible to uncovering hidden patterns in data. An iterative approach to segmentation analysis, with a focus on attribution through micro-segmentation, root cause analysis, change and predictive analysis, and anomaly detection through outlier analysis, can lead to a wider set of insights and conclusions to drive business strategy and tactics.

Experimentation with different attributes, such as time, location and other categorical dimensions, can and should be the initial analytical approach. An iterative approach to incremental segmentation analysis, aimed at identifying the segments to which changes in key KPIs or measures can be attributed, is a good starting point. Applying the adjacent possible here means iteratively including additional attributes to fine-tune the segmentation scheme, which can lead to insights into significant segments and cohorts, as in the sketch below. In addition, the adjacent possible theory can also help in identifying systemic problems in the business process workflow. This can be achieved by walking upstream or downstream in the business workflow and diagnosing the point at which the process breaks down or slows down, through the identification of attributes that correlate highly with the breakdown or slowdown.
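
A minimal sketch of this iterative widening of the segmentation scheme, using a hypothetical usage-event data set and revenue as the KPI (every attribute and value here is invented):

    import pandas as pd

    # Hypothetical usage events; in practice these would come from the data pipeline.
    events = pd.DataFrame({
        "region":  ["NA", "NA", "EU", "EU", "NA", "EU"],
        "device":  ["ios", "android", "ios", "ios", "android", "android"],
        "channel": ["app", "web", "app", "web", "app", "web"],
        "revenue": [10.0, 4.0, 7.0, 2.0, 9.0, 1.0],
    })

    # Start with a coarse segmentation, then iteratively add adjacent attributes.
    segmentation = []
    for attribute in ["region", "device", "channel"]:
        segmentation.append(attribute)
        kpi_by_segment = events.groupby(segmentation)["revenue"].mean()
        print(f"--- segments by {segmentation} ---")
        print(kpi_by_segment.sort_values(ascending=False))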

The Adjacent Possible for Business Context

The process of data analysis is often fraught with siloed context; that is, the analyst often does not have the full business context to understand the data, the motivation for a business driven question, or the implications of their insights. Applying the theory of the adjacent possible here means introducing collaboration into the insights generation process: inviting and including team members who each hold a slice of the business context from their point of view can lead to higher value conclusions and insights. Combining the context from each of these team members to design, verify, authenticate and validate the insights generation process and its results is the key to generating high quality insights swiftly and deterministically.

Making incremental progress in the enterprise’s insights discovery efforts is a significant and valuable method to uncover insights with massive business implications. The insights generation process should be treated as an exercise in adjacent possible and incremental insights identification should be encouraged and valued. As this theory is put in practice, enterprises will find themselves with a steady churn of incrementally valuable insights with incrementally higher business impact.

The 2+2=5 Principle and the Perils of Analytics in a Vacuum

Published Originally on Wired

Strategic decision making in enterprises playing in a competitive field requires collaborative information seeking (CIS). Complex situations require analysis that spans multiple sessions with multiple participants (that collectively represent the entire context) who spend time jointly exploring, evaluating, and gathering relevant information to drive conclusions and decisions. This is the core of the 2+2=5 principle.

Analytics in a vacuum (i.e., non-collaborative analytics), due to missing or partial context, is highly likely to be of low quality, lacking key and relevant information and fraught with incorrect assumptions. Another characteristic of non-collaborative analytics is the use of general purpose systems and tools, like IM and email, that are not designed for analytics. These tools lead to enterprises drowning in a sea of spreadsheets, context lost across thousands of IMs and emails, and an outcome that is guaranteed to be suboptimal.

A common but incorrect approach to collaborative analytics is to think of it as a post-analysis activity. This is the approach to collaboration taken by most analytics and BI products. Post-analysis publishing of results and insights is very important; however, pre-publishing collaboration plays a key role in ensuring that the generated results are accurate, informative and relevant. Analysis that terminates at the publishing point has a very short half life.

Enterprises need to think of analysis as a living and breathing story that gets bigger over time as more people collaborate. More data, new data and disparate data lead to the inclusion of more context, negating incorrect assumptions, missing or low quality data issues, and incorrect semantic understanding of the data.

Here are the most common pitfalls we have observed of analytics carried out in a vacuum.

Wasted resources. If multiple teams or employees are seeking the same information or attempting to solve the same analytical problem, a non-collaborative approach leads to wasted resources and suboptimal results.

Collaboration can help the enterprise streamline, divide and conquer the problem more efficiently and faster, with less time and manpower. Deconstructing an analytical hypothesis into smaller questions and distributing them across multiple employees leads to faster results.

Siloed analysis and conclusions. If results of analysis, insights and decisions are not shared systematically across the organization, enterprises face a loss of productivity. This lack of shared context between employees tasked with the same goals causes organizational misalignment and a lack of coherence in strategy.

Enterprises need to ensure that there is common understanding of key data driven insights that are driving organizational strategy. In addition, the process to arrive at these insights should be transparent and repeatable, assumptions made should be clearly documented and a process/mechanism to challenge or question interpretations should be defined and publicized.

Assumptions and biases. Analytics done in a vacuum is hostage to the personal beliefs, assumptions, biases, clarity of purpose and comprehensiveness of the context in the analyzer’s mind. Without collaboration, such biases remain uncorrected and lead to flawed foundations for strategic decisions.

A process around, and the freedom to challenge, inspect and reference, key interpretations and analytical decisions made en route to an insight is critical for enterprises to enable and proliferate high quality insights in the organization.

Drive-by analysis. When left unchecked, with top down pressure to use analytics to drive strategic decision making, enterprises see an uptick in what we call “drive-by analysis.” In this case, employees jump into their favorite analytical tool, run some analysis to support their argument and publish the results.

This behavior leads to another danger of analytics without collaboration. These can be instances where users, without full context and understanding of the data, its semantics and so on, perform analysis to make critical decisions. Without supervision, such analytics can lead the organization down the wrong path. Supervision, fact checking and corroboration are needed to ensure that correct decisions are made.

Arbitration. Collaboration without a process for challenge and arbitration, and without an arbitration authority, is often found (almost always at a later point in time, when it is too late) to be littered with misinterpretations, factually misaligned, or deviating from strategic patterns identified in the past.

Subject matter experts or other employees with the bigger picture, knowledge and understanding of the various moving parts of the organization need to, at every step of the analysis, verify and arbitrate on assumptions and insights before these insights are disseminated across the enterprise and used to affect strategic change.

Collaboration theory has shown that information seeking in complex situations is better accomplished through active collaboration. There is a trend in the analytics industry to think of collaborative analytics as a vanity feature, with simple sharing of results touted as collaborative analytics. However, collaboration in analytics requires a multi-pronged strategy with key processes and a product that delivers those capabilities: an investment in processes that allow arbitration, fact checking, interrogation and corroboration of analytics, and an investment in analytical products that are designed and optimized for collaborative analytics.

Data as Currency & Dealmaker

Published Originally on Apigee

The amount of data collected by companies in all sectors and markets has grown exponentially in the past decade. Businesses increasingly interact with customers through social and business networks and through apps (and APIs), and are therefore collecting data from new and diverse locations outside the walls of their enterprises and systems-of-record.

The following is a perspective on 5 ways in which data is changing how we do business in 2012 and beyond.

Data as currency and dealmaker

Similar to the discovery of oil in Texas at the turn of the last century, enterprises that have been collecting and storing data will be the ones primed to leverage their data for new opportunities and striking new business deals.

Add to this new data sources like the explosion of social data, which provides a window into your real-world and real-time customers’ behavior. The data accumulates quickly and changes frequently, and the ability to capture, analyze and derive insights from it will be key to offering true customer-centric value across companies, and even entire industries.

Data is fast becoming the de-facto currency for facilitating business deals. Enterprises will be able to command monetary and opportunistic conditions in return for providing access to their data. Google’s custom search for websites is an example. By providing indexed data and search functionality to websites, Google in return has the ability to show ads and generate revenue on the website.

We will also see the emergence of data network effects: enterprises will be able to enrich their existing data sets by requiring that other enterprises who purchase their data return (or feedback) the enriched data to augment the original data set. Enterprises sitting on the most desired data will be able to continuously add value to their existing data sets.

Collaborations through data

I believe a new model of collaboration based on data is emerging. Enterprises realize that they can partner with other enterprises and create innovative, new business value by sharing and operating with shared or semi-shared data stores.

Apart from shared data storage and processing costs, enterprises will be able to leverage faster time-to-market and build enhanced data-driven features using such collaboration models. They could use shared data stores as primary operating data backends thereby realizing near real-time data updates and availability for their products and services.

The academic world has several examples and parallels to this notion where data sets are frequently generated, updated and shared between various research projects leading to better quality and more expansive research and insights.

Data marketplaces

While the Internet is full of open data, there’s plenty of data that companies will be willing to pay for – particularly if it’s timely, curated, well aggregated, and insightful. As a result, data marketplaces are a burgeoning business. Witness Microsoft’s data platform, Thomson Reuters’ Content Marketplace strategy, Urban Mapping, and many more.

Data can be defined by attributes such as “Latency of delivery for processing”, “quality”, “sample ratio”, “dimensions”, “context”, “source trustworthiness”, and so on. As data becomes a key enabler of business, it becomes an asset that can be bid upon and acquired by the highest bidder. The price associated with acquiring data will be determined by the data attributes. For example, a real-time feed of a data source might cost more than historical data sets. Data sets at 100% sample ratio might cost more than data at lower fidelity.

The ability to access and synthesize data is a competitive edge. To gain and maintain this edge, enterprises will have to add the cost of acquiring data to their variable operating costs. At the same time, enterprises will have to protect their data as they protect other corporate assets. Protection (and insurance) against loss, theft, and corruption will be required to ensure continued success.

End users will stake claim to their data

With the rise of social networks and even with the consumerization of IT, data is also becoming more personal. We trade our personal data for services every day as we interact with Facebook and other sites.

End users, who are the generators of the data that enterprises collect and use to improve their businesses, will stake claim to their data and demand a share of the value from the enterprise. In addition, end users will demand, and gravitate towards, enterprises that give them the ability to track, view and control the data they generate for enterprises. Enterprises may have to either “forget” users because users demand it or compensate them for their data.

The jury is still out, but the tide may already have turned in this direction in Europe. Data protection regulations may allow for a “right to be forgotten” law through which users will have the right to demand that data held on them be deleted if there are “no legitimate grounds” for it to be kept. This includes when a user leaves a service or social network like Google or Facebook; the company will have to permanently delete any data that it retains.

Data disintermediation

The concept of disintermediation – removing the middlemen and vendors and giving consumers direct access to information that would otherwise require a “mediator” – has been an active topic in the information industry and gains momentum in 2012 as data becomes currency.

We will see more and more enterprises exposing their data schemas, formats and other related capabilities publicly through a common data description language and data explorer capabilities accessible by both humans and machines.

Enterprises (or their automatic agents) will be able to crawl the web (or some other data network) and discover new data sources that serve their needs. Enterprises will have the ability to walk the data models and understand the structure and schema of various data sets and understand the intricacies of using these data sources.

Measure What Matters: Six Metrics Every CDO Should Know

Published Originally on Apigee

The chief digital officer has to juggle multiple priorities, foci, and investments, ranging from within the enterprise to its edge. Having been charged with growing an enterprise’s business, CDOs need to enable their companies to successfully undergo a digital transformation. This digital transformation includes enabling the enterprise to plan, build, and maintain products as well as market to, sell to, acquire, retain, and support users through digital or digitally enhanced products and processes.

Defining a successful digital business strategy requires a deep understanding of users’ preferences and behaviors and also requires the ability to track changes in user behavior and their consumption and demand patterns. Understanding users’ preferences and behaviors requires the ability to track customer behavior over time and across channels. Tracking changes requires a set of analytics that enables the CDO to measure, at an aggregate level, the behavior of segments and micro-segments of users and also to understand, at the individual level, the current state, engagement, and problems faced by a user or a partner.

Building a digital enterprise requires the ability to track and accelerate innovation, agility, and experimentation in the enterprise. Democratizing access to data and building-block services for developers requires a systematic audit, curation, and exposure of enterprise capabilities as reusable APIs with the ability to track, monitor, and aid the usage of such services by developers and partners efficiently, quickly, and successfully.

Here are the six dimensions of an analytics plan that a CDO should build and track to enable better decision making.

Business KPIs

A CDO’s main goal is to grow the enterprise’s business. To achieve this, the CDO must track two key types of business KPIs: traditional business KPIs and digital KPIs.

Traditional Business KPIs are those that the enterprise uses to run the business, such as customers, average revenue per user (ARPU), churn, and revenue/profit.

Digital KPIs include traffic and revenue from digital touchpoints and the total and rate of acquisition of new users, customers, developers, partners, and devices. Tracking business KPIs involves tracking both the absolute numbers and the trends, which can signify changing consumption and demand patterns and serve to alert the CDO about potential problems with customer satisfaction or the services supply chain.
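
As a simple illustration of two of the traditional business KPIs mentioned above, ARPU and churn reduce to straightforward ratios; the figures below are invented.

    # Illustrative KPI arithmetic; all figures are invented.
    revenue_this_month = 1_250_000.0      # total recurring revenue
    users_start_of_month = 50_000
    users_lost_during_month = 1_200

    arpu = revenue_this_month / users_start_of_month          # average revenue per user
    churn_rate = users_lost_during_month / users_start_of_month

    print(f"ARPU: ${arpu:.2f}")                # ARPU: $25.00
    print(f"Monthly churn: {churn_rate:.1%}")  # Monthly churn: 2.4%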

Specifically, a successful CDO will:

  • set up organizational structure and processes to understand and attribute KPI changes to market, competitive, or product forces
  • define marketing and product strategies to drive usage, revenue, customer, developer, and partner acquisition and retention

CDO’s Business KPI Dashboard

Digital Transformation

Digital transformation can be defined as, and measured by, the acceleration of innovation and agility in an enterprise that ultimately leads to new, compelling user experiences and is marked by higher usage and revenue.

CDOs should measure the following aspects of digital transformation to detect organizational and personnel challenges around innovation, process roadblocks that hurt agility, and a lack of crisp product and/or platform positioning that impacts both reach and the level of partner and developer engagement.

Innovation: The ability of the enterprise to bring new, compelling products and services to market, measured in the APIs and apps delivered to users. New products can be defined as new products for existing users, products designed to attract new users, or new markets and products designed to attract users of competitive products.

Agility: The ability of the enterprise to improve its products and services, measured by the rate of improvement of apps and APIs. In other words, how quickly can an enterprise expose its services as APIs and how quickly and easily can these APIs be adopted by developers and consumed by apps?

Reach: The ability of the enterprise to attract new users, developers, and partners to their platform, products, and services, measured by the rate of new user, developer, and partner acquisition and the churn rate of these users, partners, and developers.

Time to Maturity: The time taken by APIs and apps to “go live” and be used by real users, as measured by the time from the first definition of the app or API to when it is available for consumption.

Partner and developer engagement: Developer and partner engagement with the enterprise’s platform, as measured by the rate and breadth of platform feature usage over time, including the rate of and time to success for developers and partners.

Ecosystem density: The measure of the “consumption” and “supplier” relationships that an enterprise has with other businesses (through APIs). Let’s take as an example an API that allows you to send photos to print from your mobile device. When used by and offered from services like Shutterfly, Flickr, and Instagram, for example, this API is the core of a much denser and more robust ecosystem than if it were being used solely by any single app or website.

Similarly, say an app were to consume not only the print API but also APIs that offer users related services, such as viewing photos online and creating albums and slideshows. Then that app offers a richer experience to its users than if it only offered the print API functionality. The progress and success of an enterprise’s digital transformation can be measured by the density of the ecosystem—by how connected and how integral a part the enterprise plays in its digital supply chains and how robust the complex partnership models and supply chains are.
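
One hedged way to operationalize ecosystem density is to model the consumption and supplier relationships as a graph and count the distinct partners connected to each API; the APIs, roles, and partners below are invented for illustration.

    from collections import defaultdict

    # Hypothetical API-to-partner relationships: who consumes and who supplies each API.
    relationships = [
        ("print_api", "consumer", "photo_app_a"),
        ("print_api", "consumer", "photo_site_b"),
        ("print_api", "supplier", "print_fulfillment_partner"),
        ("album_api", "consumer", "photo_app_a"),
    ]

    density = defaultdict(set)
    for api, _role, partner in relationships:
        density[api].add(partner)

    # An API connected to many distinct partners sits in a denser ecosystem.
    for api, partners in density.items():
        print(api, "ecosystem density:", len(partners))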

Specifically, a successful CDO will:

  • audit and optimize organizational setup and process efficiency regularly to understand rates of innovation and agility and identify internal roadblocks
  • commission strategies for improvement of developer and partner engagement through new products and services and better support and training
  • understand and remove bottlenecks in partner and developer onboarding, including the most common reasons for failed or prolonged onboarding processes

CDO’s Digital Transformation Dashboard

Channels

The most pronounced impact of a digital transformation is evident in the changing behavior and transaction patterns of an enterprise’s users. Digital channels sometimes replace or cannibalize traditional channels, but more often they help and enhance multi-channel transactions. As customers navigate and interact with the enterprise across multiple channels, a CDO needs to be constantly aware of the shift in those customers’ product access and acquisition patterns. This awareness feeds into strategic investment decisions across channels and often into building bridges between channels to enable easy context switching for users.

Channel awareness is turning out to be one of the key tenets of data-driven decision making for the CDO.

A successful CDO will:

  • define product strategies to enable better cross-channel usage of your products and services
  • define and design channel-specific workflows and cross-channel workflows to adapt to end user usage patterns
  • track
    • traffic and revenue by channel
    • the most common channels where high value transactions begin and end
    • transactions that transcend multiple channels
    • users that use multiple channels to start and end transactions
    • users that shift and change the channels through which they interact with the enterprise

CDO’s Channel Tracking Dashboard

Apps and APIs

The CDO brings the app and API revolution to the enterprise by exposing old and arcane services as reusable, lightweight, and accessible APIs, designed to be consumed by lightweight and purpose-built apps.

CDOs track app and API metrics to understand and track the adoption, engagement, and usage of their products and services and to determine, optimize, and fine-tune investment decisions. Metrics include revenue, traffic, QoS, unique users, ratings of apps (consumer) and APIs (partner/developer), active apps, devices, and geo-distribution of traffic and revenue.

A successful CDO will:

  • track and understand trends and changes in KPIs, including unique users, usage, and app ratings, to implement product strategies to build better products
  • use KPIs as an impetus to explore new market and customer segment opportunities and make timely investment decisions

CDO’s Apps and APIs Dashboard

Developers and Partners

Developers and partners are the channels to grow the enterprise. A healthy developer community and a diverse partner ecosystem is a sign of a thriving enterprise and a leading indicator of digital success. CDOs should measure the cost and likelihood of developers and partners successfully onboarding to their platform and launching new and innovative apps that are desired and used by users. Metrics such as cost of developer acquisition (CODA), partner onboarding success rate, and partner onboarding time are key to tracking the health of the developer and partner community. At any point, the CDO should have information about the revenue and traffic from a partner/developer, the QoS experienced by the partner/developer, apps built by these partners and developers, and the unique users delivered via these apps. This information is used by the CDO to fine-tune developer/partner onboarding processes and to craft marketing strategies that attract, retain, and engage developers and partners.
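
A small sketch of how a few of these metrics (onboarding success rate, onboarding time, and CODA) could be computed from onboarding records; the field names and figures are hypothetical.

    from datetime import date

    # Hypothetical onboarding records: (partner, started, went_live, acquisition_cost).
    onboardings = [
        ("dev_a", date(2015, 1, 5),  date(2015, 1, 19), 800.0),
        ("dev_b", date(2015, 1, 12), None,              650.0),   # never launched
        ("dev_c", date(2015, 2, 2),  date(2015, 2, 16), 720.0),
    ]

    launched = [o for o in onboardings if o[2] is not None]

    success_rate = len(launched) / len(onboardings)
    avg_onboarding_days = sum((o[2] - o[1]).days for o in launched) / len(launched)
    coda = sum(o[3] for o in onboardings) / len(launched)   # cost of developer acquisition

    print(f"Onboarding success rate: {success_rate:.0%}")
    print(f"Average onboarding time: {avg_onboarding_days:.0f} days")
    print(f"CODA: ${coda:.2f} per launched developer")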

A successful CDO will:

  • define strategies to reduce cost of developer and partner acquisition and onboarding
  • remove bottlenecks and provide a better developer experience to strengthen platform adoption

CDO’s Developers and Partners Dashboard

News: Internal, Ecosystem, and External

Last but not least, CDOs need to stay abreast of relevant news and information that impacts their industry, ecosystem, enterprise, organization, or specific app or API team. CDOs should track how their apps and APIs are being talked about on social media, and listen to learn about product or service issues that are likely to cause dissatisfaction for developers, partners, and users. In addition, they should closely track how new releases and versions of their services, APIs, and apps impact usage and revenue.

A successful CDO will:

  • manage the perception of a business’ products and services on social media and arrest and address negative trends and user and developer dissatisfaction
  • design and implement market and competitive research pipelines to uncover new trends and changing end-user behavior patterns

CDO’s News & Releases Dashboard

Conclusion

A CDO is tasked with a challenging job: to be the chief digital strategist for the enterprise, and to shake up an enterprise and make it digitally relevant and able to successfully adapt to changing user preferences, behavior, expectations, and access patterns. A data-driven approach and culture is the best asset that a CDO can nurture in the enterprise to make objective decisions and track the impact of those decisions and actions.

If you are a CDO, we would love to hear from you about analytics and other techniques that you are using to bring about the digital revolution in your enterprise!

Big Data Funded by the Millions: Here’s 3 Ways to Turn Data Science into Business

Published Originally on SiliconAngle with Ryan Cox

Academia.edu continues to make scientific research a more open and networked practice, raising $11.1 million in Series B funding from Khosla Ventures, Spark Capital and True Ventures today. According to AllThingsD, the San Francisco-based company now has 4.5 million registered researchers and eight million monthly visitors. Reiterating the growing interest in data-driven startups is ResearchGate, which recently raised a $35 million Series C round. Hello Big Data.

The more data science receives funding, the more this trend presents a business opportunity. The true hidden gem of data science is storytelling, and the ability to extrapolate data to tell a story that others with access to the same data simply don’t see. Seeing meaning in the data where others don’t is a true game changer in the data science and business intelligence conversation.  But how do you turn data science into profit?

Here are some quick and easy steps from Kumar Srivastava, the Product Management Lead for Apigee Insights at Apigee, on turning your data science into a business science.

3 ways to turn data science into business science

  • Arm your data scientists with the business context

Ensure that your data scientists do not work in isolation and that they interface and work very closely with the business owner and the product managers. Data scientists need to understand the business drivers, business critical issues, and the enterprise and product strategy.

  • Capture business state as KPIs

Push your data scientists to implement business-focused key performance indicators (KPIs) using the data that is being generated through the use of your products and services. Encourage your data scientists to fill the gaps in your instrumentation required to implement the defined KPIs.

  • Encourage insights that predict a result from a recommended action

Encourage your data scientists to deliver insights that take the following shape:

  1. Predictions of enhanced business value
  2. Demonstrated through desired movement in business KPIs when recommended actions are implemented

Data science as business science. The concepts of storytelling and culture are making their way deep into big data, with business intelligence being the key differentiator in successfully understanding the data.

About Kumar

Kumar Srivastava is the Product Management Lead for Apigee Insights and Apigee Analytics products at Apigee. Before Apigee, he was at Microsoft where he worked on several different products such as Bing, Online Safety, Hotmail Anti-Spam and PC Safety and Security services. Prior to Microsoft, he was at Columbia University working as a graduate researcher in areas such as VOIP Spam, Social Networks and Trust, Authentication & Identity Management systems.

Your Big Data Needs Some TLC

Published Originally on Wired

In this customer-driven world, more and more businesses are relying on data to derive deep insights about the behavior and experience of end users with a business’ products. Yet end user logs, while interesting, often lack a 360-degree view of the “context” in which users consume a business’ products and services. The ability to analyze these logs in the relevant context is key to getting the maximum business value from big data analysis.

Basic contextual analysis requires a little TLC: Time, Location and Channel.

Thinking within a TLC framework will simplify the identification, collection, assimilation and analysis of context and make it more value driven. Enterprises can apply TLC for better attribution and explanation of end user behavior, to identify patterns and understand profiles that generate insights, and ultimately to enable the business to deliver better, customized, personalized products, services and experiences.

So what does it mean for business owners in the app economy to give their data and analytics a dose of TLC?

Time

Does the time of day, the week, the month, or a particular event impact app usage?

The hypothesis is that certain events at a point in time and certain classes of events have a positive or negative impact on app usage.

Are there different patterns of app usage on weekends versus weekdays, or on mornings versus afternoons versus evenings? Are you a retail business hurtling towards Black Friday (the biggest shopping day of the year in the USA)? What patterns have you observed in recent years? What can you expect from your store locator app, your catalog app, your gift card and coupons app… in the days before the event and on Black Friday itself?

Are you running a Super Bowl ad? When it airs, will it drive traffic to your web and mobile apps? Will it cause a spike in API traffic?

Business executives need to understand how external events like these impact the use of their apps. The correlations and an understanding of the contexts in which the apps are used can then be used to promote or discourage certain usage of the app for maximum business impact.

  • What external events impact the use of my app?
  • Are there patterns? What types of external events impact the use of my app?
  • As users use an app over time, do their usage patterns change? Does the how/why/what of app usage change?
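
A minimal sketch of bucketing app usage by weekday/weekend and hour of day to surface such patterns; the log and its values are invented, and the dates happen to straddle a Black Friday weekend.

    import pandas as pd

    # Hypothetical app usage log; in practice this comes from instrumentation.
    log = pd.DataFrame({
        "timestamp": pd.to_datetime([
            "2014-11-24 09:15", "2014-11-28 13:40", "2014-11-28 20:05",
            "2014-11-29 11:30", "2014-11-30 10:10",
        ]),
        "sessions": [120, 940, 1310, 760, 410],
    })

    log["hour"] = log["timestamp"].dt.hour
    log["is_weekend"] = log["timestamp"].dt.dayofweek >= 5

    # Compare weekend vs. weekday and hour-of-day usage to spot event-driven spikes.
    print(log.groupby("is_weekend")["sessions"].sum())
    print(log.groupby("hour")["sessions"].sum())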

Location

Does app usage lead to cross-channel transactions such as store foot traffic or web based fulfillment?

Retailers deploy mobile apps to enable enhanced shopping experiences, and sometimes with the purpose of driving foot traffic to their stores.

Where are users before, during and after they use an app? Business owners can use information about where their apps are being used and where they are being the most effective to tune the user experience and maximize impact.

Is your store locator app used most in the vicinity of your store, or most in the vicinity of your competitors’ stores? Do users follow through and walk into your store after using the store locator app, the catalog app…?

Is there a pattern to where users are when they access a gas station app? Are they in the vicinity of a gas station and trying to find the cheapest gas? Are they trying to find the gas vendor to whose rewards program they belong? Are they in a rural setting and looking for the closest gas station?

Location information provides the app developer and service provider with context to answer questions that help chart a customer’s journey of interacting with the service provider across multiple channels and across multiple locations, allowing the identification of patterns that signify and impact the customer’s search, discovery, decision and transaction.

Business owners should be asking questions like:

  • Where are the users before and after they use the app?
  • Are users using the apps in the vicinity of retail stores? How close are they to the stores?
  • Are the users using the apps in the vicinity of competitor stores? How close are they to the stores?
  • Do users use apps and then walk into the stores? Vice versa?
  • Do multiple users use the app in the vicinity of a single store?
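
One way to approximate these “vicinity” questions is a great-circle distance check between app-session coordinates and known store locations; the coordinates and the 1 km radius below are illustrative assumptions.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometers."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    # Hypothetical store location and app-session coordinates.
    store = (47.6097, -122.3331)
    sessions = [(47.6101, -122.3300), (47.4502, -122.3088)]

    for lat, lon in sessions:
        near = haversine_km(lat, lon, *store) <= 1.0   # within 1 km of the store
        print((lat, lon), "near store:", near)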

Channels

Are online or mobile channels increasing? How do my business channels impact and improve transactions on neighboring channels?

The hypothesis is that the multiple channels of your business are symbiotic.

Does enabling one channel cannibalize, harm or improve business on other channels? Is your mobile app driving more traffic to your store… to your web site?

A powerful example of enabling business with apps, and the impact across their channels comes from Walgreens. The pharmacy chain made mobile technology a key part of its strategy and finds that half of the 12 million visits a week to its numerous online sites come from mobile devices. Additionally, Walgreens indicates that the customers who engage with Walgreens in person, online and via mobile apps spend six times more than those who only visit stores.

Some questions for business owners to ask about their channels include:

  • What is my strongest channel?
  • For multi-channel transactions – do transactions transcend multiple channels (that is, do users channel hop)?

    • Which channel is responsible for starting most transactions?
    • Which channel is responsible for successfully completing most transactions?
    • Which channel is responsible for most abandoned transactions?
  • How and what does each channel contribute to the users’ needs towards driving improved experience and transactions?
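
A sketch of how channel hopping and abandonment could be measured from transaction events; the transaction log, step names, and channels are hypothetical.

    # Hypothetical transaction events: (transaction_id, step, channel).
    events = [
        ("t1", "start", "mobile"), ("t1", "complete", "store"),
        ("t2", "start", "web"),    ("t2", "complete", "web"),
        ("t3", "start", "mobile"), ("t3", "abandon", "mobile"),
    ]

    by_txn = {}
    for txn, step, channel in events:
        by_txn.setdefault(txn, {})[step] = channel

    multi_channel = [t for t, s in by_txn.items()
                     if "complete" in s and s["start"] != s["complete"]]
    abandoned = [t for t, s in by_txn.items() if "abandon" in s]

    print("Transactions that hopped channels:", multi_channel)   # ['t1']
    print("Abandoned transactions:", abandoned)                  # ['t3']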

TLC for the User Experience

Consumers today are “always addressable”. We are increasingly surrounded by digital screens, which make us reachable at anytime, in anyplace and on any device. This leads to a new type of problem and opportunity that I like to call “screen optimization.”

Screen optimization is the opportunity and the ability of a service provider to optimize the message and content delivered according to the user’s context – time, location, channel, and position in their journey.

  • Adjust and adapt a user’s experience to their context and screen across the various digital touch points on the customer’s journey
  • Adapt the content delivered to a user’s surrounding screens (mobile device to highway billboards) according to the user’s context
  • Provide a personalized, mobile-centric experience that enables a user to orchestrate their multi-channel experience successfully
  • Enable a user to enter and experience the appropriate channel given their stage in the journey of interaction and transaction with your business

So, apply a little TLC to your data and analytics and create better, customized, personalized products, services and experiences for consumers.