It’s the End of the (Analytics and BI) World as We Know It

Published Originally on Wired

“That’s great, it starts with an earthquake, birds and snakes, an aeroplane, and Lenny Bruce is not afraid.” –REM, “It’s the End of the World as We Know It (and I Feel Fine)”

REM’s famous “It’s the End of the World…” song rode high on the college radio circuit back in the late 1980s. It was a catchy tune, but it also stands out because of its rapid-fire, stream-of-consciousness lyrics and — at least in my mind — it symbolizes a key aspect of the future of data analytics.

The stream-of-consciousness narrative is a tool used by writers to depict their characters’ thought processes. It also represents a change in approach that traditional analytics product builders have to embrace and understand in order to boost the agility and efficiency of the data analysis process.

Traditional analytics products were designed for data scientists and business intelligence specialists; these users were responsible for not only correctly interpreting the requests from the business users, but also delivering accurate information to these users. In this brave new world, the decision makers expect to be empowered themselves, with tools that deliver the information needed to make the decisions required by their roles and day-to-day responsibilities. They need tools that enable agility through directed, specific answers to their questions.

Decision-Making Delays

Gone are the days when the user of analytics tools shouldered the burden of forming a question and framing it according to the parameters and interfaces of the analytical product. That step would be followed by a response that needed to be interpreted, its insights gleaned and shared. Users would have to repeat this process for every follow-up question.

The drive to make these analytics products more powerful also made them difficult for business users to use. This led to a vicious cycle: the tools appealed only to analysts and data scientists, so the products became even more adapted to those experts’ needs. Analytics became the responsibility of a select group of people, and the limited population of these experts caused delays in data-driven decision making. Additionally, these experts were often isolated from the business context needed to inform their analysis.

Precision Data Drill-Downs

In this new world, the business decision makers realize that they need access to information they can use to make decisions and course correct if needed. The distance between the analysis and the actor is shrinking, and employees now feel the need to be empowered and armed with data and analytics. This means that one-size-fits-all analytics products do not make sense any more.

As decision makers look for analytics that make their day-to-day jobs successful, they will expect these new analytics tools to offer the same capabilities and luxuries that a separate analytics team provides, including the ability to ask questions repeatedly based on the responses to previous questions.

This is why modern analytics products have to support the user’s “stream of consciousness” and offer the ability to repeatedly ask questions to drill down with precision and comprehensiveness. This enables users to arrive at the analysis that leads to a decision that leads to an action that generates business value.

Stream-of-consciousness support can only be offered through new lightweight mini analytics apps that are purpose-built for specific user roles and functions, delivering information and analytics for the specific use cases that users in a particular role care about. Modern analytics products have to become combinations of such apps to empower users and make their jobs decision- and action-oriented.

Changes in People, Process, and Product

Closely related to the change in analytics tools is a change in the usage patterns of these tools. There are generally three types of employees involved in the usage of traditional analytics tools:

  • The analyzer, who collects, analyzes, interprets, and shares analyses of collected data
  • The decision maker, who generates and decides on the options for actions
  • The actor, who acts on the results

These employees act separately to lead an enterprise toward becoming data-driven, but it’s a process fraught with inefficiencies, misinterpretations, and biases in data collection, analysis, and interpretation. The human latency and error potential make the process slow and often inconsistent.

In the competitive new world, however, enterprises can’t afford such inefficiencies. Increasingly, we are seeing the need for the analyzer, decision maker, and actor to converge into one person, enabling faster data-driven actions and shorter time to value and growth.

This change will force analytics products to be designed for the decision maker/actor as opposed to the analyzer. They’ll be easy to master, simple to use, and tailored to cater to the needs of a specific use case or task.

Instant Insight

The process of analytics in the current world tends to be after-the-fact analysis of data that drives a product or marketing strategy and action.

However, in the new world, analytics products will need to provide insight into events as they happen, driven by user actions and behavior. Products will need the ability to change or impact the behavior of users, their transactions, and the workings of products and services in real time.

Analytics and BI Products and Platforms

In the traditional analytics world, analytics products tend to be bulky and broad in their flexibility and capabilities. These capabilities range from “data collection” to “analysis” to “visualization.” Traditional analytics products tend to offer different interfaces to the decision makers and the analyzers.

However, in the new world of analytics, products will need to be minimalistic. Analytics products will be tailored to the skills and needs of their particular users. They will directly provide recommendations for specific actions tied directly to a particular use case. They will provide, in real time, the impact of these actions and offer options and recommendations to the user to fine tune, if needed.

The Decision Maker’s Stream of Consciousness

In context of the changing people, process, and product constraints, analytics products will need to adapt to the needs of decision makers and their process of thinking, analyzing, and arriving at decisions. For every enterprise, a study of the decision maker’s job will reveal a certain set of decisions and actions that form the core of their responsibilities.

As we mentioned earlier, yesterday’s successful analytical products will morph into a set of mini analytics apps that deliver the analysis, recommendations, and actions that need to be carried out for each of these decisions and actions. Such mini apps will be tuned and optimized for each use case and each enterprise individually.

These apps will also empower the decision maker’s stream of consciousness. This will be achieved by emulating the decision maker’s thought process as a series of analytics layered to offer a decision path to the user. In addition, these mini apps will enable the exploration of tangential questions that arise in the user’s decision making process.

Analytics products will evolve to become more predictive, recommendation-based, and action oriented; the focus will be on driving action and reaction. This doesn’t mean that the process of data collection, cleansing, transformation, and preparation is obsolete. However, it does mean that the analysis is pre-determined and pre-defined to deliver information to drive value for specific use cases that form the core of the decision maker’s responsibility in an enterprise.

This way, users can spend more time reacting to their discoveries, tapping into their streams-of-consciousness, taking action, and reacting again to fine-tune the analysis.


Virtual Sensors and the Butterfly Effect

Originally Published on Wired.

In a 1972 talk that grew out of his work in the early 1960s, chaos theory pioneer Edward Lorenz famously asked, “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” Lorenz theorized that small initial differences in an atmospheric system could result in large and unexpected future impacts.

Similar “butterfly effects” can surface in the increasingly interconnected and complex universe of enterprise partnerships and supply-chain and cross-product relationships. It’s a world where new or evolving products, services, partnerships, and changes in demand can have unexpected and surprising effects on users and other products, services, traffic, and transactions in a company’s ecosystem.

Monitoring these complex relationships and the potentially important changes that can reverberate through an enterprise’s network calls for an interconnected system of virtual “sensors,” which can be configured and tuned to help make sense of these unexpected changes. As enterprises increasingly interface with customers, partners, and employees via apps and application programming interfaces (APIs), setting up a monitoring network like this becomes a particularly important part of data analysis.

What are Sensors?

Traditional sensors are often defined as “converters” that transform a physically measured quantity into a signal that an observer can understand. Sensors are defined by their sensitivity and by their ability to have a minimal effect on what they measure.

Physical sensors can capture aspects of the external environment like light, motion, temperature, and moisture. They’re widely used in business, too. Retailers can employ them to measure foot traffic outside or inside their stores, in front of vending machines, or around product or brand categories. Airlines use physical sensors to measure how weather patterns affect boarding and take-off delays. Using a diversity of sensors enables the definition of an environment around the usage of a product or service in the physical world.

Besides investing in traditional data processing technologies, cutting-edge enterprises map their digital world by defining and building so-called virtual sensors. Virtual sensors collect information from the intersection of the physical and digital worlds to generate and measure events that define the usage of a digital product or service. A virtual sensor could be a data processing algorithm that can be tuned and configured to generate results that are relevant for the enterprise. The generated alert notifies the enterprise of a change in the environment or ecosystem in which the user is using a product or service.

How to Build a Virtual Sensor Network

Building a network of virtual sensors for your business calls for requirements similar to those of a physical sensor system:

  • Sensitivity, or the ability to detect events and signals with configurable thresholds of severity
  • Speed, or the ability to rapidly collect and process signals to generate business-critical events
  • Diversity, or the ability to collect, collate, and combine signals from multiple sensors with the goal of generating business-critical events
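The three requirements above can be made concrete with a tiny threshold-based sensor. This is a minimal, hypothetical sketch in Python; the `VirtualSensor` class, its field names, and the traffic figures are invented for illustration and are not taken from any real monitoring product.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VirtualSensor:
    """A hypothetical virtual sensor: it watches a numeric signal and
    records an event when the signal deviates from its baseline by more
    than a configurable threshold (the sensor's sensitivity)."""
    name: str
    baseline: float   # expected "normal" value for the signal
    threshold: float  # minimum deviation that counts as an event
    events: List[str] = field(default_factory=list)

    def observe(self, value: float) -> None:
        deviation = abs(value - self.baseline)
        if deviation >= self.threshold:
            self.events.append(
                f"{self.name}: deviation {deviation:.1f} from baseline")

# Example: watch partner API traffic whose normal volume is ~100 calls/min.
sensor = VirtualSensor(name="partner-traffic", baseline=100, threshold=50)
for reading in [95, 110, 180, 102]:
    sensor.observe(reading)

print(sensor.events)  # only the 180-call spike trips the sensor
```

Diversity then comes from running many such sensors over different signals and combining their event streams into business-critical alerts.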

To begin charting the web of relationships that impacts the demand and usage of various enterprises’ products and services, businesses should determine which other products and services in the marketplace are complements, supplements, and substitutes to their own. Deep understanding of such evolving and complex relationships can help enterprises with planning partnerships.

  • Supplementary products and services enhance the experience of another product or service. For example, flat panel TVs are enhanced by wall mounts, stands, warranty services, cable services, and streaming movie services.
  • Complementary products and services work in concert with other products and services to complete the experience for the end user. Demand for car tires, for example, tends to generate demand for gasoline.
  • Substitute products and services have an inverse effect on each other’s demand; for example, two retailers offering the same selection of products to the same consumers compete as substitutes.

Understanding these relationships is the starting point of creating a network of sensors to monitor the impact of changes in traffic or transactions of an outside product or service on an enterprise’s own products and services. Detecting this change within the appropriate sensitivity can often be the difference between an enterprise’s failure or success.

Take, for example, a web portal that aggregates content from several content providers. This portal uses APIs to connect to these third parties. In many cases, these content providers are automatically queried by the aggregator, regardless of whether an end user is interested in the content. If for any reason there is a spike in usage of the portal on a particular day, this will automatically trigger spikes in the traffic for each of the content providers. Without understanding the complementary connection to the portal and the shifting demand it carries, the content providers will find it difficult to interpret the traffic spike, which will eat up resources and leave legitimate traffic unserviced.

Here’s a similar example. Let’s say a service can support 100 API calls spread among 10 partners. If this service receives an unexpected and unwanted spike in traffic from one partner that eats up half of its capacity, then it will only have 50 API calls left to distribute among the other nine partners. This in turn can lead to lost transactions and dissatisfied users.

With an awareness of the network, however, the service would understand that this one partner routinely only sends 10 calls on a normal day, and would be able to put restrictions in place that wouldn’t let the extra 40 calls eat up the capacity of other partners.
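The arithmetic in this example can be sketched as a per-partner quota check. This is a hedged illustration, assuming a total capacity of 100 calls and a historical baseline of 10 calls per partner; the partner names, the `headroom` multiplier, and the numbers are invented, not drawn from any real API gateway.

```python
CAPACITY = 100
# Normal daily call volume observed for each of the ten partners.
baselines = {f"partner-{i}": 10 for i in range(10)}

def allowed_calls(partner: str, requested: int, headroom: float = 1.5) -> int:
    """Cap a partner's calls at headroom x its baseline so one spike
    cannot consume capacity reserved for the other partners."""
    cap = int(baselines[partner] * headroom)
    return min(requested, cap)

# One partner suddenly sends 50 calls instead of its usual 10.
granted = allowed_calls("partner-3", 50)
print(granted)  # capped at 15, leaving ample capacity for the other nine
```

With such a cap in place, the spike that would have eaten half the capacity is held near the partner’s normal share.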

In these kinds of situations, virtual sensors can provide the awareness and insights into this web of interdependency, and help make sense of traffic spikes that otherwise might seem incomprehensible.

Sensor-Aware Data Investments

Building a network of physical and virtual sensors entails collecting diverse signals from a complex map of data sources and processing them to generate events that can help enterprises understand the environments around their end users. Investing in these networks enables enterprises to track and monitor external signals generated from sources that have the ability to impact the enterprise’s traffic, transactions, and overall health.

This ability, in turn, helps digitally aware businesses negate potential troubles caused by the digital butterfly effect, and take advantage of the opportunities presented by a strong grasp of what’s happening in user and partner ecosystems.

How Data Analysis Drives the Customer Journey

Originally Published on Wired

Driving down Highway 1 along the Big Sur coastline in Central California, it’s easy to miss the signs that dot the roadside. After all, the stunning views of the Pacific crashing against the rocks can be a major distraction. The signage along this windy, treacherous stretch of road, however, is pretty important — neglecting to slow down to 15 MPH for that upcoming hairpin turn could spell trouble.

Careful planning and even science go into figuring out where to place signs, whether they are for safety, navigation, or convenience. It takes a detailed understanding of the conditions and the driving experience to determine this. To help drivers plan, manage, and correct their journey trajectories, interstate highway signs follow a strict pattern in shape, color, size, location, and height, depending on the type of information being displayed.

Like the traffic engineers and transportation departments that navigate this process, enterprises face a similar challenge when mapping, building, and optimizing digital customer journeys. To create innovative and information-rich digital experiences that provide customers with a satisfying journey, a business must understand the stages and channels that consumers travel through to reach their destination. Customer journeys are multi-stage and multi-channel, and users require information at each stage to make the right decisions as they move toward their destination.

Signposts on the Customer Journey

To understand what kind of information must be provided — and when it must be supplied — it’s important to understand the stages users travel through as they form decisions to purchase or consume products or services.

  • Search: The user starts on a path toward a transaction by searching for products or services that can deliver on his or her use case
  • Discover: The user narrows down the search results to a set of products or services that meet the use case requirements
  • Consider: The user evaluates the short-listed set of products and services
  • Decide: The user makes a decision on the product or service
  • Sign up/set up: The user completes the setup or sign up required to begin using the chosen product or service
  • Configure: The user configures and personalizes the product or service, to the extent possible, to best deliver on the user’s requirements
  • Act: The user uses the product or service regularly
  • Engage: The user’s usage peaks, with significant levels of activity, transaction value, time spent in the product, and willingness to recommend the product or service to professional or personal networks
  • Abandon: The user displays diminishing usage of the product or service compared to the configuring, active, and engaged levels
  • Exit: The user ceases use of the product or service entirely

Analyzing how a customer uses information as they navigate their journey is key to unlocking more transactions and higher usage, and also to understanding and delivering on the needs of the customer at each stage of their journey.

At the same time, it’s critical to instrument products and services to capture data about usage and behavior surrounding a product or service, and to build the processes to analyze the data to classify and detect where the user is on their journey. Finally, it’s important to figure out the information required by the user at each stage. This analysis determines the shape, form, channel, and content of the information that will be made available to users at each point of their transactional journey.
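The classification step described above can be sketched as a simple rule-based function. The thresholds, field names, and stage labels below are invented for illustration; a real classifier would be tuned to each product’s own instrumentation and journey definitions.

```python
def classify_stage(sessions_per_week: float,
                   prior_sessions_per_week: float,
                   has_account: bool) -> str:
    """Guess a user's journey stage from two coarse usage signals.
    Thresholds are illustrative placeholders, not product constants."""
    if not has_account:
        return "search/discover/consider"  # pre-signup stages
    if sessions_per_week == 0:
        return "exit"                      # usage has ceased entirely
    if sessions_per_week < 0.5 * prior_sessions_per_week:
        return "abandon"                   # sharply diminishing usage
    if sessions_per_week >= 10:
        return "engage"                    # usage at its peak
    return "act"                           # regular, steady usage

print(classify_stage(12, 11, True))  # engage
print(classify_stage(2, 8, True))    # abandon
```

Even a crude classifier like this lets the product decide which “signpost” information to surface next for each user.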

The highway system offers inspiration for designing an information architecture that guides the customer on a successful journey. In fact, there are close parallels between the various types of highway signs and the kind of information users need when moving along the transaction path.

  • Regulatory: Information that conveys the correct usage of the product or service, such as terms of use or credit card processing and storage features
  • Warning: Information that offers “guardrails” to customers to ensure that they do not go off track and use the product in an unintended, unexpected way; examples in a digital world include notifications to inform users on how to protect themselves from spammers
  • Guide: Information that enables customers to make decisions and move ahead efficiently; examples include first-run wizards to get the user up and running and productive with the product or service
  • Services: Information that enhances the customer experience, including FAQs, knowledge bases, product training, references, and documentation
  • Construction: Information about missing, incomplete, or work-in-progress experiences in a product that enable the user to adjust their expectations; this includes time-sensitive information designed to proactively notify the user of possible breakdowns or upcoming changes in their experience, including maintenance outages and new releases

Information Analytics

Information analytics is the class of analytics designed to derive insights from data produced by end users during their customer journey. Information analytics provides two key insights into the data and the value it creates.

First, it enables the identification of the subsets of data that drive maximum value to the business. Certain data sets in the enterprise’s data store are more valuable than others and, within a data set, certain records are more valuable than others. Value in this case is defined by how users employ the information to make decisions that eventually and consistently drive value to the business.

For example, Yelp can track the correlation between a certain subset of all restaurant reviews on their site and the likelihood of users reading them and going to the reviewed restaurants. Such reviews can then be automatically promoted and ranked higher to ensure that all users get the information that has a higher probability of driving a transaction—a restaurant visit, in this case.
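A ranking like the one described can be sketched in a few lines. This is an illustrative toy, not Yelp’s actual system; the review records and conversion counts are made up.

```python
# Each record tracks how often a review was read and how often a
# restaurant visit followed (hypothetical instrumentation).
reviews = [
    {"id": "r1", "reads": 1000, "visits_after": 40},
    {"id": "r2", "reads": 200,  "visits_after": 30},
    {"id": "r3", "reads": 500,  "visits_after": 5},
]

def conversion_rate(review: dict) -> float:
    """Fraction of readers who went on to transact after reading."""
    return review["visits_after"] / review["reads"]

# Promote reviews with the highest read-to-visit conversion first.
ranked = sorted(reviews, key=conversion_rate, reverse=True)
print([r["id"] for r in ranked])  # ['r2', 'r1', 'r3']
```

Note that raw read counts alone would rank `r1` first; ranking by conversion surfaces `r2`, the record most likely to drive a transaction.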

Second, information analytics enables businesses to identify the customer segments that use information to make decisions that drive the most business transactions. Understanding and identifying such segments is extremely important, as it enables the enterprise to not only adapt the information delivery for the specific needs of the customer segment but also price and package the information for maximum business value.

For example, information in a weather provider’s database in its raw form is usable by different consumers for different use cases. However, the use of this information by someone planning a casual trip is very different from its use by a commodities trader betting on future commodity prices. Understanding the value a user derives from the enterprise’s information is key to appropriate pricing and value generation for the enterprise.

Information Delivery

Mining and analyzing how users access information is critical to identifying, tracking, and improving key performance indicators (KPIs) around user engagement and user retention. If the enterprise does not augment the product experience with accurate, timely, and relevant information (according to the user’s location, channel and time of usage), users will be left dissatisfied, disoriented, and disengaged.

At the same time, a user’s information access should be mined to determine the combination of information, channel, and journey stage that drives value to the enterprise. Enterprises need to identify such combinations and promote them to all users of the product and service and subsequently enable a larger portion of the user base to derive similar value.

Mining the information access patterns of users can enable enterprises to build a map of the various touch points on their customer’s journey, along with a guide to the right information required for each touchpoint (by the user or by the enterprise) in the appropriate form delivered through the optimal channel. Such a map, when built and actively managed, ends up capturing the use of information by customers in their journey and correlates this with their continued engagement with — or eventual abandonment of — the product.

Enabling successful journeys for customers as they find and use products and services is critical to both business success and continued customer satisfaction. Contextual information, provided at the right time through the right channel to enable user decisions, is almost always the difference between an engaged user and an unsatisfied one — and a transaction that drives business value.

Delight, the Awesome Product Metric That Rules Them All

Published Originally on Entrepreneur.com

Product success can be measured in numerous ways, including the rate of user signups, the number of popular features, the frequency of use and the duration of sessions. But the one metric that’s hardest to measure but most significant is delight.

In short, delight produces long-lasting loyalty and passion in users. It persuades and convinces them to not only continue using a product but also encourage everyone around to do so, too.

Delightful products stand out from the competition. Often, such products have little to no advertising because it’s not needed. These products are characterized by ease of discovery, learning, use and reuse. Delightful products are talked about, tweeted about, shared and benefit from extensive word-of-mouth.

Members of a development team should understand what delight looks like. They need to postulate, hypothesize and understand what it would mean. They should determine how to detect the difference between a delighted user and an indifferent one.

The raison d’être of any product should be delighting the customer. The faster a product achieves this goal, the sooner it embeds itself into a user’s work flow, creates a sticky consumer experience and makes it hard for the customer to walk away.

The moment when a user is delighted for the first time directly maps to when that person could be considered likely to convert into an engaged customer. Engagement is that point when the user has bought into the value proposition of the product and adopts it as a means to solving his or her problems.

Delight causes users to be transformed into a company’s forward-marketing team. Fueled by euphoria, these users talk about the product to friends and family and on social media, and their thoughts are circulated across their networks.

That same passion encourages customers to join the company’s user communities, contributing best practices and support techniques to other users. Delighted users share the capabilities of a product that’s pleased them (similar to cheat codes in gaming). This, in turn, spreads the delight to other users.

Creating sticky experiences.

If you’re not sure which features are pleasing users, this doesn’t mean there’s no delight.

You might simply be missing the feedback loop that’s required to capture that delight. Understand which types of features are delighting users and which are not, and diagnose the root cause for delight or the lack thereof. It could be that you’re targeting the wrong category of users, that your market is changing or that a new, unsupported use case is developing that your product is primed to serve.

Delight can win back users who abandoned your product or keep those on the brink of bailing as active users. Understanding what delights users is a great way of ensuring that other features can leverage these insights in the quest to be delightful. Piggybacking on top of delightful features (by connecting new features to proven ones) can make the whole product better.

Resolving problems.

It’s important to track problems, issues and outages. When users encounter problems that prevent them from completing what they have in mind, or when the product does not live up to its marketing promise and only barely delivers on customers’ needs, the inverse of delight happens.

Understand whether a consumer’s usage of a product drops after an outage or whether a change in an opinion coincides with a bug.

No products are devoid of issues. But building delightful features (and focusing on this metric) leads to an insurance policy of sorts. Delighted users are more likely to forgive mistakes or outages. Take a popular service like Gmail or Facebook. Outages happen but the delight factor that these products bring prompts users to easily forget them.

Measuring and optimizing for success.

The faster a user becomes delighted with your product, the more likely he or she is to stick with it and look beyond any outages and problems. This is how companies like Apple have ended up with fanatical users who wait for weeks in line to get their hands on the next product. Users do seemingly unexplainable things when fueled by passion and delight.

So how does one measure delight? Start with auditing the capabilities of your product and identifying the set of features that map directly to its core value. Measure usage of these features, social mentions, reviews and support questions.
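One way to combine the signals just listed is a per-feature score with hand-picked weights. This is a purely hypothetical sketch: the feature names, weights, and `delight_score` formula are invented to illustrate the idea of aggregating usage, sentiment and friction signals, not a proven metric.

```python
def delight_score(usage_rate: float, positive_mentions: int,
                  support_tickets: int) -> float:
    """Toy delight score: higher usage and positive social mentions
    raise the score; support tickets (friction) lower it.
    Weights are illustrative placeholders."""
    return 2.0 * usage_rate + 0.5 * positive_mentions - 1.0 * support_tickets

# Score two hypothetical features from their observed signals.
features = {
    "one-click export": delight_score(0.8, 12, 1),
    "bulk import":      delight_score(0.3, 1, 9),
}
print(max(features, key=features.get))  # one-click export
```

Tracking such a score over time, per feature, gives the feedback loop the previous section said is often missing.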

If delight is not spotted, you may have one of two problems. Either the set of features that you believe are central to your value proposition are not the right ones or you’re measuring delight incorrectly. Go back and understand if you’re addressing the needs of the users who matter and whether you have the right sensors in place to learn if these consumers are delighted.