Bill Schmarzo – InFocus Blog | Dell EMC Services

The Customer Journey Digital Transformation Workbook (April 4, 2018)

Digital Transformation is becoming a business mandate.

Why?

Figure 1: “Driving Business Strategies with Data Science: Big Data MBA”

Because in our evolving world, successful digital transformation will determine whether a business survives or wilts in the modern economy. Consequently, Professor Sidaoui and I felt it was critical that we prepare our University of San Francisco students for a world where digital transformation was the business norm. We sought to train our students – tomorrow’s business leaders – to embrace the “Big Data MBA” concepts in creating digital transformation-based business models.

To support this training, we created a methodology that guided the students through a digital transformation exercise. Two important digital transformation foundations surfaced as we created, tested, and refined the methodology:

  • Economics, which is the branch of knowledge focused on the production, consumption, and transfer of wealth (see the blog “Is 2018 the Tipping Point in Digital Transformation?” for more details).
  • Customer Journey, which is the source of wealth (see the blog “Don’t Follow the Money; Follow the Customer!” for more details on the critical importance of understanding your customers).

Let’s review these foundational concepts of Digital Economics and Customer Journey in more detail, before we walk through the “Customer Journey Digital Transformation” methodology.

Key Digital Economic Concepts

The first foundational concept to a successful Digital Transformation is to understand digital economic basics. The McKinsey article “Why Digital Strategies Fail” highlights how the role of technology in our economy has expanded. Additionally, the article highlights incumbent businesses that are struggling to embrace new innovations out of fear of upsetting profit streams, cannibalizing existing business lines, and disrupting established management structures.

“Most digital strategies don’t reflect how digital is changing economic fundamentals, industry dynamics, or what it means to compete.”

Figure 2: Demand and Supply Curve

Digital transformation is disrupting traditional business models by changing the economics of the business – that is, by transforming the sources of wealth creation and wealth transfer. There are some key economic concepts that we need to understand in order to prepare ourselves for digital transformation.

  • Economic Rent is any payment to an owner or factor of production in excess of the amount required by the owner to proceed with the transaction. Economic rent would not exist if markets were perfect, since competitive pressures and perfect access to market information would stabilize supply and demand. Closely related to Economic Rent, Economic Surplus is the monetary gain obtained by consumers because they are able to purchase a product for a price that is less than the highest price that they would be willing to pay.
  • Economies of Scale occur when a proportionate saving in costs, market access, distribution, and brand name recognition is gained via increased levels of production. In a digital world, the first mover advantage is not usually a single advantage, but rather a set of advantages that a company obtains by being first to develop and market a product. For example, digital renders distribution intermediaries obsolete (with limitless choice and price transparency). Digital offerings can be reproduced almost freely, instantly, and perfectly, shifting value to hyper-scale players, while driving marginal costs to zero and compressing prices.
  • Supply and Demand is the amount of a commodity, product, or service available and the desire of buyers for it, considered as factors regulating its price. Supply and demand fundamentals are upended by digital strategies. With a digitally transformed business model, companies have the scale to reach a nearly limitless customer base, use “artificial intelligence” to engineer superior levels of service, and benefit from frictionless supply chains. A digitally transformed business model can monetize its customers across traditional industry boundaries. Customer insights, more than products and content, become sources of wealth creation. Facebook and Google are major marketing players while producing no content. Uber and Airbnb sell global mobility and lodging without owning a single car or hotel.
  • Network Effect (or Metcalfe’s Law) states that the value of a telecommunications network is proportional to the square of the number of connected users of the system (n²). Metcalfe’s Law, when applied to digital, drives winner-takes-all economics. Just as sobering as the shift of profit pools to customers: when scale and network effects dominate markets, economic value rises to the top rather than being distributed across the usual (large) number of participants. (Think about how Amazon’s market capitalization towers above that of other retailers, or how the iPhone regularly captures over 90 percent of smartphone industry profits.) This means that a company whose strategic goal is to maintain share relative to peers could be doomed—unless the company is already the market leader. (A small sketch of this winner-takes-all arithmetic follows this list.)
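To see why Metcalfe-style network effects produce winner-takes-all outcomes, here is a minimal illustrative sketch. The user counts are hypothetical and the n² valuation is a stylized model, not a measured quantity:

```python
def metcalfe_value(users):
    # Metcalfe's Law: network value grows with the square of connected users.
    return users ** 2

# Hypothetical rivals: the leader has only 3x the users of the challenger...
leader, challenger = 90_000_000, 30_000_000
total = metcalfe_value(leader) + metcalfe_value(challenger)
print(f"Leader's share of combined network value: {metcalfe_value(leader) / total:.0%}")
# ...yet captures roughly 90% of the combined network value.
```

A 3x lead in users becomes a 9x lead in modeled network value, which is why a strategy of merely maintaining share against an entrenched leader is so dangerous.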

Understanding the Customer Journey Mapping

The second foundational concept to a successful Digital Transformation is to understand the customer journey.  If the customer is the source of wealth creation, then your Digital Transformation must be driven from the perspective of the customer. It is critical to focus on the customer perspective regardless of artificial industry borders.

There are several design thinking concepts and techniques to employ to help us passionately focus on the customer journey. For example, Customer Journey Mapping provides a step-by-step guide to putting the customer you serve at the center of your design process, and to come up with new answers to difficult customer problems and challenges [1].

A customer journey map depicts the stages customers go through when interacting with a company, from buying products online to accessing customer service on the phone, to airing grievances on social media. One of my favorite customer journey maps comes from “The Art of Opportunity” (see Figure 3).

Figure 3: The Art of Opportunity Customer Journey Mapping

Creating the Customer Journey Digital Transformation Workbook

Building upon digital economics and customer journey mapping, the Customer Journey Digital Transformation worksheet helps organizations identify where and how to apply their digital assets to digitally transform their key business and operational processes, products, and assets to improve efficiency, enhance customer value, manage risk, and uncover new monetization opportunities [2].

The worksheet takes participants through the following steps (a sketch of how these fields might be captured as a simple data structure follows the list):

  • Capture the Customer Event and the Customer’s Objectives for that Event. Customer Events could include planning a vacation, buying a house, or purchasing insurance.  In the business-to-business (B2B) space, Customer Events could include launching new IOT-connected devices, selling customer retention solutions, or developing new healthcare services.
  • Capture “What Does Success Look Like?” from the customer’s perspective. When the Event is completed, what does the customer want to have accomplished, and how should the customer feel about the Event?
  • Brainstorm and capture the customer’s “Impediments to Success.” What are the situations, events, or constraints that might impact the success of the event?
  • Identify the major “Value Chain Stages” that comprise that Customer Objective. See the blog “Big Data MBA: Course 101A – Unit III” for a quick refresher on Michael Porter Value Chain Analysis process.
  • What are the Decisions or Tasks that comprise each of the Value Chain stages? What needs to be accomplished within each of the Value Chain stages (and we are not focused on how that is being done at this point)?
  • What are the Metrics or Key Performance Indicators (KPIs) against which progress and successful tasks in that value chain stage will be measured?
  • What are the Recommendations we want to deliver to the customer in the execution of their key decisions or tasks? This is the stage where we start to identify how predictive and prescriptive analytics around customer, product and/or operational insights might improve the execution of the customer’s key Decisions or Tasks.
  • Identify the key Business Entities around which we want to uncover and leverage analytic insights.
  • Create a more compelling, differentiated “smart” User Experience. Think “smart” entities or “things” (insurance policies, travel itineraries, financial accounts, job resumes, college transcripts, tax returns, credit cards, medical records, passports, driver’s licenses, work visas) built using modern application development techniques (i.e., Minimum Viable Products) to integrate, learn, and continuously become more relevant by virtue of each customer interaction.
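For readers who want to operationalize the worksheet, its fields map naturally onto a simple data structure. A minimal sketch; the class and field names are my own illustrative shorthand, not an official template schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ValueChainStage:
    name: str                                                 # e.g., "Plan Vacation"
    decisions_or_tasks: List[str] = field(default_factory=list)
    metrics_kpis: List[str] = field(default_factory=list)
    recommendations: List[str] = field(default_factory=list)  # predictive/prescriptive
    supporting_data: List[str] = field(default_factory=list)

@dataclass
class CustomerJourneyWorksheet:
    customer_event: str                  # e.g., "Taking a Vacation"
    customer_objectives: List[str]
    success_criteria: List[str]          # "What does success look like?"
    impediments: List[str]
    stages: List[ValueChainStage]
    business_entities: List[str]         # entities to mine for analytic insights
    smart_experiences: List[str]         # "smart" things rendered to the customer
```

Capturing the worksheet this way makes it easy to aggregate, compare, and revisit journeys across multiple customer events later in the exercise.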

Figure 4 shows the Customer Journey Digital Transformation worksheet (Appendix A contains the Customer Journey Digital Transformation worksheet template).

Figure 4: Customer Journey Digital Transformation Worksheet (1 of 3)

Let’s walk through an example.

Customer Journey Digital Transformation Example: Taking a Vacation

Our first assignment was to apply the Customer Journey Digital Transformation to customers who are trying to “Take a Vacation.” We want to capture the following with respect to “Taking a Vacation” from the vacationer’s perspective:

  • Customer “Taking a Vacation” Objectives
  • What Does “Taking a Vacation” Success Look or Feel Like (a surprisingly interesting and effective exercise)
  • Impediments to “Taking a Vacation” Success

Next we identify the value chain stages that comprise the “Taking a Vacation” event. For our classroom exercise, we came up with the following “Taking a Vacation” stages:

  • Plan Vacation
  • Prepare for Vacation
  • Enjoy Vacation
  • Return from Vacation
  • Vacation Afterglow

Then for each of these stages, we brainstormed to identify the following:

  • What are the key decisions or tasks that comprise each “Taking a Vacation” stage?
  • What are the metrics against which we will measure progress and success of the “Taking a Vacation” decisions or tasks?
  • What are the recommendations (predictions) that we would like to provide in support of the “Taking a Vacation” decisions or tasks?
  • What data might you need to make those “Taking a Vacation” recommendations (predictions)?
  • Finally, what are the “Taking a Vacation” customer experiences or entities through which the recommendations and associated learnings will be rendered to the Vacationer?

See Figure 5 and Appendix B for more details on the “Taking A Vacation” Customer Journey Digital Transformation.

Figure 5: Digital Transformation Worksheet: Taking a Vacation (2 of 3)

Customer Journey Digital Transformation Summary

We gave our University of San Francisco (USF) students an assignment to help the university embrace digital transformation. By digitally transforming its operational and educational models, the university will be better prepared to serve the holistic lifetime educational needs of its customers (students).

In a future blog, I will share the results of that exercise. Needless to say, the students blew me away with their creativity.


Appendix A: Customer Journey Digital Transformation Worksheet Template


Appendix B: [Taking a Vacation] Journey Digital Transformation Example (Page 1 of 3)


Appendix B: [Taking a Vacation] Journey Digital Transformation Example (Page 2 of 3)


Appendix B: [Taking a Vacation] Journey Digital Transformation Example (Page 3 of 3)


[1] “What is Digital Transformation?”

[2] “IDEO Design Kit Methods”

A Digital Transformation Lesson: Open Source Business Models (March 29, 2018)

The year was 1994 and I had the fortunate opportunity to stumble upon a company – Cygnus Support – that was “selling free software.” I remember telling my mom that I was Vice President of Sales & Marketing of a company that was selling free software. After a very long pause, she replied, “Is your resume up to date?”

Cygnus Support sold support contracts and custom consulting projects for GNU development tools (gcc, g++, gdb) to companies looking to accelerate their time-to-market in the embedded systems market. Our value proposition was very clear and compelling for embedded product customers. We could fix compiler bugs in days, not months, which not only accelerated time-to-market, but also reduced the size of their embedded code by avoiding costly workarounds.

At the time, Cygnus Support executed a new, rarely-seen business model. They leveraged the open source concept to put development tools into the hands of software developers that fast-tracked time-to-value and de-risked product development efforts.  Eventually, Red Hat bought Cygnus Support and validated the open source business model (see Figure 1).

Figure 1: Red Hat Stock Price Performance

Fast forward to today where open source projects are the norm. Hadoop notably launched the open source Big Data market, laying the foundation for open source projects in IOT (Liota, Kafka, Nautilus, EdgeX Foundry), and machine learning and deep learning (TensorFlow, Apache Spark ML, Caffe, Torch).

We only need to observe their rise in popularity to understand that more and more organizations seek an open source model to steer their overall business plans. However, identifying the best open source strategy remains unclear. To understand the best course of action, let’s go to our old friend, the 18th century economist Adam Smith, for some guidance.

Understanding Adam Smith and Sources of Value Creation

Adam Smith, in his seminal book “The Wealth of Nations,” described value creation in two ways:

  • “Value in Exchange” – defined as the value of an asset based upon how much you can get paid for the asset. Our common accounting practice is based upon the “value in exchange” concept (think about how the price you paid for an asset determines its depreciation schedule).
  • “Value in Use” – defined as the value of an asset based upon how much value you can generate from the use of that asset. This is where economics, the branch of knowledge concerned with the production, consumption, and transfer of wealth, enters the discussion.

Some open source companies, like Red Hat (Linux) and Hortonworks (Hadoop), have adopted the “value in exchange” business model, with the goal of selling support contracts and custom development services. That was the business model that we pioneered at Cygnus Support.

Other open source companies, like Google (TensorFlow) and Facebook (Torch), have embraced open source projects with the goal of leveraging the open source community to improve their respective platforms upon which they create value.

The key point is this: Google and Facebook don’t sell these open source products, but instead use the open source community to expand the capabilities of the platforms upon which they create new sources of value. This “value in use” strategy exploits the dynamic that a community can create a better product faster than the creator (Google, Facebook) could on its own.

So, from a business model perspective, it’s a value in exchange (Red Hat and Hortonworks) versus value in use (Google and Facebook) decision. It’s a product (to be sold) versus tool (to be used) decision. It’s a business model decision.

Open Source Business Model Lessons from Google

Google’s TensorFlow business model is based upon getting more developers to embrace and expand the capabilities of TensorFlow faster than Google could do on its own. The article “Reasons Why Google’s Latest AI-TensorFlow is Open Sourced” highlights Google’s open source strategy:

“In order to expedite the evolution of its ML and move towards a robust AI, TensorFlow needs to be exposed to new data sets, some of which might be proprietary data of the company/user that decides to use TensorFlow for applications. Google hopes that once TensorFlow is deployed across applications by different users, these users can then contribute to the original source code of TensorFlow with their upgraded code, as mandated under the Apache APA license. This would aid the company to roll out a [more] comprehensive AI engine in the future.”

Business Models and Value Chain Analysis

Let’s call upon another old-school friend of the Big Data MBA community – Michael Porter – to understand how his Value Chain Analysis technique can help guide our business model and digital transformation discussion.

In my original Strata presentation back in 2012, I shared with the audience how they could use Michael Porter’s classic (Old School) Value Chain Analysis technique to identify where and how to apply big data analytics. The goal was to deliver material financial, operational, and competitive benefits to the organization (see Figure 2).

Figure 2: Michael Porter Value Chain Analysis

You can find the original blog “Big Data MBA: Course 101A – Unit III” here (there were two other MBA techniques that I covered that day and you can find links to those materials in the first paragraph of the blog).

Let’s say that you are in the retail business and looking to leverage big data analytics to “optimize in-store merchandising effectiveness.” Let’s use the Value Chain technique to understand where and how to apply big data analytics (a small code sketch of the first item follows the list).

  • Inbound Logistics: Use real-time Point of Sales (POS) analytics to predict (score) out-of-stock situations and prescribe corrective actions to suppliers and distributors as to what products, in what quantities, to deliver to what stores at what times.
  • Operations: Use real-time POS and RFID data to: predict merchandise demand; forecast slow product sales; prescribe sales and promotional actions to mitigate the impact of slow and non-movers; and optimize in-store / on-site inventory.
  • Outbound Logistics: Integrate and analyze social media with real-time in-store mobile app data and external event data (e.g., a large area event, unplanned construction work on a major travel artery) to identify and quantify merchandising and store traffic trends and model event-driven logistics impacts. Respond by prescribing in-store merchandising actions that apply those insights to stock and inventory levels for in-flight campaigns.
  • Sales / Marketing: Use conversion attribution analysis across search, display, mobile, and social media to quantify the online variables that are driving merchandising performance in order to optimize ad placement, keyword bids, and messaging in-real time.
  • Service: Combine social media and POS data with your customer loyalty data to create more-frequent, higher-fidelity customer scores for retention, fraud, up-sell/cross-sell, and net promoter scores that guide customer loyalty programs and promotions.
  • Infrastructure: Deploy predictive, real-time merchandising dashboards that predict in-store and department merchandising problems and prescribe corrective actions.
  • Human Resources: Combine social media data with data from local job sites and competitors’ hiring pages to predict at-risk employees and prescribe retention actions.
  • Technology: Use in-memory analytics to predict merchandising performance problems and prescribe corrective actions via an actionable mobile dashboard app.
  • Procurement: Combine merchandising images and in-store video surveillance data with POS data to quantify in-store partner promotional program effectiveness and negotiate better terms and conditions with key suppliers.
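As a concrete illustration of the Inbound Logistics item above, here is a minimal, hedged sketch of scoring out-of-stock risk from point-of-sale features. The feature names, coefficients, and data are hypothetical placeholders, not a reference implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical POS features per store/SKU: units sold today, days of
# inventory on hand, and supplier lead time in days.
X = rng.uniform([0, 0, 1], [500, 30, 14], size=(2000, 3))

# Synthetic labels: stock-outs are likelier with high sales velocity,
# low inventory, and long lead times.
risk = 0.01 * X[:, 0] - 0.2 * X[:, 1] + 0.3 * X[:, 2]
y = (risk + rng.normal(0, 1, 2000) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

store_sku = [[420, 3, 10]]  # fast seller, thin inventory, slow supplier
print(f"Out-of-stock risk score: {model.predict_proba(store_sku)[0, 1]:.2f}")
```

In production, a score like this would be recomputed from streaming POS data and used to prescribe replenishment actions to suppliers and distributors.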

Michael Porter’s Value Chain Analysis provides a framework against which we can make decisions about where and how we can apply our digital assets to derive and drive new sources of value creation.

Digital Transformation is About Business Model Transformations

Digital transformation is not about application development or cloud or even big data analytics. Digital transformation is about leveraging the organization’s newly minted digital assets to derive and drive new sources of value creation.

Digital Transformation is about the integration of digital assets (data, analytics) and capabilities (AppDev) into an organization’s processes, products, and assets to create new sources of value creation – to improve operational efficiency, enhance customer value, manage risk, and uncover new monetization opportunities (see Figure 3).

Digital Transformation starts with an understanding of the organization’s value creation process, for if you are not delivering new sources of organizational and customer value, why bother?

Note: I will have a paper forthcoming that details some recent work at the University of San Francisco and the National University of Ireland Galway on the Digital Transformation process. Think Value Chain Analysis meets Economics meets Design Thinking to help guide an organization’s digital transformation process.

Is 2018 the Tipping Point in Digital Transformation? (March 28, 2018)

“Survival, in the cool economics of biology, means simply the persistence of one’s own genes in the generations to follow.” – Lewis Thomas

 

A recent article in The Economist titled “The Year of the Incumbent” postulates that 2018 is the year that the incumbents “get back into the game” by stealing the momentum from technology startups to reclaim their spots atop the market valuation charts.

Tech firms have captured 42% of the rise in the value of America’s stock market since 2014 as investors forecast they will win an ever-bigger share of corporate profits.

As technology, and its role in our economy, has expanded, incumbent firms have struggled to embrace new innovations out of fear of upsetting profit streams, cannibalizing existing business lines, and disrupting established management structures. The McKinsey article “Why Digital Strategies Fail” explains why incumbents fail when embracing digital transformation:

Most digital strategies don’t reflect how digital is changing economic fundamentals, industry dynamics, or what it means to compete.

Understanding the Economics of Digital Transformation

Most digital transformations fail because organizations misunderstand the economics of a digital business. The incumbents, despite long histories of revenue growth and profitability, have an economic blind spot when it comes to digital transformation. Let’s drill into the finer economic details that incumbents need to contemplate as they embark on their digital transformations.

  • Digital is destroying economic rent. One of the first concepts we learned in microeconomics was economic rent—profit earned in excess of a company’s cost of capital. Digital is destroying this equation by creating more value for customers than for firms. For example, digital competitors with niche products and agile delivery offerings are forcing organizations to unbundle profitable product and service offerings. This results in more freedom of choice for customers to buy only what they need (and not being forced to buy what they don’t need). This is shifting the profit pools and decision making away from the firms and towards the customers. Digital also renders physical distribution intermediaries obsolete. Consider: how healthy is your nearest big-box store? With digital distribution providing limitless choice and price transparency (thanks, Google Search), digital offerings can be reproduced freely, instantly, and perfectly, shifting value to hyper-scale players while driving marginal costs – and product margins – towards zero.
  • Digital is driving winner-takes-all economics. Just as sobering as the shift of profit pools to customers is the rising economic value of scale and network effects, which increasingly dominate markets. Profits are no longer distributed across a large number of participants. Think about how Amazon’s market capitalization towers above that of other retailers, or how the iPhone regularly captures over 90 percent of all smartphone industry profits. This means that a company whose strategic goal is to maintain share relative to peers could be doomed—unless the company is already the market leader and prepared to pay the price to remain the market leader.
  • Metcalfe’s Law [1], while not exactly an economic theory, enables first-movers to build “economic moats” around their business models via an exponentially growing web of interconnected users and businesses. But first movers don’t always win; just ask AOL and MySpace. Consequently, first movers also must be aware of the economic impacts on their business models or a fast second mover will disrupt their business models by disintermediating their customer base (see the blog “The New Normal: Big Data Business Model Disintermediation and Disruption” for more details on business model disruption and customer disintermediation).
  • Consumer surplus is the difference between the total amount that consumers are willing and able to pay for a good or service (indicated by the demand curve), and the total amount that they actually pay (i.e. the market price). Digital economics is moving the demand curve towards perfect price elasticity, where the consumer surplus approaches zero because the price that people pay more closely matches what they are willing to pay (see Figure 1).

Figure 1: Consumer Surplus and Price Elasticity
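To make the consumer surplus arithmetic concrete, here is a minimal sketch with a hypothetical linear demand curve. As pricing approaches “perfect” price discrimination (every buyer charged exactly their willingness to pay), the surplus shrinks toward zero:

```python
import numpy as np

# Hypothetical linear demand: willingness to pay (WTP) falls from $100 to $0
# across 1,000 consumers.
wtp = np.linspace(100, 0, 1000)
market_price = 40.0

buyers = wtp[wtp >= market_price]  # only those willing to pay at least $40 buy
print(f"Consumer surplus at one market price: ${np.sum(buyers - market_price):,.0f}")

# Perfect price discrimination: each buyer pays exactly their WTP,
# so the surplus collapses to zero.
print(f"Consumer surplus with perfect pricing: ${np.sum(buyers - buyers):,.0f}")
```

The first figure is the area between the demand curve and the market price shown in Figure 1; digital pricing engines squeeze exactly that area.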

Being “Amazoned”

A new, terrifying phrase has entered the lexicon of business jargon: being “Amazoned.” What does it mean? Ever since Amazon acquired Whole Foods, companies across a wide swath of industries have received a wake-up call. Traditional brick and mortar industries, like grocery stores, are susceptible to purchase by digital giants.

Let’s say that you are in the retail industry and are looking to digitally transform your business model. The scenario outlined in Figure 2 provides an example of what one might expect (pulled from my blog “The 4 Laws of Digital Transformation”):

Figure 2: 4 Laws of Digital Transformation

The scenario in Figure 2 isn’t just about optimizing the ordering process. It requires the complete re-wiring of the organization’s business model and value creation process: from demand planning, to procurement, to quality control, to logistics, to inventory management, to distribution, to marketing, to store operations.

And the entire value creation and capture process starts with understanding, optimizing, and simplifying the customer experience to create a more compelling, differentiated experience that builds loyalty and, ultimately, advocacy.

Summary: Digital Transformation and Data Monetization

Organizations are going to have a hard time ignoring the “siren song” of digital transformation. Economic pressures, market demands, and customer expectations are forcing businesses, particularly legacy corporations that have yet to see the breaking digital wave, to embrace digital transformation. However, market pressures alone shouldn’t spur action. There are plenty of positives resulting from a transformed business, namely new data monetization opportunities, new customer acquisition methods, and new sources of customer, product, and operational market value.

Right now, we are helping clients understand the role and impact of data monetization in their digital transformation strategies, which includes a “Data Monetization Workbook” to guide their understanding of how to create new sources of customer, product and operational value.

Figure 3: Data Monetization Workbook

Watch this space for more details as we test, learn and refine the workbook.

[1] Metcalfe’s law states that the value of a telecommunications network is proportional to the square of the number of connected users of the system (n²).

 

Data Analytics and Human Heuristics: How to Avoid Making Poor Decisions (March 19, 2018)

The “hot hand,” a metaphor applied frequently to the game of basketball, is the idea that a basketball shooter, after making several consecutive shots, will experience a higher than normal success rate on his or her ensuing shots. I discussed the “hot hand” concept, and its flaw, at a TDWI (The Data Warehouse Institute) conference many years ago.
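The flaw in the “hot hand” is easy to demonstrate with a quick simulation; a minimal sketch assuming a shooter with a constant 50% make probability:

```python
import random

random.seed(42)
shots = [random.random() < 0.5 for _ in range(1_000_000)]  # independent 50% shooter

# Make rate immediately following three consecutive makes.
after_streak = [shots[i] for i in range(3, len(shots))
                if shots[i - 3] and shots[i - 2] and shots[i - 1]]
print(f"Make rate after 3 straight makes: {sum(after_streak) / len(after_streak):.3f}")
# Prints ~0.500: "hot" streaks occur by pure chance, yet carry no
# predictive signal about the next shot.
```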

Figure 1: “The Hot Hand in Basketball” by Thomas Gilovich, Robert Vallone and Amos Tversky

Much to my surprise, I noticed that Amos Tversky was the source of that original analysis. This is the same Amos Tversky who, along with Daniel Kahneman, was featured in Michael Lewis’ most recent book “The Undoing Project: A Friendship That Changed Our Minds,” an account of Tversky and Kahneman’s extraordinary bond that incited a revolution in Big Data studies, among other things, and made much of Lewis’ own work possible.

Given how much I love Michael Lewis’ work (his book “Moneyball” is still my favorite data science introduction book), I couldn’t wait to dive into this one!

Lewis is a marvelous writer with an uncanny ability to explain complex subjects in very understandable terms. I encourage you to buy and read the book, and appreciate Lewis’ uncovering of Tversky and Kahneman’s considerable work in behavioral economics. (By the way, Kahneman recently published his critically acclaimed book, “Thinking, Fast and Slow.”)

Behavioral Economics: The Violent Collision of Economics and Psychology

Tversky and Kahneman were the original “Economic Psychologists.” They researched, identified, and validated heuristics in human judgment and decision-making that led to common and predictable errors in the human psyche.

Their landmark research paper, published in 1979, was titled “Prospect Theory: An Analysis of Decision under Risk.” Prospect Theory explores behavioral economic theory and analyzes the way people choose between probabilistic alternatives that involve risk. Prospect Theory investigates how people make decisions based on the potential value of losses and gains using certain heuristics, rather than the final outcome. The model is descriptive. It tries to model real-life choices, rather than optimal decisions, as normative models do.
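A compact way to see what “the potential value of losses and gains” means is Prospect Theory’s value function. A minimal sketch using the parameter estimates from Tversky and Kahneman’s later (1992) cumulative prospect theory work (α ≈ 0.88, λ ≈ 2.25):

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    # Diminishing sensitivity to gains; losses loom about 2.25x larger.
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

print(prospect_value(100))   # ~57.5: the felt value of a $100 gain
print(prospect_value(-100))  # ~-129.5: a $100 loss hurts more than twice as much
```

The asymmetry between those two numbers is loss aversion, the engine behind many of the biases listed below.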

Figure 2: The Formidable “Trio”

Throughout their lifetime collaboration, the duo discovered, tested, validated, and published several types of Human Heuristics that lead human decision-making astray, including:

  • Availability is a judgmental heuristic in which a person evaluates the frequency of classes or the probability of events by availability – that is, by the ease with which relevant instances come to mind. In general, availability is correlated with ecological frequency, but it is also affected by other factors. Consequently, reliance on the availability heuristic leads to systematic biases. Such biases are demonstrated in the judged frequency of classes of words, of combinatorial outcomes, and of repeated events. The phenomenon of illusory correlation is explained as an availability bias.
  • Representation is “the degree to which [an event] (1) is similar in essential characteristics to its parent population, and (2) reflects the salient features of the process by which it is generated.” When people rely on representativeness to make judgments, they are likely to judge wrongly. The fact that something is more representative does not actually make it more likely. The representativeness heuristic is simply described as assessing similarity of objects, and organizing them around the category prototype (e.g., like goes with like, and causes and effects should resemble each other). This heuristic is used because it is an easy computation.
  • Anchoring is a heuristic used in many situations where people estimate a number. According to Tversky and Kahneman’s original description, it involves starting from a readily available number—the “anchor”—and shifting either up or down to reach an answer that seems plausible. In Tversky and Kahneman’s experiments, people did not shift far enough away from the anchor. Hence the anchor contaminates the estimate, even if it is clearly irrelevant.
  • Simulation is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as “near misses.”
  • Framing is an example of cognitive bias, in which people react to a particular choice in different ways depending on how it is presented; e.g. as a loss or as a gain. People tend to avoid risk when a positive frame is presented, but seek risks when a negative frame is presented. Gain and loss are defined in the scenario as descriptions of outcomes (e.g., lives lost or saved, disease patients treated and not treated, lives saved and lost during accidents, etc.).
  • Confirmation Bias is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. To quote The Undoing Project, “The human mind was just bad at seeing things it did not expect to see, and a bit too eager to see what it expected to see.”

There are more as well, but I think these form the foundation for a potential degree in behavioral economics.

Framing Example: 401(k) Auto-enrollment

One of the best examples of exploiting these human biases to one’s advantage is the 401(k) auto-enrollment movement as detailed in the article “These Simple Moves by Your Employer Can Dramatically Improve Your Retirement.” Features like automatic enrollment and automatic escalation of contributions, with an opt-out provision, turn inertia into an asset. These features are now broadly employed and have greatly boosted both participation and deferral rates. Among companies with a 401(k) plan, 70% have some kind of auto feature, according to benefits consultant Aon Hewitt. Merrill found that plans with auto enrollment had 32% more participants, and those with an auto escalation feature had 46% more participants increasing their contributions.

Applying “The Undoing Project” Learnings

Having a solid understanding of human decision-making biases can help ensure that your analytics insights are delivered in the most effective and objective ways. While truly understanding human decision-making biases requires a heavy dose of training, analytics presented correctly and thoroughly can raise awareness of biases to which your stakeholders can unwittingly fall prey.

Sources:

Figure 1: “The Hot Hand in Basketball” by Thomas Gilovich, Robert Vallone, Amos Tversky

Figure 2: Photo Courtesy of Gulf Times, “Minding the Mindset” by Nick Romeo

 

Artificial Intelligence: 6 Step Solution Decomposition Process (March 5, 2018)

It’s simple. The conversation is simple because the objective is simple:

How do I become more effective at leveraging (big) data and analytics (artificial intelligence) to power my business?

Success with artificial intelligence doesn’t begin with technology, but rather the business, and more specifically the people and processes running the business. Before deploying technology, leaders should seek to understand (envision) how artificial intelligence could power a profitable business, and drive compelling customer and operational outcomes.

Collaboration with stakeholders and key constituents is critical to understanding the decisions and needs of the business. While every organization’s needs vary, there exists a consistent, transparent process that can drive a more stable and widespread adoption of artificial intelligence.

Note: throughout this blog, when I use the term “artificial intelligence,” I mean that to include other advanced analytics such as deep learning, machine learning (supervised, unsupervised, reinforcement), data mining, predictive analytics, and statistics (see Figure 1).

Figure 1: The Evolution of AI, ML and DL (Source: Nvidia)

Artificial Intelligence Solution Decomposition Process

I teach a “Solution Decomposition Process” course at the University of San Francisco as well as at other universities whenever I guest lecture (like I will be doing at NUI Galway, Ireland on March 16 from 6:00pm to 8:00pm). I also spend considerable time teaching the “Solution Decomposition Process” to executive teams to help them successfully adopt artificial intelligence into their business. Whether lecturing or meeting with a small group of executives, I always begin by first addressing a basic question:

What do I mean by success?

If success is simply adopting and deploying advanced analytics technologies, then you don’t need strategic guidance to chart that journey.

However if success means deriving and delivering business value, and becoming more effective at leveraging big data and advanced analytics to power your business – then I have the process for you (see Figure 2)!

Figure 2: Solution Decomposition Process

 

Figure 2 outlines the “Solution Decomposition Process” that is designed to ensure that artificial intelligence is deriving and driving new sources of business value. The power of this process is its simplicity. By staying focused on the business or operational objectives and tasks, businesses can successfully transform how they use data and analytics to produce optimal outcomes.

There are six key steps to the Solution Decomposition Process to undertake before deploying AI solutions to derive and drive business value. Let’s take a quick look at each.

Step 1: Identify and Understand Your Targeted Business Initiative

Modernizing your data center is not a business initiative. Moving to the cloud is not a business initiative. Installing an “Analytics as a Service” platform is not a business initiative.

So what is a business initiative?

A business initiative is a senior executive mandate that seeks measurable and material financial impact on the value of the business.

Here is a checklist of the key characteristics of a business initiative:

  • Sense of urgency mandating results be delivered in 12-18 months
  • Important to success and survival of the business
  • Compelling and material financial impact (ROI)
  • Clear business executive ownership – someone on the executive team is not sleeping at night due to their concerns on this initiative
  • Analytically friendly in that customer, product, and operational insights have material impact on initiative success
  • Bounty of potential data sources to be mined for actionable insights in support of the business initiative
  • Strong CIO leadership and IT business collaboration

A thorough understanding of the business initiative is key before starting the journey, including:

  • What are the targeted financial outcomes or returns from the business initiative?
  • What are the metrics that will measure success?
  • How will this initiative impact the customer for better or worse?
  • What are the potential (and likely) impediments to success?

For example, PNC Financial Services Group’s annual report mentions the business initiative to “grow profitability through the acquisition and retention of customers and deepening relationships.” We will use this “increase customer retention/reduce customer attrition” business initiative for the rest of this exercise.

Figure 3: PNC Financial Services Group 2015 Annual Report

Step 2: Identify Your Stakeholders and Constituents

Next, identify the business stakeholders and constituents who either impact or are impacted by the targeted business initiative. This includes internal stakeholders (e.g., sales, finance, marketing, logistics, manufacturing) as well as external constituents like partners, suppliers, and don’t forget, the customers!

Start embracing some simple Design Thinking techniques. Create a single-slide persona for each stakeholder and key constituent as a way to make the key stakeholders “come to life” (see Figure 4).

Figure 4: Create Persona for Key Stakeholders

 

See the blog, “Design Thinking: Future-proof Yourself from AI,” for more insights about the role of design thinking, artificial intelligence and machine learning.

Step 3: Identify Key Decisions

Next, identify the decisions that the stakeholders and constituents need to make to support the targeted business initiative. Be sure to invest the time upfront to identify, validate, vet, and prioritize the decisions because: 1) not all decisions are of equal value and 2) there may be some decisions that need to be made prior to other decisions (see Figure 5).

Figure 5: Identify, Prioritize and Create Decisions Roadmap

 

See the blog “The #1 IOT Challenge: Use Case Identification, Validation and Prioritization” for more details on how to identify, validate, and prioritize the organization’s key decisions.

Step 4: Identify Predictive Analytics

The next step is challenging because it requires organizations to change their mindsets with respect to how they currently leverage data and analytics. Organizations need to guide their stakeholders through a process of identifying the most important predictions that will support the targeted initiatives. This process starts by identifying the most important questions that the stakeholders are asking today in support of their key decisions.

Questions can then be converted into predictive analytics. For example, instead of asking: “What was customer attrition last month?” we want to predict: “What will customer attrition likely be next month?” See Figure 6.

Figure 6: Creating Predictive Analytics

 

See the blog “Business Analytics: Moving From Descriptive To Predictive Analytics” for more details on the differences between descriptive, predictive, and prescriptive analytics.

Step 5: Brainstorm Data That Might Be Better Predictors of Performance

Next, we want to collaborate with the business stakeholders and constituents to brainstorm what data they might need to make those predictions. Continuing with our “increase customer retention / reduce customer attrition” business initiative, we highlight one of the top predictions:

“What will customer attrition likely be next month?”

To support the data brainstorming exercise, we would simply add the phrase “and what data might I need to make that prediction?” to the desired prediction. The results of this exercise might look like Figure 7.

Figure 7: Brainstorming Data that Might be Better Predictors of Performance

 

See the blog “Data Science: Identifying Variables That Might Be Better Predictors” for more details on how to brainstorm data sources that might be better predictors of performance.
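Pulling Steps 4 and 5 together, here is a minimal sketch of how a brainstormed data set might feed an attrition prediction. Every feature below is a hypothetical example of the kind of variable a brainstorm might surface, and the data is synthetic; this is not any bank’s actual model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 5000

# Hypothetical brainstormed predictors: branch visits last quarter,
# mobile logins last month, support complaints, and tenure in years.
X = np.column_stack([
    rng.poisson(4, n),       # branch_visits
    rng.poisson(12, n),      # mobile_logins
    rng.poisson(0.5, n),     # complaints
    rng.uniform(0, 20, n),   # tenure_years
])

# Synthetic churn labels: complaints raise risk; engagement and tenure lower it.
logit = 1.2 * X[:, 2] - 0.1 * X[:, 1] - 0.05 * X[:, 3] + rng.normal(0, 1, n)
y = (logit > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

at_risk_customer = [[1, 2, 3, 0.5]]  # disengaged, complaining, new customer
print("Predicted attrition probability:",
      model.predict_proba(at_risk_customer)[0, 1])
```

The real value of Step 5 is the feature list itself: each candidate variable is a testable hypothesis about what might be a better predictor of attrition.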

Step 6: Implement Technology

The final step – not the first step – is now to identify the architecture, systems, and technology necessary to support the business initiative. Understanding in detail the business, data, and analytic requirements helps determine what technologies are needed – and what technologies are not yet needed – as IT builds out their big data architecture and infrastructure (see Figure 8).

Figure 8: Data Lake Components

 

The availability of scale out architectures and cloud environments ensures storage and compute can be expanded as needed, reducing the need to overspend on technology while achieving a compelling return on investment.

Summary

There is a logical workflow in successfully adopting and tapping into the potential of artificial intelligence. Decision makers all too often put their immediate attention to the technology, but there is a host of activities to complete beforehand. If you are serious about monetizing data to drive business value, it is imperative to begin with the business – the people, customers, and stakeholders – that have different roles and responsibilities in determining the success of a business initiative. Start small, work your way outward to identify the supporting decisions and financial value – then get to the technology.

Great Data Scientists Don’t Just Think Outside the Box, They Redefine the Box (February 28, 2018)

Special thanks to Michael Shepherd, AI Research Strategist, Dell EMC Services, for his co-authorship. Learn more about Michael at the bottom of this post.

Imagine you wanted to determine how much solar energy could be generated from adding solar cells to a particular house. This is what Google’s Project Sunroof does with Deep Learning. Enter an address and Google uses a Deep Learning framework to estimate how much money you could save in energy costs with solar cells over 20 years (see Figure 1).

Figure 1: Google Project Sunroof Project

It’s a very cool application of Deep Learning. But let’s assume there “might” be an even better way to estimate solar energy savings. For example, you want to use Deep Learning to estimate how much solar energy we could generate with solar panels on the Golden Gate Bridge (that probably wouldn’t be a very popular decision in San Francisco). The obvious application would be to analyze several photos of the Golden Gate Bridge and estimate clear skies based upon cloud coverage.

However, instead of estimating the potential solar energy generation based upon “cloud coverage,” what if we wanted to use “sunlight reflection” to generate the solar energy estimate (see Figure 2)?

Figure 2: Determining Best Predictive Variables for the Golden Gate Bridge

Or maybe you want to test another metric based upon the “sharpness of the shadows” generated by the bridge? Or another metric based upon how many people in the photo are wearing sunglasses? Or yet another metric based upon…

How do you know which of these variables – clouds or reflection or shadows or sunglasses or anything else – is the better predictor of solar energy generation? You try them all!

This thought process highlights an important behavioral trait of the best data scientists: they have strong imaginative skills for not just “thinking outside the box” – but actually redefining the box – in trying to find variables and metrics that might be better predictors of performance.

The word “might” is a powerful enabler. “Might” is used to say or indicate that something is possible. It’s a data scientist’s most important concept, because “might” gives the data scientist the license to explore, be wrong, learn and try again.

“It Can’t Be Done” Is Not a Data Scientist Term

Andrew Ng, artificial intelligence visionary and fearless leader for many of us, wrote a recent article titled, “What Artificial Intelligence Can and Can’t Do Right Now.” In the article, Andrew states the following:

“Surprisingly, despite AI’s breadth of impact, the types of it being deployed are still extremely limited. Almost all of AI’s recent progress is through one type, in which some input data (A) is used to quickly generate some simple response (B). For example:”

Figure 3: What Machine Learning Can Do

While the use cases are limited today, the creativity with which data scientists are leveraging Big Data and existing Machine Learning and Deep Learning technologies is staggering. Let me give you one example of how data scientists from one of our Services teams at Dell EMC are thinking outside the box to uncover new ways to help our customers avoid issues in their IT environment and create a more effortless support experience.

Predicting Hard Drive Failures

Let’s say that you are capturing over 260+ different pieces of telemetry data several times a minute for the life of a device. Most of these 260+ variables have incomplete or sparse data, the collection timing doesn’t always line up nice and neat, and getting time continuity across the devices is a major challenge.

If you were using a traditional Machine Learning algorithm, the data science team would have to spend an overwhelming amount of time 1) feature engineering new variables based on domain knowledge, and 2) using trial-and-error to determine which combinations of variables should even be included in the Machine Learning model.

Instead, our Dell EMC Services data scientists used a Patent Pending approach to Deep Learning to “pixelate” the data. They turned the 260+ variables into device performance “images.” Once they created these “images,” the team leveraged a recurrent neural network to find “shapes” and repeatable patterns in the seemingly random pixels (see Figure 4).

Figure 4: Pixelating Telemetry Data

A recurrent neural network (RNN) is a class of artificial neural network where connections between units form a directed cycle. RNNs can use their internal memory to process arbitrary sequences of inputs, which typically makes RNNs ideal for handwriting or speech recognition. Except in this case, instead of trying to decipher handwriting into words, the data science team used the RNN to decipher the seemingly random pixels into a prediction on the state of the device (see Figure 5).

Figure 5: Using RNN’s to Identify Shapes and Patterns Buried in the Telemetry Data
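The team’s pixelation approach is patent pending and its details aren’t public, so the following is only a generic sketch of the pattern this section describes: feeding multivariate telemetry sequences into a recurrent network that outputs a device-health prediction. All shapes and data are hypothetical:

```python
import numpy as np
import tensorflow as tf

# Hypothetical telemetry: 128 devices, 60 time steps, 260 variables per step.
X = np.random.rand(128, 60, 260).astype("float32")
y = np.random.randint(0, 2, size=(128, 1))  # 1 = device later failed

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(60, 260)),
    tf.keras.layers.LSTM(64),                        # learns temporal patterns
    tf.keras.layers.Dense(1, activation="sigmoid"),  # failure probability
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
model.fit(X, y, epochs=3, batch_size=32)
```

With real telemetry, the hard work sits upstream of this model: aligning sparse, irregular collection timestamps and engineering the sequence (or image) representation the network consumes.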

I love this example because the team didn’t feel constrained to try to fit the square peg into the round “Machine Learning” hole. Instead, they used Deep Learning in a different context to decipher seemingly random pixels into a prediction of the health of a device. The data scientists didn’t wait until someone developed a better Machine Learning algorithm. Instead, they looked at the wide variety of Machine Learning and Deep Learning tools and algorithms available to them, and applied them to a different, but related use case. If we can predict the health of a device and the potential problems that could occur with that device, then we can also help customers prevent those problems, significantly enhancing their support experience and positively impacting their environment.

Summary

One of a data scientist’s most important characteristics is that they refuse to take “it can’t be done” as an answer. They are willing to try different variables and metrics, and different type of advanced analytic algorithms, to see if there is another way to predict performance.

By the way, I included this image just because I thought it was cool. This graphic measures the activity between different IT systems. Just like with data science, this image shows there’s no lack of variables to consider when building your Machine Learning and Deep Learning models!

Want more information on how Dell EMC Services uses data science?

Check out the “Decoding Customer DNA with Data Science” blog by Doug Schmitt, President, Dell EMC Global Services, and watch for the upcoming podcasts “A Conversation with Two Data Geeks” to hear directly from the data scientists behind our transformative technologies.


Meet Michael Shepherd in Vegas, April 30 – May 3!

Want to learn more about predictive customer intelligence? Come meet Dell EMC’s data geeks, Michael Shepherd and Dr. Rex Martin, in person at Dell Technologies World, and hear how they use data science and Dell EMC technologies to predict personalized outcomes and recommend preventative actions for our customers. Join their “Predictive customer intelligence: Using data science to transform the customer service experience” presentation.



I would like to thank my co-author Michael Shepherd, AI Research Strategist, Dell EMC Services. Michael holds U.S. patents in both hardware and software and is a Technical Evangelist who provides vision through transformational AI data science. With experience in supply chain, manufacturing and services, he enjoys demonstrating real scenarios with the SupportAssist Intelligence Engine showing how predictive and proactive AI platforms running at the “speed of thought” are feasible in every industry.

The 3% Edge: How Data Drives Success in Business and the Olympics
February 20, 2018 | https://infocus.dellemc.com/william_schmarzo/at-the-3-edge-how-data-drives-success-in-business-and-the-olympics/

A recent Bloomberg BusinessWeek article entitled “The Tech Guy Building Wearables for America’s Olympians” profiles Mounir Zok, the man in charge of the U.S. Olympic Committee’s technology and innovation. The article discusses how Mr. Zok is bringing a Silicon Valley data scientist mentality to help America’s Olympic athletes more effectively leverage data and analytics to win Olympic medals.

To quote the article:

Zok won’t say who his partners were in the development process or even which athletes are using the suits; any hints might tip off Olympic engineers in other countries, erasing the USOC’s advantage. “I call it the 1 percent question,” he says. “Olympic events typically come down to a 1 percent advantage. So what’s the one question that, if we can provide an answer, will give our athletes that 1 percent edge?”

Wait a second, what is this “1% edge,” and is that something that we can apply to the business world? I wanted to drill into this “1% edge” to not only verify the number, but to further understand how the “1% edge” might apply to organizations trying to effectively leverage data and analytics to power their businesses (see “Demystifying the Big Data Business Model Maturity Index”).

Verifying the 1% Edge

To start validating this 1% edge, I evaluated single athlete sports, where focusing on the singular performer is easier than a team sport. Here’s what I found.

Professional Golf. The top 5 worldwide professional golfers (as measured by strokes per round) are only 3 percent better than the golfers ranked #96 through #100. Even more amazing: while only 3 percent in stroke average separates the two groups, the golfers ranked #96 – #100 earned 89.5 percent less than the top 5 (see Figure 1)!

Figure 1: YTD Statistics, Farmers Insurance Open, January 28, 2018

The 3 percent edge is quite evident in golf. Three strokes per round can be the difference between victory and defeat, and between dramatically different earning potential.

2016 Olympics Men’s Track. Next I looked at the 2016 Olympics men’s track events: the 100 meter dash, the 400 meter dash and the marathon. The difference between the very best and those left dreaming of gold medals was again only a small percentage: fractions of seconds in the sprinting events.

Figure 2: 2016 Olympic Men’s 100 Meter Results


Figure 3: 2016 Olympic Men’s 400 Meter Results


Figure 4: 2016 Olympic Men’s Marathon Results

In summary:

  • The difference between a gold medal and no medal was between 1.22% and 2.28%
  • The difference between a gold medal and 8th place IN THE OLYMPICS was between 2.40% and 3.67%

Think about the years of hard work and commitment these world-class athletes put into preparing for these events, only to finish out of the medals by roughly 2%. So while the “1% edge” may not be entirely accurate, a 1% to 3% difference looks about right for athletes (and organizations) that want to be considered world class.
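For the curious, the medal-gap percentages are simple relative differences against the gold-medal mark. A quick sketch using approximate 2016 men’s 100 meter times (illustrative, not the official results):

```python
# Percent gap relative to the gold-medal time (approximate times, in seconds).
gold, fourth, eighth = 9.81, 9.94, 10.06

def gap(t: float) -> float:
    return (t - gold) / gold * 100

print(f"Gold vs. no medal (4th): {gap(fourth):.2f}%")  # roughly 1.3%
print(f"Gold vs. 8th place:      {gap(eighth):.2f}%")  # roughly 2.5%
```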

Applying the 3% Edge to Become World Class

What does a 3 percent edge mean to your business? What does it mean to be 3 percent better in retaining customers, or bringing new products to market, or reducing hospital readmissions, or preventing unplanned maintenance?

While I couldn’t find any readily available metrics about world class in these business areas, I came back to the seminal research from Frederick F. Reichheld and W. Earl Sasser, Jr. highlighted in the classic “Harvard Business Review” article “Zero Defections: Quality Comes to Services” written in 1990. The bottom line from their research: increasing customer retention rates by 5% increases profits by 25% to 95% (see Figure 5).

Figure 5: Profitability Impact from a 5% Increase in Customer Retention

When these research results were published in 1990, they startled so many marketing executives that it set off a rush to acquire Customer Relationship Management (CRM) applications like Siebel Systems.

The Power of Compounding 1% Improvements

One of the most powerful concepts behind “hitting lots of 3 percent singles versus a single 20 percent home run” is compounding. So what does 3 percent compounding actually look like? Let’s walk through a fraud example.

Let’s say you have a $1 million run-rate business with an annual 10 percent fraud rate. That results in $100K in annual fraud losses. What if, through the use of advanced analytics, you were able to reduce the fraud rate by 3 percent each year? What is the cumulative effect of a 3 percent annual improvement over five and 10 years?

Figure 6: 3% Compounded Impact on Fraud

While the results start off pretty small, it doesn’t take much time until the compounding and cumulative effects of a 3 percent improvement deliver a significant financial return. And though it may not make much sense to look beyond five years (due to customer turnover, technology changes, evolving competition and market shifts), even at the five-year mark the payoff is substantial.
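As a back-of-the-envelope check on the math behind Figure 6, here is a minimal sketch assuming the article’s numbers ($1M run rate, 10% starting fraud rate, 3% annual improvement):

```python
# Compound a 3% annual reduction in the fraud rate and tally the savings.
fraud_rate = 0.10
run_rate = 1_000_000
baseline_loss = run_rate * fraud_rate  # $100K per year before improvements

cumulative_savings = 0.0
for year in range(1, 11):
    fraud_rate *= 1 - 0.03                 # the 3% improvement compounds
    annual_loss = run_rate * fraud_rate
    cumulative_savings += baseline_loss - annual_loss
    if year in (5, 10):
        print(f"Year {year}: annual loss ${annual_loss:,.0f}, "
              f"cumulative savings ${cumulative_savings:,.0f}")
```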

Take it a step further and consider the impact when combining multiple use cases, such as:

  • Waste and spoilage reduction
  • Energy effectiveness
  • Preventative maintenance
  • Unplanned network or grid downtime
  • Hospital-acquired infections
  • Unplanned hospital readmissions
  • Power outages
  • Delayed deliveries

A business that achieves a 3 percent compounding improvement across numerous use cases begins to look like a business capable of sustained, compound growth.

Summary

I believe there is tremendous growth opportunity for organizations that have the data and analytical disciplines to drill into what a 3 percent improvement in performance might mean to the overall health of their business. Such analysis would not only highlight the power of even small improvements, but offer clarity into what parts of the business should be prioritized for further acceleration.

Sources:

Figure 1: YTD Statistics, Farmers Insurance Open, January 28, 2018

Don’t Follow the Money; Follow the Customer!
February 14, 2018 | https://infocus.dellemc.com/william_schmarzo/dont-follow-the-money-follow-the-customer/

“Mr. Schmarzo, we’ve noticed that your cholesterol count is at 210, so we have prescribed Fluvastatin and placed selected foods into your shopping basket to help you control your cholesterol. Complete the purchase by selecting ‘here’ and we’ll deliver the medication and groceries to your home today between 4:00 and 4:20pm.  If you complete the full Fluvastatin prescription, then we’ll reduce your monthly healthcare insurance payment by 5%.”

This scenario is surprisingly close to reality as mergers cause traditional healthcare industry borders (healthcare provider, healthcare payer, pharmacy) to crumble.  A recent BusinessWeek article “CVS Brings One-stop Shopping to Health Care” highlighted the potential benefits of a vertical consolidation of the healthcare ecosystem players:

The [CVS – Aetna] deal would create a behemoth that would try to shift some of Aetna customers’ care away from doctors and hospitals and into thousands of CVS stores. “Think of these stores as a hub of a new way of accessing health-care services across America,” says CVS Chief Executive Officer Larry Merlo. “We’re bringing health care to where people live and work.”

Healthcare value chain vertical integrations could provide substantial benefits to everyday consumers and patients alike, including:

  • An accelerated move to value-based care (focusing on preventive care) and away from the traditional “pay by the service” model (which rewards healthcare participants for more care)
  • A reduction in some of the dysfunctional incentives built into today’s healthcare value chain, such as pharmacy benefit managers (PBMs) profiting from back-end rebates and fees extracted from pharmacy companies

A superior understanding of customers’ behaviors, preferences and product usage patterns forms the basis for industry transformations.

New Normal: Business Model Disintermediation and Disruption

Industry after industry is under attack by upstart disruptors, and no industry is safe. The basis for their attack is exploiting and monetizing superior knowledge of customer product preferences and buying habits. The more these disruptors know about their customers – their preferences, behaviors, tendencies, inclinations, interests, passions, associations, affiliations – the better positioned they are to create new sources of value and revenue (see Figure 1).

Figure 1: Business Model Disruption and Customer Disintermediation

Established companies are being attacked by companies that are more effective at leveraging big data technologies, new sources of customer, product and operational data, and advanced analytics (machine learning, deep learning, and artificial intelligence) to:

  • Disrupt business models by applying customer, product, operational and market insights to optimize key business and operational processes. Additionally, data-driven insights uncover new sources of revenue such as new products, services, markets, audiences, channels, partners, etc.
  • Disintermediate customer relationships by exploiting detailed customer engagement behaviors and product usage tendencies to provide a more compelling and differentiated user experience.

Check out “The New Normal: Big Data Business Model Disintermediation and Disruption” for more details on business model disruption and customer disintermediation.

The following companies are challenging traditional industry business models with superior insight into customer preferences and buying habits:

  • Uber: The world’s largest taxi company owns 0 taxis
  • Airbnb: The largest accommodation provider does not own real estate
  • TripAdvisor: The world’s largest travel company owns 0 inventory
  • Skype, Whatsapp, WeChat: The largest phone companies do not own any telco infrastructure
  • SocietyOne: The fastest growing bank has no actual money
  • eBay: One of the world’s most valuable retailers has no inventory
  • Apple & Google: The largest software vendors write a minimal number of apps
  • Facebook: The most popular media owner does not create content
  • Netflix: The world’s largest movie house does not own any cinemas or create any content (until recently)

Industry transformations will only accelerate because leading companies realize that instead of “following the money,” they should “follow the customer.”

Follow the Customer

“Follow the money” is a catchphrase used to understand an organization’s flow of money and sources of value. Organizations use accounting, auditing, investigative, data and analytic skills to “follow the money” and determine their financial value.

However, this infatuation with following the money can actually lead organizations astray and make them vulnerable to disruption and disintermediation by more nimble, more customer-focused organizations. The vulnerable incumbents tend to operate in industries where:

  • The market is too fragmented for any one organization to provide a complete customer solution and experience.
  • Customer experiences are unsatisfactory.
  • Customer outcomes are questionable, or are downright wrong.
  • “Product Mentality” permeates the Senior Executive team

For example, Amazon is vertically integrating the grocery industry with their recent acquisition of Whole Foods. Where Amazon plans to take the grocery industry (as well as the entire retail industry) starts with their mission statement:

  • Traditional Grocer: “Our goal is to be the first choice for those customers who have the opportunity to shop locally”
  • Amazon: “To be Earth’s most customer-centric company, where customers can find and discover anything they might want to buy online, and get those items quickly, at the lowest possible prices”

Amazon enhances and simplifies the customer-centric experience with a host of simple, easily accessible user experience choices such as one-click buying, mobile ordering, free and same day delivery, and more.

Check out “What is Digital Transformation?” for examples of how Amazon is leveraging customer insights to vertically integrate the grocery industry.

Optimizing the Customer Experience

80% of customers want a personalized experience from their retailer. Customers don’t want to be treated as numbers on a fact sheet, and they appreciate it when organizations show a semblance of personalization toward them[1].

Providing a more holistic, more engaging customer experience starts with understanding each individual customer’s behaviors, tendencies, inclinations, biases, preferences, patterns, interests, passions, associations and affiliations. More than just capturing the customer’s purchase and social media data, leading customer-centric organizations uncover and codify 1) what products and services a customer tends to buy, and 2) what products and services customers like them buy.

Amazon is arguably the industry leader in providing a highly personalized customer experience that starts with their recommendation engine (see Figure 2).

Figure 2: Amazon Recommendation Engine

Amazon recently open-sourced DSSTNE (Deep Scalable Sparse Tensor Network Engine), the artificial intelligence framework that powers its recommendation engine. Amazon’s product catalog is huge, making its purchase-transaction datasets extremely sparse. This creates a significant challenge for traditional neural network frameworks, so Amazon created DSSTNE to generate the recommendations that power personalized experiences across the Amazon website and Amazon devices[2].
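To see why sparsity is the crux, consider this hedged back-of-the-envelope sketch (the catalog and purchase counts are invented, and SciPy stands in for whatever sparse-matrix tooling a team actually uses):

```python
# With a huge catalog, each customer touches a tiny fraction of items, so a
# dense customer x item matrix wastes memory; a sparse one stores only facts.
import numpy as np
from scipy.sparse import csr_matrix

n_customers, n_items = 100_000, 1_000_000
rng = np.random.default_rng(42)

# Assume a sample of 1,000 customers who each bought ~20 random items.
rows = np.repeat(np.arange(1_000), 20)
cols = rng.integers(0, n_items, size=rows.size)
purchases = csr_matrix((np.ones(rows.size), (rows, cols)),
                       shape=(n_customers, n_items))

density = purchases.nnz / (n_customers * n_items)
print(f"Matrix density: {density:.8%}")  # a vanishingly small fraction
```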

Dell EMC Consulting uses Analytic Profiles to capture and codify a customer’s behaviors, tendencies, inclinations, biases, preferences, patterns, interests, passions, associations and affiliations (see Figure 3).

Figure 3: Customer Analytic Profile

See “Analytic Profiles: Key to Data Monetization” for more details on Analytic Profiles.
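As a rough illustration of the idea (the field names below are hypothetical, not Dell EMC’s actual schema), an Analytic Profile can be pictured as a per-customer record of model-generated propensity scores and mined affinities:

```python
# A simplified, hypothetical shape for a customer Analytic Profile record.
from dataclasses import dataclass, field

@dataclass
class CustomerAnalyticProfile:
    customer_id: str
    churn_risk: float = 0.0            # model-scored propensity to churn
    fraud_risk: float = 0.0            # model-scored likelihood of fraud
    product_propensities: dict = field(default_factory=dict)
    interests: list = field(default_factory=list)  # mined affinities

profile = CustomerAnalyticProfile(
    "C-1042",
    churn_risk=0.18,
    product_propensities={"storage": 0.72, "backup": 0.41},
    interests=["AI", "hybrid cloud"],
)
print(profile)
```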

Organizations can leverage customer insights captured in the Analytic Profiles to optimize key business and operational processes, reduce security and compliance risks, uncover new revenue opportunities, and create a more compelling customer engagement lifecycle (see Figure 4).

Figure 4: Optimizing the Customer Lifecycle

See “Optimizing the Customer Lifecycle With Customer Insights” for more details on leveraging big data and data analytics to optimize your customer’s lifecycle.

Follow the Customer Summary

Leading organizations are realizing that instead of “following the money,” they should be “following their customers” and mining their customers’ buying habits, regardless of artificially defined industry boundaries (see Figure 5).

It is these customer insights that will transform the organization’s business models, disintermediate under-served customers, create new sources of revenue, and eventually transform the business into an intelligent enterprise.

Sources:

[1] Retail: How to Keep it Personal & Take Care of Privacy

[2] Generating Recommendations at Amazon Scale with Apache Spark and Amazon DSSTNE


How State Governments Can Protect and Win with Big Data, AI and Privacy
February 6, 2018 | https://infocus.dellemc.com/william_schmarzo/how-state-governments-can-protect-and-win-with-big-data-ai-and-privacy/

I was recently asked to conduct a 2-hour workshop for the State of California Senior Legislators on the topic of “Big Data, Artificial Intelligence and Privacy.” Honored by the privilege of offering my perspective on these critical topics, I reviewed the once-in-a-generation opportunities awaiting the great State of California (“the State”), where decision makers could vastly improve their constituents’ quality of life while creating new sources of value and economic growth.

Industrial Revolution Learnings

We have historical experiences and references to revisit in discerning what government can do to nurture our “Analytics Revolution.” Notably, the Industrial Revolution holds many lessons regarding the consequences of late and/or confusing government involvement and guidance (see Figure 1).

Figure 1: Lessons from the Industrial Revolution


Government’s role in the “Analytics Revolution” is clear: to carefully nurture and support industry, university, and government collaboration to encourage sustainable growth and prepare for massive changes and opportunities. The government can’t afford to stand by and let the markets decide. By the time the markets have decided, it may be too late to redirect and guide resources, especially given the interests of Russia and China in this all-important science.

Be Prepared to Act on the Nefarious

Access to sensitive information, data protection, privacy – these are all hot-button issues with the citizenry. The State must be aware of the societal and cultural risks associated with the idea of a “Big Brother” shadowing its people. The State must champion legislation in cooperation with industry in order to protect the masses while not stifling creativity and innovation. That’s a tough job, but the natural conflict between “nurturing while protecting” is why the government needs to be involved early. Through early engagement, the State can reduce the tension between industrial growth and personal privacy.

The “Analytics Revolution” holds tremendous promise for the future of industry and personal achievement, but it will require well-defined rules of conduct and engagement. Unsupervised growth or use may lead to information being exploited in nefarious ways, with potentially damaging results.

The State must protect its constituents’ sensitive information while nurturing the industrial opportunity. That’s a tall order, but nothing less should be expected from our government, industry and society leaders.

Can’t Operate in a World of Fear

We can’t be afraid of what we don’t know. The State must increase constituents’ awareness and education of Big Data and Artificial Intelligence: what they are, what they are used for, and the opportunities locked within, including “The Good, the Bad, and the Ugly.”

We can’t operate in a world of fear, jumping to conclusions based upon little or no information or, worse yet, misinformation and purposeful lies. Government leaders must collaborate with industry and universities to actively gain an understanding of the true ramifications and capabilities of Big Data and Artificial Intelligence before they create legislation (see Figure 2).

Figure 2: Government Leaders Must Seek Information before Jumping to Conclusions


It’s because I’m an educator in this field that I was so honored to be part of this discussion. In addition to discussing the economic opportunities that lie within Big Data and Artificial Intelligence, I wanted to help our legislators understand they should prioritize their own learning and education of these sciences before enacting rules and regulations.

Predict to Prevent

The opportunities for good are almost overwhelming at the government level! Whether in education, public services, traffic, fraud, crime, wild fires, public safety or population health, Big Data and Artificial Intelligence can dramatically improve outcomes while reducing costs and risks (see Figure 3).

Figure 3: Big Data and AI Reducing Crop Loss to Diseases


However, to take advantage of the potential of Big Data and Artificial Intelligence, the State, its agencies and its legislators need to undergo a mind shift. They need to evolve beyond “using data and analytics to monitor agency outcomes” to understanding how to “leverage data and analytics to Predict, to Prescribe and to Prevent!” That is, these organizations need to evolve from a mindset of reporting what happened to one of predicting what’s likely to happen and prescribing corrective or preventative actions and behaviors (see Figure 4).

Figure 4: The “Predict to Prescribe to Prevent” Value Chain


There are numerous use cases of this “predict to prevent” value chain that would not only benefit state agencies’ operations, but also have positive quality-of-life ramifications for the residents of California, including the opportunity to prevent:

  • Hospital-acquired infections
  • Crime
  • Traffic jams / vehicle accidents
  • Major road maintenance
  • Cyber attacks
  • Wild fires
  • Equipment maintenance and failures
  • Electricity and utility outages
  • And more…

Role of Government

The role of government is to nurture, not necessarily to create, especially in California. California is blessed with a bounty of human capital resources including an outstanding higher education system and an active culture of corporate investing such as the Google $1B AI Fund (see “Google Commits $1 Billion In Grants To Train U.S. Workers For High-Tech Jobs”).

There is a bounty of free and low-cost Big Data and Artificial Intelligence training available. For example, Andrew Ng, one of the world’s best-known artificial-intelligence experts, is launching an online effort to create millions more AI experts across a range of industries. Ng, an early pioneer in online learning, hopes his new deep-learning course on Coursera will train people to use the most powerful idea to have emerged in AI in recent years.

California sits in rarefied air when it comes to the volume of natural talent in the Big Data and Artificial Intelligence spaces. The State should seize on these assets, coordinate these valuable resources, and ensure that this quality and depth of training is available to all.

State of California Summary

In summarizing what I told my audience, Big Data and Artificial Intelligence provide new challenges, but the opportunities for both private and public sectors are many. To harness the power of Big Data and AI, the State should focus on:

  • Minimizing the impact of nefarious, illegal and dangerous activities
  • Balancing consumer value vs. consumer exploitation
  • Addressing inequities in data monetization opportunities
  • Re-tooling / re-skilling the California workforce
  • Fueling innovation via university-government-business collaboration
  • Adopting regulations that ensure citizen/customer fairness (share of the wealth)
  • Providing incentives to accelerate state-wide transformation and adoption

Figure 5: Threats to the California “Way of Life”


It is up to everyone — the universities, companies, and individuals — to step up and provide guidance to our government and education leaders to keep California at the forefront of our “Analytics Revolution.” This is one race where there is no silver medal for finishing second.

The Consumerization of Artificial Intelligence
January 31, 2018 | https://infocus.dellemc.com/william_schmarzo/the-consumerization-of-artificial-intelligence/

Consumerization is the design, marketing, and selling of products and services targeting the individual end consumer.

Apple CEO Tim Cook recently promoted a $100-per-year iPhone app called Derm Expert. Derm Expert allows doctors to diagnose skin problems using only their iPhone.  Doctors take a photo of a patient’s skin condition and then Derm Expert diagnoses the problem and prescribes treatment. Doctors can effectively treat patients without a high performance computer or an expensive technology environment. They just need the same iPhone that you and I use every day.

Figure 1: Derm Expert App


Derm Expert makes use of Apple’s Core ML framework that is built into all new iPhones. Core ML makes it possible to run Machine Learning and Deep Learning algorithms on an iPhone without having to upload the photos to the “cloud” for processing.

Apple is not the only company integrating Machine Learning and Deep Learning frameworks into their products, but it may be the first company to put such a powerful capability into the hands of millions of consumers. Whether we know it or not, we have all become “Citizens of Data Science,” and the world will never be the same.

Embedding Machine Learning Frameworks

Apple Core ML in the iPhone is an example of how industry leaders are seamlessly embedding powerful machine learning, deep learning and artificial intelligence frameworks into their development and operating platforms. Doing so enables Apple iOS developers to create a more engaging, easy-to-use customer experience, leveraging Natural Language Processing (NLP) for voice-to-text translation (Siri) and facial recognition. Plus, it opens the door for countless new apps and use cases that can exploit the power of these embedded frameworks.

Core ML enables developers to integrate a broad variety of machine learning algorithms into their apps with just a few lines of code. Core ML supports over 30 deep learning (neural network) algorithms, as well as Support Vector Machine (SVM) and Generalized Linear Models (GLM)[1].

For example,

  • Developers can integrate computer vision machine learning features into their app including face tracking, face detection, landmarks, text detection, rectangle detection, barcode detection, object tracking and image registration.
  • The natural language processing APIs in Core ML use machine learning to decipher text using language identification, tokenization, lemmatization and named entity recognition.

Core ML supports Vision for image analysis, Foundation for natural language processing, and GameplayKit for evaluating learned decision trees (see Figure 2).

Figure 2: Core ML Is Optimized for On-Device Performance, Which Minimizes Memory Footprint and Power Consumption
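On the developer side, a model typically reaches Core ML by being trained in a Python framework and then converted. A minimal sketch, assuming Apple’s coremltools package and scikit-learn (exact converter APIs vary across coremltools releases):

```python
# Train a simple generalized linear model, then convert it to a .mlmodel
# file that an iOS app can bundle and run on-device via Core ML.
import numpy as np
import coremltools as ct
from sklearn.linear_model import LogisticRegression

# Toy training data: two made-up features -> a binary label.
X = np.random.rand(200, 2)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
clf = LogisticRegression().fit(X, y)

# coremltools' scikit-learn converter produces a Core ML model spec.
mlmodel = ct.converters.sklearn.convert(
    clf, ["feature_1", "feature_2"], "label")
mlmodel.save("ToyClassifier.mlmodel")
```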


Machine Learning and Deep Learning Microprocessor Specialization

Artificial intelligence, machine learning and deep learning (AI | ML | DL) require massive amounts of computer processing power. And while the current solution is simply to throw more processors at the problem, that approach eventually won’t scale as processing needs and the volume of detailed, real-time data increase[3].

One of the developments leading to the consumerization of artificial intelligence is the ability to exploit microprocessor or hardware specialization. The traditional Central Processing Unit (CPU) is being replaced by special-purpose microprocessors built to execute complex machine learning and deep learning algorithms.

This includes:

  • Graphics Processing Unit (GPU): a specialized electronic circuit designed to render 2D and 3D graphics together with a CPU, also known as a graphics card in gaming culture. GPUs are now being harnessed more broadly to accelerate computational workloads in areas such as financial modeling, cutting-edge scientific research, deep learning, analytics, and oil and gas exploration.
  • Tensor Processing Unit (TPU): a custom-built integrated circuit developed specifically for machine learning and tailored for TensorFlow (Google’s open-source machine learning framework). The TPU is designed to handle common machine learning and neural network calculations for training and inference, specifically matrix multiply, dot product, and quantization transforms. On production AI workloads that utilize neural network inference, the TPU is 15 to 30 times faster than contemporary GPUs and CPUs, according to Google.

Intel is designing a new chip specifically for Deep Learning, called the Intel® Nervana™ Neural Network Processor (NNP)[4]. The NNP supports deep learning primitives such as matrix multiplication and convolutions, and its memory management is designed to let Deep Learning algorithms achieve high utilization of the massive amount of compute on each die.

The bottom line translates into faster training times for Deep Learning models.
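To get a feel for why these few primitives dominate chip design, here is a minimal sketch (NumPy on a CPU, with made-up matrix sizes) of how much arithmetic a single matrix multiply represents:

```python
# Time one large matrix multiply, the core primitive GPUs/TPUs/NNPs accelerate.
import time
import numpy as np

a = np.random.rand(2048, 2048).astype("float32")
b = np.random.rand(2048, 2048).astype("float32")

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * 2048**3  # ~17 billion floating-point operations in this one call
print(f"{flops / elapsed / 1e9:.1f} GFLOP/s on this CPU")
```

A deep learning model executes thousands of such multiplies per training step, which is why purpose-built silicon pays off.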

Finally, a new company called “Groq” is building a special purpose chip that will run 400 trillion operations per second, more than twice as fast as Google’s TPU[5].

What do all these advancements in GPU and TPU mean to you the consumer?

“Smart” apps that leverage these powerful processors and the embedded AI | ML | DL frameworks to learn more about you and provide a hyper-personalized, prescriptive user experience.

It’ll be like a really smart, highly attentive personal assistant on steroids!

The Power of AI in Your Hands

Unknowingly over the past few years, artificial intelligence has worked its way into our everyday lives. Give a command to Siri or Alexa, and AI kicks in to translate what you said and look up the answer. Upload a photo to Facebook, and AI identifies the people in the photo. Enter a destination into Waze or Google Maps, and AI provides updated recommendations on the best route. Push a button, and AI parallel parks your car all by itself (dang, where was that during my driver’s test!).

With advances in computer processors and embedded AI | ML | DL frameworks, we are just beginning to see the use cases. And like the Derm Expert app highlights, the way that we live will never be the same.

Sources:

[1] Build More Intelligent Apps With Machine Learning

Figure 2: Core ML

[3] Are limitations of CPU speed and memory prevent us from creating AI systems

[4] Intel® Nervana™ Neural Network Processors (NNP) Redefine AI Silicon

[5] Groq Says It Will Reveal Potent Artificial Intelligence Chip Next Year
